Systems Software Research is Irrelevant
Rob Pike, Bell Labs
Feb 21, 2000

1 A Polemic
This talk is a polemic that distills the pessimistic side of my feelings about systems research these days. I won’t talk much about the optimistic side, since lots of others can do that for me; everyone’s excited about the computer industry. I may therefore present a picture somewhat darker than reality. However, I think the situation is genuinely bad and requires action.

2 Thesis
Systems software research has become a sideline to the excitement in the computing industry. When did you last see an exciting non-commercial demo? Ironically, at a time when computing is almost the definition of innovation, research in both software and hardware at universities and much of industry is becoming insular, ossified, and irrelevant. There are many reasons, some avoidable, some endemic. There may be ways to improve the situation, but they will require a community-wide effort.

3 Definitions...

4 A Field in Decline
"Who needs new operating systems, anyway?" you ask. Maybe no one, but then that supports my thesis.

"But now there are lots of papers in file systems, performance, security, web caching, etc.," you say. Yes, but is anyone outside the research field paying attention?
This is the central part of his thesis: that there are really only two operating system choices in the world, *nix and Windows. But he's ignoring the Jurassic period of computing, when the "home" or business machine was young: CP/M, various menu-driven systems, the venerable TRS-80, Geos, the Amiga, the Atari ST... That was a pretty Darwinian period, where an OS could pop up on Monday and be gone by Tuesday afternoon, as I'll discuss below...
5 Systems Research’s Contribution to the Boom...
Hardware has changed dramatically; software is stagnant.
Hardware
  1990: 33 MHz MIPS R3000, 32 megabytes of RAM, 10 Mb/s Ethernet
  2000: 600 MHz Alpha or Pentium III, 512 megabytes of RAM, 100 Mb/s Ethernet
  2006: 3.2 GHz AMD 64, 1-2 GB of RAM, 55 Mb/s wireless Ethernet

Software
  1990: Unix, X Windows, Emacs, TCP/IP
  2000: Unix, X Windows, Emacs, TCP/IP, Netscape
  2006: Unix/Linux, Windows, X Windows, Emacs + others, TCP/IP, Opera, USB
Good progression, though there are a few things missing, like IPX/SPX...
6 Where is the Innovation?
Microsoft, mostly. Exercise: Compare 1990 Microsoft software with 2000.
That'd be Windows 3.0, or maybe even 2.0, compared to Windows 2000.
If you claim that’s not innovation, but copying, I reply that Java is to C++ as Windows is to the Macintosh: an industrial response to an interesting but technically flawed piece of systems software.
Both are evolutions from base systems that won their particular competitions.
If systems research was relevant, we’d see new operating systems and new languages making inroads into the industry, the way we did in the ’70s and ’80s. Instead, we see a thriving software industry that largely ignores research, and a research community that writes papers rather than software.
And here's the basic flaw in his reasoning: he's looking at the past and expecting the future to be a continuation of it. Looking at it from a different standpoint, the base form of Unix won its competition early on: a tiny kernel and the basic file system layout. Everything after that is elaboration. It was a thing of beauty, simple and intuitive. It's just like Bridge 1.0, invented by Og and Zug in 9276 B.C. Everything since has been refinement, and the competing approaches are best forgotten.
7 Linux
Innovation? New? No, it’s just another copy of the same old stuff. OLD stuff. Compare program development on Linux with Microsoft Visual Studio or one of the IBM Java/web toolkits.
In 2000, Linux was still in its infancy. The early efforts were geared much more toward the academics and the hacker community. In 2006 we're looking at products that are approaching full maturity, ready for prime time.
Linux’s success may indeed be the single strongest argument for my thesis: The excitement generated by a clone of a decades-old operating system demonstrates the void that the systems software research community has failed to fill. Besides, Linux’s cleverness is not in the software, but in the development model, hardly a triumph of academic CS (especially software engineering) by any measure.
If you regard the basic Unix approach as a building block, Linux is a good development, more than just "clever." I can remember using SCO Xenix and QNX on early PC boxes, and they were actually more mature (for the time) than Linux was the first time I looked at it. Both were proprietary, and both fell by the wayside because the Linux development model beat them out. MS DOS beat out its competition in a similar manner, running not just on IBM PCs, but also on 8086 clones. The Mac was pretty, in many ways more technologically sophisticated, but it didn't compete — PCs were cheaper, the software was cheaper, and users weren't restricted to the Apple brand. Both the development and marketing models are tied to the system.
8 What is Systems Research these days?
Web caches, web servers, file systems, network packet delays, all that stuff. Performance, peripherals, and applications, but not kernels or even user-level applications.
Once the Wright brothers figured out how to make airplanes, there wasn't a need for Boeing to reinvent them. The same applies to software. Despite the bells and whistles that are hung on new versions of "Office" products, they remain basically the same, with word processing, spreadsheet, presentation, and perhaps database components. The spreadsheet, the original "killer app," didn't exist before Visicalc, and hasn't changed a whole lot since Excel 3.0. "Killer apps" are pretty few and far between.
Mostly, though, it’s just a lot of measurement; a misinterpretation and misapplication of the scientific method. Too much phenomenology: invention has been replaced by observation. Today we see papers comparing interrupt latency on Linux vs. Windows. They may be interesting, they may even be relevant, but they aren’t research.
80 percent of Academe is like 80 percent of everything else, made up of hacks and placeholders. Another 10 percent is actually destructive, which leaves 10 percent to do the actual thinking.
In a misguided attempt to seem scientific, there’s too much measurement: performance minutiae and bad charts. By contrast, a new language or OS can make the machine feel different, give excitement, novelty. But today that’s done by a cool web site or a higher CPU clock rate or some cute little device that should be a computer but isn’t. The art is gone. But art is not science, and that’s part of the point. Systems research cannot be just science; there must be engineering, design, and art.

9 What Happened?
A lot of things:...

10 PC
Hardware became cheap, and cheap hardware became good. Eventually, if it didn’t run on a PC, it didn’t matter because the average, mean, median, and mode computer was a PC. Even into the 1980s, much systems work revolved around new architectures (RISC, iAPX/432, Lisp Machines). No more. A major source of interesting problems and, perhaps, interesting solutions is gone.
Those architectures went away prior to the development of the graphical browser, which would have allowed them to coexist, though in niche markets, kind of like where Solaris is now. The PC's advantage, like that of Linux, by the way, was its open architecture. All advantages bring with them disadvantages, but the architecture's been changing with new developments — albeit at a fairly stately rate due to the necessity of having (most) everyone agree on standards.
Much systems work also revolved around making stuff work across architectures: portability. But when hardware’s all the same, it’s a nonissue... And that’s just the PC as hardware; as software, it’s the same sort of story.

11 Microsoft
Enough has been said about this topic. (Although people will continue to say lots more.) Microsoft is an easy target, but it’s a scapegoat, not the real source of difficulty. Details to follow.
Microsoft makes an easy target for many because it tries to take a lowest common denominator approach, while gouging as much cash from its adoring public as the traffic will bear. It beat out its competition by not being proprietary: recall using Samna Write/Ami Pro and Word Perfect on Windows 3.0/3.1, and Lotus 1-2-3, and Netscape, and any number of other products. Because of its corporate strategy, Microsoft either incorporated the best of the best into Windows (remember when Netscape used to cost money?) or tried to do a better job with its own product at the same price (Microsoft Office).
12 Web
The web happened in the early 1990s and it surprised the computer science community as much as the commercial one. It then came to dominate much of the discussion, but not to much effect. Business controls it. (The web came from physicists and prospered in industry.)
There's a Year 2000 statement. Six years later, with the development of dynamic web content, business is still on the web, with advertising and order entry systems and such, but so is everyone else, to include millions of bloggers using pretty easy-to-use but very sophisticated software.
Bruce Lindsay of IBM: HDLC ⇒ HTTP/HTML; 3270s have been replaced by web browsers. (Compare with Visicalc and the PC.)
Heh heh. I haven't seen a 3270 emulator in years.
Research has contributed little, despite a huge flow of papers on caches, proxies, server architectures, etc.
But there are lots of refinements on the basic model. The refinements make the software more usable, reaching a bigger audience, which provides its own level of feedback to make the software still more usable... From the management standpoint, we've gone from the base development cycle to the cash cow period, and I'm pretty sure we're still at an early stage in that.
13 Standards
To be a viable computer system, one must honor a huge list of large, and often changing, standards: TCP/IP, HTTP, HTML, XML, CORBA, Unicode, POSIX, NFS, SMB, MIME, POP, IMAP, X, ... A huge amount of work, but if you don’t honor the standards you’re marginalized. Estimate that 90-95% of the work in Plan 9 was directly or indirectly to honor externally imposed standards. At another level, instruction architectures, buses, etc. have the same influence. With so much externally imposed structure, there’s little slop left for novelty.
All those standards are building blocks, and each and every one of them could be superseded by a better approach. If you can come up with a better approach, by all means do so; until you do, these work, and they've been refined to the point where they work very well.
Plus, commercial companies that ‘own’ standards, e.g. Microsoft, Cisco, deliberately make standards hard to comply with, to frustrate competition. Academia is a casualty.
That's maybe a legitimate gripe, though my heart doesn't bleed for academia. There's a balance between standards and confinement. And note that six years after the presentation was written Cisco actually has competition in its arena, despite being guardian of the standards.
14 Orthodoxy
Today’s graduating PhDs use Unix, X, Emacs, and TeX.
Users, I'd point out, don't...
That’s their world. It’s often the only computing world they’ve ever used for technical work.
This has slowly changed over the past six years, though I'm not sure academia realizes yet that it's behind the times compared to the world at large, a situation reversed from the way things were 20 years ago...
Twenty years ago, a student would have been exposed to a wide variety of operating systems, all with good and bad points. New employees in our lab now bring their world with them, or expect it to be there when they arrive. That’s reasonable, but there was a time when joining a new lab was a chance to explore new ways of working.
Now you're likely to find a slightly different set of building blocks. I read somewhere that systems that belong to suit-and-tie guys are almost always Windows/IIS, and systems that belong to the sweatshirt and jeans set are almost always Linux/Apache. But if you're going to be in the wonderful world of IT, you've got to get used to the idea of going with what the customer uses. Sometimes you can show him/her/it a better way, and sometimes you make a lot of money fixing mistakes made 10 years ago. That's actually the fun of it, for some of us.
Narrowness of experience leads to narrowness of imagination. The situation with languages is a little better—many curricula include exposure to functional languages, etc. —but there is also a language orthodoxy: C++ and Java.
They're pretty common in the want ads, too, so don't discount them, even though on the job you'll likely end up using Visual Basic or PHP or C.
In science, we reserve our highest honors for those who prove we were wrong. But in computer science...

15 Change of scale
With so many external constraints, and so many things already done, much of the interesting work requires effort on a large scale. Many person-years are required to write a modern, realistic system. That is beyond the scope of most university departments. Also, the time scale is long: from design to final version can be five years. Again, that’s beyond the scope of most grad students. This means that industry tends to do the big, defining projects—operating systems, infrastructure, etc.— and small research groups must find smaller things to work on.

Three trends result:
1. Don’t build, measure. (Phenomenology, not new things.)
2. Don’t go for breadth, go for depth. (Microspecialization, not systems work.)
3. Take an existing thing and tweak it.
I believe this is the main explanation of the SOSP curve.

16 Unix
New operating systems today tend to be just ways of reimplementing Unix. If they have a novel architecture—and some do—the first thing to build is the Unix emulation layer. How can operating systems research be relevant when the resulting operating systems are all indistinguishable? There was a claim in the late 1970s and early 1980s that Unix had killed operating systems research because no one would try anything else. At the time, I didn’t believe it. Today, I grudgingly accept that the claim may be true (Microsoft notwithstanding).

A victim of its own success: portability led to ubiquity. That meant architecture didn’t matter, so now there’s only one. Linux is the hot new thing... but it’s just another Unix.
I go back to my comments on competition. Unix was the best system at the time, and its base system probably still is, not due to its features but due to its simplicity. Likewise the C language displaced, for all practical purposes, Pascal and ALGOL and Ada and a host of other languages, not because of its "rich programming environment" but because of its simplicity. You can take a core C implementation, without any libraries (to include the very basics like stdio.h) and build an entire new implementation. Or, if you're a glutton for punishment, you can use that core to do the very same things you could do with the libraries, only with more typing.
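To make that concrete, here's a minimal sketch assuming nothing but the core language itself: no #includes, no standard library, just the sort of string primitives you'd rebuild first if you were bootstrapping a new implementation. The my_* names are purely illustrative.

    /* A sketch of the "core C, no libraries" point: no #includes at all, */
    /* just the language itself.                                          */

    unsigned long my_strlen(const char *s)      /* bytes before the NUL terminator */
    {
        const char *p = s;
        while (*p)
            p++;
        return (unsigned long)(p - s);
    }

    char *my_strcpy(char *dst, const char *src) /* copy the string, NUL included */
    {
        char *d = dst;
        while ((*d++ = *src++) != '\0')
            ;
        return dst;
    }

    void *my_memset(void *dst, int c, unsigned long n) /* fill n bytes with c */
    {
        unsigned char *d = dst;
        while (n--)
            *d++ = (unsigned char)c;
        return dst;
    }

    int main(void)                              /* exercise the routines; exit status is 4 */
    {
        char buf[16];
        my_memset(buf, 0, sizeof buf);
        my_strcpy(buf, "unix");
        return (int)my_strlen(buf);
    }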
17 Linux—the Academic Microsoft Windows
The holy trinity: Linux, gcc, and Netscape. Of course, it’s just another orthodoxy.
In six years Netscape has dropped off the list, partially eaten by its competitors.
These have become icons not because of what they are, but because of what they are not: Microsoft. But technically, they’re not that hot.
See my previous comment about simplicity. They're icons because they're simple. If they get too elaborate, like Netscape did, they'll be displaced. Netscape is now mostly Mozilla, but it's still piggishly slow compared to Opera — and compared to IE.
And Microsoft has been working hard, and I claim that on many (not all) dimensions, their corresponding products are superior technically. And they continue to improve. Linux may fall into the Macintosh trap: smug isolation leading to (near) obsolescence. Besides, systems research is doing little to advance the trinity.
In the early days, Linux fell into that trap, with the "if it's not easy to write software, why should it be easy to use it?" syndrome. But I just set up a Linux partition on my laptop, and Ubuntu went on just as smoothly as Windows did.
18 Startups
Startups are the dominant competition for academia for ideas, funds, personnel, and students. (Others are Microsoft, big corporations, legions of free hackers, and the IETF.) In response, government-funded and especially corporate research is directed at very fast ‘return on investment’. This distorts the priorities: Research is bent towards what can make big money (IPO) in a year.
That paragraph was fizzling out even as he spoke.
Horizon is too short for long-term work. (There go infrastructure and the problems of scale.)

Funding sources (government, industry) perceive the same pressures, so there is a vicious circle.

The metric of merit is wrong.

Stanford now encourages students to go to startups because successful CEOs give money to the campus. The new president of Stanford is a successful computer entrepreneur.
I'm not familiar with who he's talking about. I wonder if he's still a successful computer entrepreneur?
19 Grandma
Grandma’s on line. This means that the industry is designing systems and services for ordinary people.
That's actually a pretty brilliant idea, come to think of it...
The focus is on applications and devices, not on infrastructure and architecture, the domain of systems research. The cause is largely marketing, the result a proliferation of incompatible devices. You can’t make money on software, only hardware, so design a niche gimmick, not a Big New Idea.

Programmability—once the Big Idea in computing—has fallen by the wayside. Again, systems research loses out.
Grandma brings other things to the table, though. We now have a huge user base, with most users no more familiar with the innards of their computers than they are with the innards of their toaster ovens. What we've done is build a tool, a completed product. The dangers now include falling into the Microsoft trap — making the tool longer, lower, leaner, wider, with more road-hugging weight and 40 percent more cheese in the interests of bringing out new models every two years; and becoming elitists with overly complicated systems that let us sneer at the unenlightened who aren't bright like we are, so they can't make the systems work.
20 Things to Do
Startups are too focused on short time scale and practical results to try new things. Big corporations are too focused on existing priorities to try new things. Startups suck energy from research. But gold rushes leave ghost towns; be prepared to move in.
I think we've pretty much moved on from the dot Com "revolution."
Go back to thinking about and building systems. Narrowness is irrelevant; breadth is relevant: it’s the essence of system. Work on how systems behave and work, not just how they compare. Concentrate on interfaces and architecture, not just engineering. Be courageous. Try different things; experiment. Try to give a cool demo.
It's a tool. Concentrate on making the tool work better, in more environments.
Funding bodies: fund more courageously, particularly long-term projects. Universities, in turn, should explore ways to let students contribute to long-term projects.

Measure success by ideas, not just papers and money. Make the industry want your work.

21 Things to Build
There are lots of valid, useful, interesting things to do. I offer a small sample as evidence. If the field is moribund, it’s not from a lack of possibilities.

Only one GUI has ever been seriously tried, and its best ideas date from the 1970s. (In some ways, it’s been getting worse; today the screen is covered with confusing little pictures.) Surely there are other possibilities. (Linux’s interface isn’t even as good as Windows!)
He said that six years ago, recall. My Gnome desktop is prettier and more usable in most respects than Windows...
There has been much talk about component architectures but only one true success: Unix pipes. It should be possible to build interactive and distributed applications from piece parts.
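For a reminder of what that one success looks like in practice, here's a minimal sketch, assuming a POSIX system, of composing two existing programs with a pipe in C, roughly what the shell does for "ls | wc -l". Error handling is kept to the bare minimum.

    /* Unix's one successful component architecture in miniature: glue   */
    /* two unrelated programs together with a pipe, as the shell does    */
    /* for "ls | wc -l". Assumes a POSIX system.                         */

    #include <stdio.h>
    #include <unistd.h>
    #include <sys/wait.h>

    int main(void)
    {
        int fd[2];                      /* fd[0] = read end, fd[1] = write end */

        if (pipe(fd) == -1) {
            perror("pipe");
            return 1;
        }

        if (fork() == 0) {              /* child 1: "ls", stdout goes into the pipe */
            dup2(fd[1], STDOUT_FILENO);
            close(fd[0]);
            close(fd[1]);
            execlp("ls", "ls", (char *)NULL);
            _exit(127);                 /* only reached if exec fails */
        }

        if (fork() == 0) {              /* child 2: "wc -l", stdin comes from the pipe */
            dup2(fd[0], STDIN_FILENO);
            close(fd[0]);
            close(fd[1]);
            execlp("wc", "wc", "-l", (char *)NULL);
            _exit(127);
        }

        close(fd[0]);                   /* parent: close both ends and wait for the children */
        close(fd[1]);
        while (wait(NULL) > 0)
            ;
        return 0;
    }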
2000 talking again. Since then, XML, RSS, WiFi, Bluetooth, thin clients, and about a dozen other neato things...
The future is distributed computation, but the language community has done very little to address that possibility. The Web has dominated how systems present and use information: the model is forced interaction; the user must go get it. Let’s go back to having the data come to the user instead.
You mean like web services?
System administration remains a deeply difficult problem. Unglamorous, sure, but there’s plenty of room to make a huge, even commercial, contribution.
That'd be more GUI, with the added benefit that most "GUI" functions can be delivered over a browser and the added disadvantage that when your GUI breaks you'd better have a command-line backup option, or you've just changed your machine into a doorstop...
22 Conclusions
The world has decided how it wants computers to be. The systems software research community influenced that decision somewhat, but very little, and now it is shut out of the discussion. It has reached the point where I doubt that a brilliant systems project would even be funded, and if funded, wouldn’t find the bodies to do the work. The odds of success were always low; now they’re essentially zero.

The community—universities, students, industry, funding bodies—must change its priorities.

The community must accept and explore unorthodox ideas.

The community must separate research from market capitalization.
Posted by: Fred 2006-03-02
http://www.rantburg.com/poparticle.php?ID=144254