Systems Software Research is Irrelevant
2006-03-02
Rob Pike, Bell Labs
Feb 21, 2000

1 A Polemic
This talk is a polemic that distills the pessimistic side of my feelings about systems research these days. I won't talk much about the optimistic side, since lots of others can do that for me; everyone's excited about the computer industry. I may therefore present a picture somewhat darker than reality. However, I think the situation is genuinely bad and requires action.

2 Thesis
Systems software research has become a sideline to the excitement in the computing industry. When did you last see an exciting noncommercial demo? Ironically, at a time when computing is almost the definition of innovation, research in both software and hardware at universities and much of industry is becoming insular, ossified, and irrelevant. There are many reasons, some avoidable, some endemic. There may be ways to improve the situation, but they will require a community-wide effort.

3 Definitions...

4 A Field in Decline
"Who needs new operating systems, anyway?" you ask. Maybe no one, but then that supports my thesis. "But now there are lots of papers in file systems, performance, security, web caching, etc.," you say. Yes, but is anyone outside the research field paying attention?
Hardware has changed dramatically; software is stagnant.
Microsoft, mostly. Exercise: Compare 1990 Microsoft software with 2000.
Innovation? New? No, it's just another copy of the same old stuff. OLD stuff. Compare program development on Linux with Microsoft Visual Studio or one of the IBM Java/web toolkits.
Web caches, web servers, file systems, network packet delays, all that stuff. Performance, peripherals, and applications, but not kernels or even user-level applications.
9 What Happened?
A lot of things:...

10 PC
Hardware became cheap, and cheap hardware became good. Eventually, if it didn't run on a PC, it didn't matter because the average, mean, median, and mode computer was a PC. Even into the 1980s, much systems work revolved around new architectures (RISC, iAPX/432, Lisp Machines). No more. A major source of interesting problems and, perhaps, interesting solutions is gone.
11 Microsoft
Enough has been said about this topic. (Although people will continue to say lots more.) Microsoft is an easy target, but it's a scapegoat, not the real source of difficulty. Details to follow.
The web happened in the early 1990s and it surprised the computer science community as much as the commercial one. It then came to dominate much of the discussion, but not to much effect. Business controls it. (The web came from physicists and prospered in industry.)
To be a viable computer system, one must honor a huge list of large, and often changing, standards: TCP/IP, HTTP, HTML, XML, CORBA, Unicode, POSIX, NFS, SMB, MIME, POP, IMAP, X, ... A huge amount of work, but if you don't honor the standards you're marginalized. Estimate that 90-95% of the work in Plan 9 was directly or indirectly to honor externally imposed standards. At another level, instruction architectures, buses, etc. have the same influence. With so much externally imposed structure, there's little slop left for novelty.
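The standards burden is easy to demonstrate: even a toy web response that does nothing novel must simultaneously honor HTTP, MIME, and Unicode before it can do anything of its own. A minimal sketch in Python (the function and page content are illustrative, not from the talk; serving it over a socket would add TCP/IP to the list):

```python
def build_response(body: str) -> bytes:
    """Build a minimal HTTP/1.1 response for an HTML page.

    Even this tiny function must honor three external standards:
    HTTP (status line, headers, CRLF separators, Content-Length
    framing), MIME (the Content-Type header), and Unicode (UTF-8
    encoding of the body).
    """
    payload = body.encode("utf-8")                    # Unicode standard
    headers = (
        "HTTP/1.1 200 OK\r\n"                         # HTTP: status line
        "Content-Type: text/html; charset=utf-8\r\n"  # MIME: media type
        f"Content-Length: {len(payload)}\r\n"         # HTTP: body framing
        "\r\n"                                        # blank line ends headers
    )
    return headers.encode("ascii") + payload
```

Every byte above is dictated by external specifications; none of it is left over for novelty, which is exactly Pike's complaint.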
Today's graduating PhDs use Unix, X, Emacs, and TeX.
15 Change of scale
With so many external constraints, and so many things already done, much of the interesting work requires effort on a large scale. Many person-years are required to write a modern, realistic system. That is beyond the scope of most university departments. Also, the time scale is long: from design to final version can be five years. Again, that's beyond the scope of most grad students. This means that industry tends to do the big, defining projects—operating systems, infrastructure, etc.—and small research groups must find smaller things to work on. Three trends result: 1. Don't build, measure. (Phenomenology, not new ...) I believe this is the main explanation of the SOSP curve.

16 Unix
New operating systems today tend to be just ways of reimplementing Unix. If they have a novel architecture—and some do—the first thing to build is the Unix emulation layer. How can operating systems research be relevant when the resulting operating systems are all indistinguishable? There was a claim in the late 1970s and early 1980s that Unix had killed operating systems research because no one would try anything else. At the time, I didn't believe it. Today, I grudgingly accept that the claim may be true (Microsoft notwithstanding). A victim of its own success: portability led to ubiquity. That meant architecture didn't matter, so now there's only one. Linux is the hot new thing... but it's just another Unix.
The holy trinity: Linux, gcc, and Netscape. Of course, it's just another orthodoxy.
Startups are the dominant competition for academia for ideas, funds, personnel, and students. (Others are Microsoft, big corporations, legions of free hackers, and the IETF.) In response, government-funded and especially corporate research is directed at very fast ‘return on investment’. This distorts the priorities: Research is bent towards what can make big money (IPO) in a year.
Funding sources (government, industry) perceive the same pressures, so there is a vicious circle. The metric of merit is wrong. Stanford now encourages students to go to startups because successful CEOs give money to the campus. The new president of Stanford is a successful computer entrepreneur.
Grandma's on line. This means that the industry is designing systems and services for ordinary people.
Programmability—once the Big Idea in computing—has fallen by the wayside. Again, systems research loses out.
Startups are too focused on short time scale and practical results to try new things. Big corporations are too focused on existing priorities to try new things. Startups suck energy from research. But gold rushes leave ghost towns; be prepared to move in.
Measure success by ideas, not just papers and money. Make the industry want your work.

21 Things to Build
There are lots of valid, useful, interesting things to do. I offer a small sample as evidence. If the field is moribund, it's not from a lack of possibilities. Only one GUI has ever been seriously tried, and its best ideas date from the 1970s. (In some ways, it's been getting worse; today the screen is covered with confusing little pictures.) Surely there are other possibilities. (Linux's interface isn't even as good as Windows!)
The world has decided how it wants computers to be. The systems software research community influenced that decision somewhat, but very little, and now it is shut out of the discussion. It has reached the point where I doubt that a brilliant systems project would even be funded, and if funded, wouldn't find the bodies to do the work. The odds of success were always low; now they're essentially zero. The community—universities, students, industry, funding bodies—must change its priorities. The community must accept and explore unorthodox ideas. The community must separate research from market capitalization.
Posted by: Fred
#24 My word for him, AT&T Labs are history, get over it. Heh. Rob Pike now works for Google. As does Guido van Rossum. Rob's a great man; he's one of the original developers of Unix. You should read his book, "The Practice of Programming". It's a marvel of clarity.
Posted by: KBK 2006-03-02 23:54
#23 One of the really big revolutions in software is the work on component architectures, the most visible being the Eclipse development environment.
Posted by: Ptah 2006-03-02 20:45
#22 Yup. When real $$ are at stake, users want good enough and not any fancier. Dead right, Alan!
Posted by: lotp 2006-03-02 19:31
#21 Never meant to denigrate your experience, lotp. While I never did anything military, I have worked with some real-time control software for things like cutters and slitters for roll goods. The last few years I've been doing a lot more managing, like the current migration of 2 dozen systems with ~50 TB of data from Solaris to AIX. The point I was trying to make was that there is almost always more than one way to skin the software cat, and the choices can rarely if ever be truly said to be scientific beyond a certain point. Elegant is a word more frequently used aesthetically than scientifically (I had one boss who named me Elegant Alan for my propensity to turn out tight code). As far as those pilots of yours go, I bet they were quite appreciative of your creativity whether you were or not. ;^) But, again, the main point is that Mr. Pike doesn't seem to know that the $$$$ are with the users, for whom there is such a thing as good enough.
Posted by: AlanC 2006-03-02 18:41
#20 I ask the groups of students to sort the blocks into some rational order. The catch is that there is no correct answer and virtually every group comes up with different answers. Then I ask them: how does the user want them? The point of the ramble is that there is no way to scientifically solve the "problem". Sure - that's pretty obvious WRT user interfaces. When you talk about the architecture underneath them, however, there ARE scientific ways to characterize better or poorer approaches. Mei-Hwa Chen's complexity metric for distributed enterprise systems is one.
Posted by: lotp 2006-03-02 18:17
#19 Lots of expertise and experience here at the Burg! A couple of points. 3dc, when I ran a group that produced language tools (compilers, link editors, symbolic debuggers with embedded chip simulators) for the military real-time world (avionics, flight control systems, space-based systems), along with a hard real-time OS for embedded systems, there was no way on God's green earth those apps could execute successfully in a virtual, emulated environment. We're talking microseconds for very complex calculations that do things like shoot down missiles or keep fly-by-wire fighters in the air. So unfortunately your box in the basement wouldn't have solved DOD's problem (then or now), although it suits nicely for some other application domains. Which makes the point that there is no such thing as a single solution... Alan, I put out a bunch of words, so you might not have noticed that I spent 25 years in the "real world" doing software and managing software engineering (and some hardware design) before I went academic. I didn't miss the fact that "THE USE (of software has) a large subjective / artistic / creative component." And neither has the academic comp sci world. That's exactly what adaptive user interfaces are all about. When these move into general use, the users will focus on how they want to work and what they want the software to look like. That said, a good portion of my practitioner career involved "hard" real-time applications, or at least softer real-time process control and/or communications systems. I've written or managed projects ranging from a couple hundred lines of elegant code to a system that had about 4 million lines of high-level source code. In some of those systems, 'creativity' was not what we wanted, needed or could tolerate. Elegant design? Absolutely. Tight code and innovative algorithms? You bet - if they were designed and written so as to make it possible to verify their behavior and validate that they met the requirement spec.
The pilots of planes whose avionics and flight control systems were built with our code were glad we weren't 'creative'. ;-)
Posted by: lotp 2006-03-02 18:15
#18 Yesterday, having done both taxes and FAFSA (kid's student loan), is a perfect example of what can go wrong with software. The FAFSA site refuses to work with anything but IE or Netscape (not even Mozilla). On top of that, they don't do paper anymore, so if the kid wants a student loan... IT'S THEIR WAY OR THE HIGHWAY. The IRS site is a tad better but will not allow you to e-file for free unless your gross is < 50K. If it's greater, you have to pay to use TurboTax or the like. TurboTax implies Windows - an M$ requirement to e-file. So the web experience on these government sites can make one curse the software. (Rightly or wrongly, the web presence becomes software in the user's eyes.) On another point... I have Xen 3.0 running on some old hardware downstairs. This recent operating environment with true virtualization solves DARPA's Ada vs. MIPS complaint lotp was describing... oh, the currently running virtual OSs on that basement junk box? CentOS and Debian...
Posted by: 3dc 2006-03-02 17:48
#17 Phil_b & lotp have it just about right... but there's one thing that is very hard for the academic community to grasp. Software AND THE USE THEREOF have a large subjective / artistic / creative component. When teaching RDB to students I like to start with a simple physical exercise. I break students into groups and then give each group a set of blocks. There are 3 large square blocks, one red, one blue and one green; then there are 3 medium-sized square blocks and then 3 small; there are also round and triangle blocks the same way. I ask the groups of students to sort the blocks into some rational order. The catch is that there is no correct answer and virtually every group comes up with different answers. Then I ask them: how does the user want them? The point of the ramble is that there is no way to scientifically solve the "problem".
Posted by: AlanC 2006-03-02 17:12
#16 My 2c worth, and in my time I've worked on OSes, compilers and distributed systems. The problem in a nutshell is that 'systems research' is an oxymoron. Or put another way, he is trying to apply the scientific method as practiced by academics to the development of complex systems, and it doesn't apply. The scientific method drives broad theoretical understanding through investigation and identification of facts (in the context of those theories). Whilst there are theoretical constructs that need to be elaborated concerning complex systems, they have no (or almost no) impact on the actual development of complex software systems, which proceeds through an almost completely heuristic process. It may sound counter-intuitive to some people, but as systems get more complex, they necessarily become more heuristic, because fewer and fewer people understand the system in its entirety. In fact, most modern software systems have already passed the point where any one person can understand them. lotp is right in that engineered systems built on solid systems theories could exist (assuming the theories existed). However, in practice they don't exist.
Posted by: phil_b 2006-03-02 17:00
#15 I worked for LSI Logic (formerly Symbios, Symbios Logic, NCR Microelectronics) as a software test engineering technician, running a software test team. Our basic function was to test software compatibility between a dozen or so different software manufacturers and our own in-house SCSI (Small Computer Systems Interface) hardware and software. Basically, we tested our interface hardware and supporting software with Microsoft (Windows 3.X/95/98/2000/ME/NT), OS/2, Unix, SCO, Linux, and a half-dozen other minor operating systems, network software and hardware, peripheral devices, and applications software packages. We also used as wide a variety of hardware as possible - different processors, different chipsets, different manufacturer labels, etc. There are hundreds of ways to increase performance, both internally and externally, that the computer industry can adopt, if it's willing. One of the first things the entire industry needs to learn is that "one size fits all" doesn't satisfy anybody. I use three different spreadsheets because there are things I can do with one that the other two won't do - mostly things the spreadsheet designer never considered. I use two different word processors and FOUR internet browsers, all for the same reason. The biggest problem "software researchers" have is their own narrow-mindedness. They never expect people to use their product except in the way they designed them to work, while the user-public wants things that make THEIR life easier. Business users want one thing, home users want something else, and professionals want still something different. Even in these rather loose divisions, there are layers - experienced users, semi-experienced, and total novices. Yet software is designed to meet a "general" user's needs. It doesn't work. Until the "software industry" learns that it needs to create products that address the different needs of each subset of its clients, it will continue to meet very few of the needs of anyone.
Posted by: Old Patriot 2006-03-02 16:51
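[Editor's aside: the compatibility matrix Old Patriot describes grows multiplicatively with each dimension, which is why such test teams exist at all. A quick sketch of the combinatorics; the platform lists below are illustrative placeholders, not LSI Logic's actual matrix:]

```python
from itertools import product

# Illustrative test dimensions (hypothetical, for scale only).
operating_systems = ["Win95", "Win98", "WinNT", "OS/2", "SCO Unix", "Linux"]
host_chipsets     = ["i430", "i440", "VIA", "SiS"]
peripherals       = ["disk", "tape", "CD-ROM", "scanner"]

def matrix_size(*dimensions):
    """Count the distinct configurations a test team must cover."""
    count = 0
    for _ in product(*dimensions):  # one tuple per configuration
        count += 1
    return count

# Just three small dimensions already yield 6 * 4 * 4 = 96 configurations.
print(matrix_size(operating_systems, host_chipsets, peripherals))  # -> 96
```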
#14 Well said, Alan.
Posted by: Visitor 2006-03-02 16:50
#13 This is a funny discussion. I've been in IT for 3 decades and have been around the barn a few times. This sounds like the typical bit-head that can't quite make the connection between real people doing real work and all the pretty technical bells and whistles that float his boat. (How's that for mixing a metaphor?) I've worked on PCs where it was 24K of memory and the big thing (literally) in floppies was 8" and 1 MB. I worked on a PC that could hot-swap between the two competing OSs, DOS and CP/M; and on and on. You know what? Not one user gave a damn. They wanted a computer to be like a fork. You pick it up, it does its job... end of discussion. Ivory tower types have their place, just not in public. ;^)
Posted by: AlanC 2006-03-02 16:44
#12 Even into the 1980s, much systems work revolved around new architectures (RISC, iAPX/432, Lisp Machines). In the 1986-1988 timeframe, DOD put together a panel of academics and industry people and asked them to recommend the level at which they should set standards for computing. Should they pick a CPU architecture? A language and CPU? Or ??? DARPA was pushing the MIPS, Inc. RISC chip, which they had funded heavily, and envisioned virtual machines on top of the chip for various applications. Fairchild was pushing a CISC chip, but they had just got themselves sold to Japan. The eventual recommendation was to standardize on the high-level language (Ada) and let the rest all change underneath it, since so many advances were occurring in chip technologies. DARPA was pretty disappointed that a RISC chip with a virtual machine on top wasn't the choice, but those of us who were doing things in avionics, space-based systems, etc. breathed a sigh of relief... And meanwhile, digital signal processors were making incredible strides, and many systems now have both standard CPUs and DSPs as needed.
Posted by: lotp 2006-03-02 16:40
#11 With multi-core CPUs and cell processors, single computers will be turned into clusters that are part of networked clusters, and aspects of his original Plan 9 operating system will start migrating into the real world. Perhaps his research was ahead of the hardware? BTW... the Plan 9 GUI is really sad.
Posted by: 3dc 2006-03-02 16:38
#10 Only two of those -- POSIX and TCP/IP -- are operating-system issues. The rest are applications. He's talking out of his ass, here. I'd disagree a little on that point, RC. What he's bemoaning is that the infrastructure for large-scale systems consists of these technologies being glued together, rather than engineered-from-the-ground-up systems software. Much as I'm critical of him in my comment above, he's not entirely wrong. DOD and others are having to invest huge sums of money and manpower to figure out how to secure this mess of stuff, what requirements to set for procuring new (secure) systems, etc. Corporations are spending huge amounts of money on the data and apps side too. If there were breakthroughs in systems software designs, it conceivably could make a huge difference. I just think he was out of touch with what WAS going on in the research at the very time he was speaking. They haven't gelled together like the technologies that really launched the Web yet, tho, so he doesn't see them.
Posted by: lotp 2006-03-02 16:17
#9 Working in a mature technology isn't much fun. This guy should think about a career change. English? They're into deconstruction.
Posted by: Nimble Spemble 2006-03-02 16:15
#8 (Linux's interface isn't even as good as Windows!) Pure BS. I'd rather use vi than 90% of the editors on Windows. Heck, I often use vi on Windows.
Posted by: Robert Crawford 2006-03-02 16:10
#7 Hardware has changed dramatically; software is stagnant. What freakin' planet is he living on? If you spend your life inside the theory of operating systems, OK, things probably look stagnant. Out here in the Real World, where we're solving Real Problems for Real People, new techniques and new ideas are coming faster than we can absorb them. This'll probably sound odd, but the dotcom bust forced things to mature, and that opened up a hell of a lot of opportunities. What this guy's doing is the software equivalent of claiming architecture ended when the arch came along. Besides, Linux's cleverness is not in the software, but in the development model, hardly a triumph of academic CS (especially software engineering) by any measure. The process of creating software is as critical as the design of the software. Maybe more so. This sounds like a snob sniffing that that kind of work isn't done by proper gentlemen! 8 What is Systems Research these days? Web caches, web servers, file systems, network packet delays, all that stuff. Performance, peripherals, and applications, but not kernels or even user-level applications. Ya know why? Because that's where the problems are today. You have an intra-network serving 5,000+ locations. Each location needs access to applications that are run in central locations, and any downtime stops money coming into your company; a project that's coming soon will have servers at each store communicating with the other sites as well. How do you make that work well? What's the best architecture for serving the applications? For the network? What are the implications of allowing the public some limited access to resources on those networks? This guy just doesn't like that his chosen specialization has reached a plateau. His name's familiar, though I can't remember where from. (Oh, and he's wrong about big changes in hardware. The essential architecture is largely the same; what has changed is the process of manufacturing the hardware, which allowed for higher speeds. Heck, outside the CPU, bus speeds are still below 100MHz.) To be a viable computer system, one must honor a huge list of large, and often changing, standards: TCP/IP, HTTP, HTML, XML, CORBA, Unicode, POSIX, NFS, SMB, MIME, POP, IMAP, X, ... A huge amount of work, but if you don't honor the standards you're marginalized. Estimate that 90-95% of the work in Plan 9 was directly or indirectly to honor externally imposed standards. Only two of those -- POSIX and TCP/IP -- are operating-system issues. The rest are applications. He's talking out of his ass, here.
Posted by: Robert Crawford 2006-03-02 16:08
#6 What SPoD said.
Posted by: lotp 2006-03-02 16:05
#5 Fred's and my in-line comments stepped on each other, but here's my take in summary. My perspective? I'm a researcher in Artificial Intelligence now, but was a practitioner for 25 years before coming over to the academic side. DOD paid for the development of TCP/IP and part of the MULTICS project from which UNIX spun off. For over a decade they remained mostly research environments, as academics played with them and DOD used them, learning how to exploit them well. When telecomms and chip technology were ready, they exploded and spawned 15 years of applications technologies, including the Web. Even when this was written, in 2000, there was lots of long-term research in progress on most of the areas Pike mentions. Operating environments? Consider functional languages like Haskell, which Nokia has been using in its cell and smart phones for several years now. As we learn more about scalability, expect functional environments to replace standard operating systems in more and more places. Component software? Pike missed the fact that those standards he bemoans have resulted in enterprise application integration. An insurance company client of mine in 1999-2000, the time of this talk, had already put their entire corporation on an integrated system in which whole application programs and databases were replaceable components. Not only the technologies, but even business processes have been changed out seamlessly since then. GUIs? Check out the research in 2000 on adaptive user interfaces at this DOD project site. Here's another site, a bit heavy on the comp sci and decision theory and AI jargon, but the screen shots give you an idea of the kinds of things that are already working in research environments. He's right about Microsoft, tho, since they've been publishing about their work in adaptive interfaces since 1999, if not earlier. Can't make money on software? Tell that to the guys in Redmond. Or talk to me - I've taken successful hardware and software products into niche markets quite profitably. Bottom line: this talk reads like the lament of a guy who no longer gets to have plush 5-year budgets and who is out of touch with a technical and business environment in which his company no longer gets to set the technical terms of play. That's why Nokia is out there with Haskell-based devices and both Lucent and Bell Labs are struggling.
Posted by: lotp 2006-03-02 16:00
#4 The issue is that the OS has become almost irrelevant and will be completely irrelevant in another generation or so. It is just not where the action is anymore, because the OS has solved the problems that it was created for. Time to move on, until a major paradigm shift occurs. Paradigm shifts can not be predicted nor engineered; they just happen, usually by accident. I don't think people are doing major research on how to make a better paper clip anymore either.
Posted by: Dave 2006-03-02 15:43
#3 Good comment, Fred. As the not-so-typical end user, I have my 2¢. Most people expect devices that have processors to function like a toaster. They just have to work. As someone who started out with CP/M on the "personal computer" and using terminals on mainframes, we have come a heck of a long way. He is wrong about Linux; it is not a clone. It has glue that lets it act like UNIX, but it leaves UNIX in the dust. Its kernel is changing on a daily basis. UNIX is ossified. Microsoft isn't a "problem". End user expectations are the problem. Microsoft has a monopoly in people's minds to some extent. My wife uses OS X at work and Mandriva 2006 Linux on her computer at home. It took almost zero time for her to adapt to not having a Microsoft operating system when her Windows computer died. All the applications she used on her Windows computer are on her Linux computer. In some respects her computing experience has improved by the switch to a Linux computing system. But it really doesn't matter. It's just a tool. You use the system that works for you. If it wasn't working for her I would know it. I would go get an HP or Dell PC, buy and load the apps she wanted, and that would be that. Microsoft is just too expensive for me to run. Free software suits my needs better. I don't have thousands of dollars invested in applications. But if I need a Microsoft operating system and application, I will buy it. It's been a long time since I needed one. The days of me having a dual-booting system are just about over. I have an empty 300 gig drive here waiting for a Linux OS upgrade when I get in the mood. I will not be buying a copy of XP and making an MS file system partition on it. I just don't need a Microsoft OS any more. My word for him: AT&T Labs are history, get over it.
Posted by: SPoD 2006-03-02 15:42
#2 3. Take an existing thing and tweak it. This is how new discovery happens. Weed cutting with nylon string would have been laughed at in 1960...
Posted by: Visitor 2006-03-02 15:30
#1 Good arguments, Fred.
Posted by: 3dc 2006-03-02 15:01