Science & Technology
Systems Software Research is Irrelevant
2006-03-02
Rob Pike, Bell Labs
Feb 21, 2000

1 A Polemic
This talk is a polemic that distills the pessimistic side of my feelings about systems research these days. I won't talk much about the optimistic side, since lots of others can do that for me; everyone's excited about the computer industry. I may therefore present a picture somewhat darker than reality. However, I think the situation is genuinely bad and requires action.

2 Thesis
Systems software research has become a sideline to the excitement in the computing industry. When did you last see an exciting noncommercial demo? Ironically, at a time when computing is almost the definition of innovation, research in both software and hardware at universities and much of industry is becoming insular, ossified, and irrelevant. There are many reasons, some avoidable, some endemic. There may be ways to improve the situation, but they will require a community-wide effort.

3 Definitions...

4 A Field in Decline
"Who needs new operating systems, anyway?" you ask. Maybe no one, but then that supports my thesis.

"But now there are lots of papers in file systems, performance, security, web caching, etc.," you say. Yes, but is anyone outside the research field paying attention?
This is the central part of his thesis: that there are really only two operating system choices in the world, *nix and Windows. But he's ignoring the Jurassic period of computing, when the "home" or business machine was young: CP/M, various menu-driven systems, the venerable TRS-80, Geos, the Amiga, the Atari ST... That was a pretty Darwinian period, where an OS could pop up on Monday and be gone by Tuesday afternoon, as I'll discuss below...
5 Systems Research's Contribution to the Boom...
Hardware has changed dramatically; software is stagnant.
1990
  Hardware: 33 MHz MIPS R3000, 32 megabytes of RAM, 10 Mb/s Ethernet
  Software: Unix, X Windows, Emacs, TCP/IP

2000
  Hardware: 600 MHz Alpha or Pentium III, 512 megabytes of RAM, 100 Mb/s Ethernet
  Software: Unix, X Windows, Emacs, TCP/IP, Netscape

2006
  Hardware: 3.2 GHz AMD64, 1-2 GB of RAM, 55 Mb/s wireless Ethernet
  Software: Unix/Linux, Windows, X Windows, Emacs + others, TCP/IP, Opera, USB
Good progression, though there are a few things missing, like IPX/SPX...
6 Where is the Innovation?
Microsoft, mostly. Exercise: Compare 1990 Microsoft software with 2000.
That'd be Windows 3.0, or maybe even 2.0, compared to Windows 2000.
If you claim that's not innovation, but copying, I reply that Java is to C++ as Windows is to the Macintosh: an industrial response to an interesting but technically flawed piece of systems software.
Both are evolutions from base systems that won their particular competitions.
If systems research were relevant, we'd see new operating systems and new languages making inroads into the industry, the way we did in the '70s and '80s. Instead, we see a thriving software industry that largely ignores research, and a research community that writes papers rather than software.
And here's the basic flaw in his reasoning: he's looking at the past and expecting the future to be a continuation of it. Looking at it from a different standpoint, the base form of Unix won its competition early on: a tiny kernel and the basic file system layout. Everything after that is elaboration. It was a thing of beauty, simple and intuitive. It's just like Bridge 1.0, invented by Og and Zug in 9276 B.C. Everything since has been refinement, and the competing approaches are best forgotten.
7 Linux
Innovation? New? No, it's just another copy of the same old stuff. OLD stuff. Compare program development on Linux with Microsoft Visual Studio or one of the IBM Java/web toolkits.
In 2000, Linux was still in its infancy. The early efforts were geared much more toward the academics and the hacker community. In 2006 we're looking at products that are approaching full maturity, ready for prime time.
Linux's success may indeed be the single strongest argument for my thesis: The excitement generated by a clone of a decades-old operating system demonstrates the void that the systems software research community has failed to fill. Besides, Linux's cleverness is not in the software, but in the development model, hardly a triumph of academic CS (especially software engineering) by any measure.
If you regard the basic Unix approach as a building block, Linux is a good development, more than just "clever." I can remember using SCO Xenix and QNX on early PC boxes, and they were actually more mature (for the time) than Linux was the first time I looked at it. Both were proprietary, and both fell by the wayside because the Linux development model beat them out. MS-DOS beat out its competition in a similar manner, running not just on IBM PCs, but also on 8086 clones. The Mac was pretty, in many ways more technologically sophisticated, but it didn't compete — PCs were cheaper, the software was cheaper, and users weren't restricted to the Apple brand. Both the development and marketing models are tied to the system.
8 What is Systems Research these days?
Web caches, web servers, file systems, network packet delays, all that stuff. Performance, peripherals, and applications, but not kernels or even user-level applications.
Once the Wright brothers figured out how to make airplanes, there wasn't a need for Boeing to reinvent them. The same applies to software. Despite the bells and whistles that are hung on new versions of "Office" products, they remain basically the same, with word processing, spreadsheet, presentation, and perhaps database components. The spreadsheet, the original "killer app," didn't exist before Visicalc, and hasn't changed a whole lot since Excel 3.0. "Killer apps" are pretty few and far between.
Mostly, though, it's just a lot of measurement; a misinterpretation and misapplication of the scientific method. Too much phenomenology: invention has been replaced by observation. Today we see papers comparing interrupt latency on Linux vs. Windows. They may be interesting, they may even be relevant, but they aren't research.
80 percent of Academe is like 80 percent of everything else, made up of hacks and placeholders. Another 10 percent is actually destructive, which leaves 10 percent to do the actual thinking.
In a misguided attempt to seem scientific, there's too much measurement: performance minutiae and bad charts. By contrast, a new language or OS can make the machine feel different, give excitement, novelty. But today that's done by a cool web site or a higher CPU clock rate or some cute little device that should be a computer but isn't. The art is gone. But art is not science, and that's part of the point. Systems research cannot be just science; there must be engineering, design, and art.

9 What Happened?
A lot of things:...

10 PC
Hardware became cheap, and cheap hardware became good. Eventually, if it didn't run on a PC, it didn't matter because the average, mean, median, and mode computer was a PC. Even into the 1980s, much systems work revolved around new architectures (RISC, iAPX/432, Lisp Machines). No more. A major source of interesting problems and, perhaps, interesting solutions is gone.
They went prior to the development of the graphical browser, which would have allowed them to coexist, though in niche markets, kind of like where Solaris is now. The PC's advantage, like that of Linux, by the way, was its open architecture. All advantages bring with them disadvantages, but the architecture's been changing with new developments — albeit at a fairly stately rate due to the necessity of having (most) everyone agree on standards.
Much systems work also revolved around making stuff work across architectures: portability. But when hardware's all the same, it's a non-issue... And that's just the PC as hardware; as software, it's the same sort of story.

11 Microsoft
Enough has been said about this topic. (Although people will continue to say lots more.) Microsoft is an easy target, but it's a scapegoat, not the real source of difficulty. Details to follow.
Microsoft makes an easy target for many because it tries to take a lowest common denominator approach, while gouging as much cash from its adoring public as the traffic will bear. It beat out its competition by not being proprietary: recall using Samna Write/Ami Pro and Word Perfect on Windows 3.0/3.1, and Lotus 1-2-3, and Netscape, and any number of other products. Because of its corporate strategy, Microsoft either incorporated the best of the best into Windows (remember when Netscape used to cost money?) or tried to do a better job with its own product at the same price (Microsoft Office).
12 Web
The web happened in the early 1990s and it surprised the computer science community as much as the commercial one. It then came to dominate much of the discussion, but not to much effect. Business controls it. (The web came from physicists and prospered in industry.)
There's a Year 2000 statement. Six years later, with the development of dynamic web content, business is still on the web, with advertising and order entry systems and such, but so is everyone else, to include millions of bloggers using pretty easy-to-use but very sophisticated software.
Bruce Lindsay of IBM: HDLC → HTTP/HTML; 3270s have been replaced by web browsers. (Compare with Visicalc and PC.)
Heh heh. I haven't seen a 3270 emulator in years.
Research has contributed little, despite a huge flow of papers on caches, proxies, server architectures, etc.
But there are lots of refinements on the basic model. The refinements make the software more usable, reaching a bigger audience, which provides its own level of feedback to make the software still more usable... From the management standpoint, we've gone from the base development cycle to the cash cow period, and I'm pretty sure we're still at an early stage in that.
13 Standards
To be a viable computer system, one must honor a huge list of large, and often changing, standards: TCP/IP, HTTP, HTML, XML, CORBA, Unicode, POSIX, NFS, SMB, MIME, POP, IMAP, X, ... A huge amount of work, but if you don't honor the standards you're marginalized. Estimate that 90-95% of the work in Plan 9 was directly or indirectly to honor externally imposed standards. At another level, instruction architectures, buses, etc. have the same influence. With so much externally imposed structure, there's little slop left for novelty.
All those standards are building blocks, and each and every one of them could be superseded by a better approach. If you can come up with a better approach, by all means do so; until you do, these work, and they've been refined to the point where they work very well.
Plus, commercial companies that ‘own’ standards, e.g. Microsoft, Cisco, deliberately make standards hard to comply with, to frustrate competition. Academia is a casualty.
That's maybe a legitimate gripe, though my heart doesn't bleed for academia. There's a balance between standards and confinement. And note that six years after the presentation was written Cisco actually has competition in its arena, despite being guardian of the standards.
14 Orthodoxy
Today's graduating PhDs use Unix, X, Emacs, and TeX.
Users, I'd point out, don't...
That's their world. It's often the only computing world they've ever used for technical work.
This has slowly changed over the past six years, though I'm not sure academia realizes yet that it's behind the times compared to the world at large, a situation reversed from the way things were 20 years ago...
Twenty years ago, a student would have been exposed to a wide variety of operating systems, all with good and bad points. New employees in our lab now bring their world with them, or expect it to be there when they arrive. That's reasonable, but there was a time when joining a new lab was a chance to explore new ways of working.
Now you're likely to find a slightly different set of building blocks. I read somewhere that systems that belong to suit-and-tie guys are almost always Windows/IIS, and systems that belong to the sweatshirt and jeans set are almost always Linux/Apache. But if you're going to be in the wonderful world of IT, you've got to get used to the idea of going with what the customer uses. Sometimes you can show him/her/it a better way, and sometimes you make a lot of money fixing mistakes made 10 years ago. That's actually the fun of it, for some of us.
Narrowness of experience leads to narrowness of imagination. The situation with languages is a little better—many curricula include exposure to functional languages, etc.—but there is also a language orthodoxy: C++ and Java.
They're pretty common in the want ads, too, so don't discount them, even though on the job you'll likely end up using Visual Basic or PHP or C.
In science, we reserve our highest honors for those who prove we were wrong. But in computer science...

15 Change of scale
With so many external constraints, and so many things already done, much of the interesting work requires effort on a large scale. Many person-years are required to write a modern, realistic system. That is beyond the scope of most university departments. Also, the time scale is long: from design to final version can be five years. Again, that’s beyond the scope of most grad students. This means that industry tends to do the big, defining projects—operating systems, infrastructure, etc.— and small research groups must find smaller things to work on.

Three trends result:
1. Don't build, measure. (Phenomenology, not new things.)
2. Don't go for breadth, go for depth. (Microspecialization, not systems work.)
3. Take an existing thing and tweak it.
I believe this is the main explanation of the SOSP curve.

16 Unix
New operating systems today tend to be just ways of reimplementing Unix. If they have a novel architecture—and some do—the first thing to build is the Unix emulation layer. How can operating systems research be relevant when the resulting operating systems are all indistinguishable? There was a claim in the late 1970s and early 1980s that Unix had killed operating systems research because no one would try anything else. At the time, I didn’t believe it. Today, I grudgingly accept that the claim may be true (Microsoft notwithstanding).

A victim of its own success: portability led to ubiquity. That meant architecture didn't matter, so now there's only one. Linux is the hot new thing... but it's just another Unix.
I go back to my comments on competition. Unix was the best system at the time, and its base system probably still is, not due to its features but due to its simplicity. Likewise the C language displaced, for all practical purposes, Pascal and ALGOL and Ada and a host of other languages, not because of its "rich programming environment" but because of its simplicity. You can take a core C implementation, without any libraries (to include the very basics like stdio.h), and build an entire new implementation. Or, if you're a glutton for punishment, you can use that core to do the very same things you could do with the libraries, only with more typing.
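A minimal sketch of that claim, assuming only a POSIX kernel underneath (the names my_strlen and my_puts are made up for illustration): one of stdio's basics rebuilt from the raw write(2) system call, with more typing, exactly as advertised.

    /* Rebuild a stdio basic (puts) from the bare write(2) syscall alone.
     * Illustrative only; assumes a POSIX system. */
    #include <unistd.h>                     /* write(), STDOUT_FILENO */

    static long my_strlen(const char *s)    /* strlen, owing nothing to string.h */
    {
        const char *p = s;
        while (*p)
            p++;
        return p - s;
    }

    static int my_puts(const char *s)       /* puts, built on the raw syscall */
    {
        if (write(STDOUT_FILENO, s, my_strlen(s)) < 0)
            return -1;
        return write(STDOUT_FILENO, "\n", 1) < 0 ? -1 : 0;
    }

    int main(void)
    {
        return my_puts("hello from a stdio-free world") ? 1 : 0;
    }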
17 Linux—the Academic Microsoft Windows
The holy trinity: Linux, gcc, and Netscape. Of course, it's just another orthodoxy.
In six years Netscape has dropped off the list, partially eaten by its competitors.
These have become icons not because of what they are, but because of what they are not: Microsoft. But technically, they're not that hot.
See my previous comment about simplicity. They're icons because they're simple. If they get too elaborate, like Netscape did, they'll be displaced. Netscape is now mostly Mozilla, but it's still piggishly slow compared to Opera — and compared to IE.
And Microsoft has been working hard, and I claim that on many (not all) dimensions, their corresponding products are superior technically. And they continue to improve. Linux may fall into the Macintosh trap: smug isolation leading to (near) obsolescence. Besides, systems research is doing little to advance the trinity.
In the early days, Linux flirted with that trap, with the "if it's not easy to write software, why should it be easy to use it?" syndrome. But I just set up a Linux partition on my laptop, and Ubuntu went on just as smoothly as Windows did.
18 Startups
Startups are the dominant competition for academia for ideas, funds, personnel, and students. (Others are Microsoft, big corporations, legions of free hackers, and the IETF.) In response, government-funded and especially corporate research is directed at very fast ‘return on investment’. This distorts the priorities: Research is bent towards what can make big money (IPO) in a year.
That paragraph was fizzling out even as he spoke.
Horizon is too short for long-term work. (There go infrastructure and the problems of scale.)

Funding sources (government, industry) perceive the same pressures, so there is a vicious circle.

The metric of merit is wrong.

Stanford now encourages students to go to startups because successful CEOs give money to the campus. The new president of Stanford is a successful computer entrepreneur.
I'm not familiar with who he's talking about. I wonder if he's still a successful computer entrepreneur?
19 Grandma
Grandma's on line. This means that the industry is designing systems and services for ordinary people.
That's actually a pretty brilliant idea, come to think of it...
The focus is on applications and devices, not on infrastructure and architecture, the domain of systems research. The cause is largely marketing, the result a proliferation of incompatible devices. You can't make money on software, only hardware, so design a niche gimmick, not a Big New Idea.

Programmability—once the Big Idea in computing—has fallen by the wayside. Again, systems research loses out.
Grandma brings other things to the table, though. We now have a huge user base, with most users no more familiar with the innards of their computers than they are with the innards of their toaster ovens. What we've done is build a tool, a completed product. The dangers now include falling into the Microsoft trap — making the tool longer, lower, leaner, wider, with more road-hugging weight and 40 percent more cheese in the interests of bringing out new models every two years; and of becoming elitists with overly complicated systems that let us sneer at the unenlightened who aren't bright like we are, so they can't make the systems work.
20 Things to Do
Startups are too focused on short time scale and practical results to try new things. Big corporations are too focused on existing priorities to try new things. Startups suck energy from research. But gold rushes leave ghost towns; be prepared to move in.
I think we've pretty much moved on from the dot-com "revolution."
Go back to thinking about and building systems. Narrowness is irrelevant; breadth is relevant: it's the essence of system. Work on how systems behave and work, not just how they compare. Concentrate on interfaces and architecture, not just engineering. Be courageous. Try different things; experiment. Try to give a cool demo.
It's a tool. Concentrate on making the tool work better, in more environments.
Funding bodies: fund more courageously, particularly long-term projects. Universities, in turn, should explore ways to let students contribute to long-term projects.

Measure success by ideas, not just papers and money. Make the industry want your work.

21 Things to Build
There are lots of valid, useful, interesting things to do. I offer a small sample as evidence. If the field is moribund, it's not from a lack of possibilities.

Only one GUI has ever been seriously tried, and its best ideas date from the 1970s. (In some ways, it's been getting worse; today the screen is covered with confusing little pictures.) Surely there are other possibilities. (Linux's interface isn't even as good as Windows!)
He said that six years ago, recall. My Gnome desktop is prettier and more usable in most respects than Windows...
There has been much talk about component architectures but only one true success: Unix pipes. It should be possible to build interactive and distributed applications from piece parts.
2000 talking again. Since then, XML, RSS, WiFi, Bluetooth, thin clients, and about a dozen other neato things...
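To make the pipes point concrete, here is a minimal C sketch (mine, not anything from the talk) of what "building applications from piece parts" means at the system-call level: the pipeline "ls | wc -l" spelled out with nothing but pipe(2), fork(2), exec, and wait.

    /* The C spelling of the shell pipeline `ls | wc -l`:
     * two off-the-shelf parts glued together by a pipe. */
    #include <stdio.h>          /* perror */
    #include <unistd.h>         /* pipe, fork, dup2, execlp, _exit */
    #include <sys/wait.h>       /* wait */

    int main(void)
    {
        int fd[2];
        if (pipe(fd) < 0) { perror("pipe"); return 1; }

        if (fork() == 0) {              /* first stage: ls writes into the pipe */
            dup2(fd[1], STDOUT_FILENO);
            close(fd[0]); close(fd[1]);
            execlp("ls", "ls", (char *)NULL);
            perror("execlp ls"); _exit(127);
        }

        if (fork() == 0) {              /* second stage: wc -l reads from it */
            dup2(fd[0], STDIN_FILENO);
            close(fd[0]); close(fd[1]);
            execlp("wc", "wc", "-l", (char *)NULL);
            perror("execlp wc"); _exit(127);
        }

        close(fd[0]); close(fd[1]);     /* parent: drop both ends, reap children */
        while (wait(NULL) > 0)
            ;
        return 0;
    }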
The future is distributed computation, but the language community has done very little to address that possibility. The Web has dominated how systems present and use information: the model is forced interaction; the user must go get it. Let's go back to having the data come to the user instead.
You mean like web services?
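In code terms, the "data comes to the user" model is essentially publish/subscribe: register interest once, and the producer delivers each update to you, instead of you polling for it. A toy sketch (every name in it is hypothetical):

    /* Push instead of pull: consumers subscribe once, the producer
     * fans each update out to them. Illustrative names throughout. */
    #include <stdio.h>

    #define MAX_SUBS 8

    typedef void (*subscriber_fn)(const char *item);

    static subscriber_fn subs[MAX_SUBS];
    static int nsubs;

    static void subscribe(subscriber_fn fn)  /* "I want this data", said once */
    {
        if (nsubs < MAX_SUBS)
            subs[nsubs++] = fn;
    }

    static void publish(const char *item)    /* one publish reaches everyone */
    {
        for (int i = 0; i < nsubs; i++)
            subs[i](item);
    }

    static void alice(const char *item) { printf("alice got: %s\n", item); }
    static void bob(const char *item)   { printf("bob got:   %s\n", item); }

    int main(void)
    {
        subscribe(alice);
        subscribe(bob);
        publish("news item #1");             /* the data travels to the users */
        return 0;
    }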
System administration remains a deeply difficult problem. Unglamorous, sure, but there's plenty of room to make a huge, even commercial, contribution.
That'd be more GUI, with the added benefit that most "GUI" functions can be delivered over a browser and the added disadvantage that when your GUI breaks you'd better have a command-line backup option, or you've just changed your machine into a doorstop...
22 Conclusions
The world has decided how it wants computers to be. The systems software research community influenced that decision somewhat, but very little, and now it is shut out of the discussion. It has reached the point where I doubt that a brilliant systems project would even be funded, and if funded, it wouldn't find the bodies to do the work. The odds of success were always low; now they're essentially zero.

The community—universities, students, industry, funding bodies—must change its priorities.

The community must accept and explore unorthodox ideas.

The community must separate research from market capitalization.
Posted by: Fred

#24  My word for him, AT&T Labs are history, get over it.

Heh. Rob Pike now works for Google. As does Guido v. Rossum.

Rob's a great man; he's one of the original developers of Unix. You should read his book, "The Practice of Programming". It's a marvel of clarity.
Posted by: KBK   2006-03-02 23:54  

#23  One of the real big revolutions in software is the work on component architectures, the most visible being the Eclipse development environment.
Posted by: Ptah   2006-03-02 20:45  

#22  Yup. When real $$ are at stake, users want good enough and not any fancier. Dead right, Alan!
Posted by: lotp   2006-03-02 19:31  

#21  Never meant to denigrate your experience, lotp. While I never did anything military I have worked with some real time control software for things like cutters and slitters for roll goods. The last few years I've been doing a lot more managing, like the current migration of 2 dozen systems with ~50 TB of data from Solaris to AIX.

The point I was trying to make was that there is almost always more than one way to skin the software cat and the choices can rarely if ever be truly said to be scientific beyond a certain point. Elegant is a word more frequently used aesthetically than scientifically (I had one boss who named me Elegant Alan for my propensity to turn out tight code).

As far as those pilots of yours go, I bet they were quite appreciative of your creativity whether you were or not. ;^)

But, again, the main point is that Mr. Pike doesn't seem to know that the $$$$ are with the users for whom there is such a thing as good enough.

Posted by: AlanC   2006-03-02 18:41  

#20  I ask the groups of students to sort the blocks into some rational order. The catch is that there is no correct answer and virtually every group comes up with different answers.

Then I ask them how the user wants them?

The point of the ramble is that there is no way to scientifically solve the "problem".


Sure - that's pretty obvious WRT user interfaces.

When you talk about the architecture underneath them, however, there ARE scientific ways to characterize better or poorer approaches. Mai-Hwa Chen's complexity metric for distributed enterprise systems is one.
Posted by: lotp   2006-03-02 18:17  

#19  Lots of expertise and experience here at the Burg!

A couple of points.

3dc, when I ran a group that produced language tools (compilers, link editors, symbolic debuggers with embedded chip simulators) for the military real-time world (avionics, flight control systems, space-based systems) along with a hard real-time OS for embedded systems there was no way on God's green earth those apps could execute successfully in a virtual, emulated environment.

We're talking microseconds for very complex calculations that do things like shoot down missiles or keep fly-by-wire fighters in the air. So unfortunately your box in the basement wouldn't have solved DOD's problem (then or now), although it suits nicely for some other application domains. Which makes the point that there is no such thing as a single solution ....

Alan, I put out a bunch of words, so you might not have noticed that I spent 25 years in the "real world" doing software and managing software engineering (and some hardware design) before I went academic.

I didn't miss the fact that "THE USE (of software has) a large subjective / artistic / creative component." And neither has the academic comp sci world. That's exactly what adaptive user interfaces are all about. When these move into general use, the users will focus on how they want to work and what they want the software to look like.

That said, a good portion of my practitioner career involved "hard" real-time applications, or at least softer real-time process control and/or communications systems. I've written or managed projects ranging from a couple hundred lines of elegant code to a system that had about 4 million lines of high level source code. In some of those systems, 'creativity' was not what we wanted, needed or could tolerate.

Elegant design? Absolutely. Tight code and innovative algorithms? You bet - if they were designed and written so as to make it possible to verify their behavior and validate that they met the requirement spec.

The pilots of planes whose avionics and flight control systems were built with our code were glad we weren't 'creative'. ;-)
Posted by: lotp   2006-03-02 18:15  

#18  Yesterday, having done both taxes and FAFSA (kid's student loan), I got a perfect example of what can go wrong with software.

The FAFSA site refuses to work with anything but IE or Netscape (not even Mozilla). On top of that they don't do paper anymore, so if the kid wants a student loan... IT'S THEIR WAY OR THE HIGHWAY.

The IRS site is a tad better but will not allow you to efile for free unless your gross is < 50K. If it's greater you have to pay to use TurboTax or the like. TurboTax implies Windows - a M$S requirement to e-file.

So the Web experience on these government sites can make one curse the software. (Rightly or wrongly, the web presence becomes software in the user's eyes.)

On another point... I have XEN 3.0 running on some old hardware downstairs. This recent operating environment with true virtualization solves DARPA's Ada vs. MIPS complaint lotp was complaining about... oh, the currently running virtual OSes on that basement junk box? CentOS and Debian...
Posted by: 3dc   2006-03-02 17:48  

#17  Phil_b & Lotp have it just about right...but there's one thing that is very hard for the academic community to grasp.

Software AND THE USE THEREOF have a large subjective / artistic / creative component.

When teaching RDB to students I like to start with a simple physical exercise. I break students into groups and then give each group a set of blocks. There are 3 large square blocks, one red, one blue and one green; then there are 3 medium sized square blocks and then 3 small; there are also round and triangle blocks the same way.

I ask the groups of students to sort the blocks into some rational order. The catch is that there is no correct answer and virtually every group comes up with different answers.

Then I ask them how the user wants them?

The point of the ramble is that there is no way to scientifically solve the "problem".
Posted by: AlanC   2006-03-02 17:12  

#16  My 2c worth; in my time, I've worked on OSes, compilers and distributed systems.

The problem in a nutshell is 'system research' is an oxymoron. Or put another way, he is trying to apply the scientific method as practiced by academics to the development of complex systems and it doesn't apply.

The scientific method drives broad theoretical understanding through investigation and identification of facts (in the context of those theories). Whilst there are theoretical constructs that need to be elaborated concerning complex systems, they have no (or almost no) impact on the actual development of complex software systems, which proceeds through an almost completely heuristic process.

It may sound counter-intuitive to some people, but as systems get more complex, they necessarily become more heuristic, because fewer and fewer people understand the system in its entirety. In fact, most modern software systems have already passed the point where any one person can understand them.

lotp is right in that engineered systems built on solid systems theories could exist (assuming the theories existed). However, in practice they don't exist.
Posted by: phil_b   2006-03-02 17:00  

#15  I worked for LSI Logic (formerly Symbios, Symbios Logic, NCR Microelectronics) as a software test engineering technician, running a software test team. Our basic function was to test software compatibility between a dozen or so different software manufacturers and our own in-house SCSI (Small Computer Systems Interface) hardware and software. Basically, we tested our interface hardware and supporting software with Microsoft (Windows 3.X/95/98/2000/ME/NT), OS/2, Unix, SCO, Linux, and a half-dozen other minor operating systems, network software and hardware, peripheral devices, and applications software packages. We also used as wide a variety of hardware as possible - different processors, different chipsets, different manufacturer labels, etc. There are hundreds of ways to increase performance, both internally and externally, that the computer industry can make, if it's willing. One of the first things the entire industry needs to learn is that "one size fits all" doesn't satisfy anybody. I use three different spreadsheets because there are things I can do with one that the other two won't do - mostly things the spreadsheet designer never considered. I use two different word processors and FOUR internet browsers, all for the same reason.

The biggest problem "software researchers" have is their own narrow-mindedness. They never expect people to use their products except in the way they designed them to work, while the user-public wants things that make THEIR life easier. Business users want one thing, home users want something else, and professionals want still something different. Even in these rather loose divisions, there are layers - experienced users, semi-experienced, and total novices. Yet software is designed to meet a "general" user's needs. It doesn't work. Until the "software industry" learns that it needs to create products that address the different needs of each subset of its clients, it will continue to meet very few of the needs of anyone.
Posted by: Old Patriot   2006-03-02 16:51  

#14  Well said Alan.
Posted by: Visitor   2006-03-02 16:50  

#13  This is a funny discussion. I've been in IT for 3 decades and have been around the barn a few times. This sounds like the typical bit-head that can't quite make the connection between real people doing real work, and all the pretty technical bells and whistles that float his boat. (how's that for mixing a metaphor?)

I've worked on PCs where it was 24K of memory and the big thing (literally) in floppies was 8" and 1 MB. I worked on a PC that could hot swap between the two competing OSs, DOS and CP/M; and on and on.

You know what? Not one user gave a damn. They wanted a computer to be like a fork. You pick it up, it does its job... end of discussion.

Ivory tower types have their place, just not in public. ;^)
Posted by: AlanC   2006-03-02 16:44  

#12  Even into the 1980s, much systems work revolved around new architectures (RISC, iAPX/432, Lisp Machines).

In the 1986-1988 timeframe DOD put together a panel of academics and industry people and asked them to recommend the level at which they should set standards for computing. Should they pick a CPU architecture? A language and CPU? or ???

DARPA was pushing the MIPS, Inc. RISC chip, which they had funded heavily, and envisioned virtual machines on top of the chip for various applications. Fairchild was pushing a CISC chip, but they had just got themselves sold to Japan.

The eventual recommendation was to standardize on the high level language (Ada) and let the rest all change underneath it, since so many advances were occurring in chip technologies. DARPA was pretty disappointed that a RISC chip with a virtual machine on top wasn't the choice, but those of us who were doing things in avionics, space-based systems etc. breathed a sigh of relief....

And meanwhile, digital signal processors were making incredible strides and many systems now have both standard CPUs and DSPs as needed.
Posted by: lotp   2006-03-02 16:40  

#11  With multi-CPU cores and cell processors, single computers will be turned into clusters that are part of networked clusters, and aspects of his original Plan 9 operating system will start migrating into the real world. Perhaps his research was ahead of the hardware?
BTW... the Plan 9 GUI is really sad.

Posted by: 3dc   2006-03-02 16:38  

#10  Only two of those -- POSIX and TCP/IP -- are operating-system issues. The rest are applications. He's talking out of his ass, here.

I'd disagree a little on that point, RC. What he's bemoaning is that the infrastructure for large-scale systems consists of these technologies being glued together, rather than engineered-from-the-ground-up systems software.

Much as I'm critical of him in my comment above, he's not entirely wrong. DOD and others are having to invest huge sums of money and manpower to figure out how to secure this mess of stuff, what requirements to set for procuring new (secure) systems etc. Corporations are spending huge amounts of money on the data and apps side too. If there were breakthroughs in systems software designs, it conceivably could make a huge difference.

I just think he was out of touch with what WAS going on in the research at the very time he was speaking. They haven't gelled together like the technologies that really launched the Web yet, tho, so he doesn't see them.
Posted by: lotp   2006-03-02 16:17  

#9  Working in a mature technology isn't much fun. This guy should think about a career change. English? They're into deconstruction.
Posted by: Nimble Spemble   2006-03-02 16:15  

#8  (Linux's interface isn't even as good as Windows!)

Pure BS. I'd rather use vi than 90% of the editors on Windows.

Heck, I often use vi on Windows.
Posted by: Robert Crawford   2006-03-02 16:10  

#7  Hardware has changed dramatically; software is stagnant.

What freakin' planet is he living on?

If you spend your life inside the theory of operating systems, OK, things probably look stagnant. Out here in the Real World, where we're solving Real Problems for Real People, new techniques and new ideas are coming faster than we can absorb them.

This'll probably sound odd, but the dotcom bust forced things to mature, and that opened up a hell of a lot of opportunities. What this guy's doing is the software equivalent of claiming architecture ended when the arch came along.

Besides, Linux's cleverness is not in the software, but in the development model, hardly a triumph of academic CS (especially software engineering) by any measure.

The process of creating software is as critical as the design of the software. Maybe more so. This sounds like a snob sniffing that that kind of work isn't done by proper gentlemen!

8 What is Systems Research these days?
Web caches, web servers, file systems, network packet delays, all that stuff. Performance, peripherals, and applications, but not kernels or even user-level applications.


Ya know why? Because that's where the problems are today.

You have an intra-network serving 5,000+ locations. Each location needs access to applications that are run in central locations, and any downtime stops money coming into your company; a project that's coming soon will have servers at each store communicating with the other sites as well.

How do you make that work well? What's the best architecture for serving the applications? For the network?

What are the implications of allowing the public some limited access to resources on those networks?

This guy just doesn't like that his chosen specialization has reached a plateau. His name's familiar, though I can't remember where from.

(Oh, and he's wrong about big changes in hardware. The essential architecture is largely the same, what has changed is the process of manufacturing the hardware, which allowed for higher speeds. Heck, outside the CPU, bus speeds are still below 100MHz.)

To be a viable computer system, one must honor a huge list of large, and often changing, standards: TCP/IP, HTTP, HTML, XML, CORBA, Unicode, POSIX, NFS, SMB, MIME, POP, IMAP, X, ... A huge amount of work, but if you don't honor the standards you're marginalized. Estimate that 90-95% of the work in Plan 9 was directly or indirectly to honor externally imposed standards.


Only two of those -- POSIX and TCP/IP -- are operating-system issues. The rest are applications. He's talking out of his ass, here.
Posted by: Robert Crawford   2006-03-02 16:08  

#6  What SPoD said.
Posted by: lotp   2006-03-02 16:05  

#5  Fred's and my in-line comments stepped on each other, but here's my take in summary. My perspective? I'm a researcher in Artificial Intelligence now, but was a practitioner for 25 years before coming over to the academic side.

DOD paid for the development of TCP/IP and part of the MULTICS project from which UNIX spun off. For over a decade they remained mostly research environments, as academics played with them and DOD used them, learning how to exploit them well. When telecomms and chip technology were ready, they exploded and spawned 15 years of applications technologies including the Web.

Even when this was written, in 2000, there was lots of long-term research in progress on most of the areas Pike mentions.

Operating environments? Consider functional languages like Haskell, which Nokia has been using in its cell and smart phones for several years now. As we learn more about scalability, expect functional environments to replace standard operating systems in more and more places.

Component software? Pike missed the fact that those standards he bemoans have resulted in enterprise application integration. An insurance company client of mine in 1999-2000, the time of this talk, had already put their entire corporation on an integrated system in which whole application programs and databases were replaceable components. Not only the technologies, but even business processes have been changed out seamlessly since then.

GUIs? Check out the research in 2000 on adaptive user interfaces at this DOD project site. Here's another site, a bit heavy on the comp sci and decision theory and AI jargon, but the screen shots give you an idea of the kinds of things that are already working in research environments. He's right about Microsoft, tho, since they've been publishing about their work in adaptive interfaces since 1999, if not earlier.

Can't make money on software? Tell that to the guys in Redmond. Or talk to me - I've taken successful hardware and software products into niche markets quite profitably.

Bottom line: this talk reads like the lament of a guy who no longer gets to have plush 5 year budgets and who is out of touch with a technical and business environment in which his company no longer gets to set the technical terms of play. That's why Nokia is out there with Haskell-based devices and both Lucent and Bell Labs are struggling.
Posted by: lotp   2006-03-02 16:00  

#4  The issue is that the OS has become almost irrelevant and will be completely irrelevant in another generation or so. It is just not where the action is anymore because the OS has solved the problems that it was created for. Time to move on, until a major paradigm shift occurs. Paradigm shifts cannot be predicted nor engineered; they just happen, usually by accident. I don't think people are doing major research on how to make a better paper clip anymore either.
Posted by: Dave   2006-03-02 15:43  

#3  Good Comment Fred.

As the not so typical end user I have my .02¢

Most people expect devices that have processors to function like a toaster. They just have to work.


As someone who started out with CP/M on the "personal computer" and using terminals on mainframes, I'd say we have come a heck of a long way.

He is wrong about Linux: it is not a clone. It has glue that lets it act like UNIX but it leaves UNIX in the dust. Its kernel is changing on a daily basis. UNIX is ossified.

Microsoft isn't a "problem". End user expectations are the problem. Microsoft has a monopoly in people's minds to some extent. My wife uses OS X at work and Mandriva 2006 Linux on her computer at home. It took almost zero time for her to adapt to not having a Microsoft operating system when her Windows computer died. All the applications she used on her Windows computer are on her Linux computer. In some respects her computing experience has improved by the switch to a Linux computing system. But it really doesn't matter. It's just a tool. You use the system that works for you. If it wasn't working for her I would know it. I would go get an HP or Dell PC, buy and load the apps she wanted, and that would be that.

Microsoft is just too expensive for me to run. Free software suits my needs better. I don't have thousands of dollars invested in applications. But if I need a Microsoft operating system and application I will buy it. It's been a long time since I needed one. The days of me having a dual-booting system are just about over. I have an empty 300 gig drive here waiting for a Linux OS upgrade when I get in the mood. I will not be buying a copy of XP and making an MS file system partition on it. I just don't need a Microsoft OS any more.

My word for him, AT&T Labs are history, get over it.
Posted by: SPoD   2006-03-02 15:42  

#2  3. Take an existing thing and tweak it.

This is how new discovery happens. Weed cutting with nylon string would have been laughed at in 1960...
Posted by: Visitor   2006-03-02 15:30  

#1  Good arguments Fred.
Posted by: 3dc   2006-03-02 15:01  
