2006-03-02 Science & Technology
Systems Software Research is Irrelevant
Posted by Fred 2006-03-02 13:33

#1 Good arguments, Fred.
Posted by 3dc 2006-03-02 15:01

#2 3. Take an existing thing and tweak it.

This is how new discovery happens. Weed cutting with nylon string would have been laughed at in 1960...
Posted by Visitor 2006-03-02 15:30

#3 Good comment, Fred.

As a not-so-typical end user, here's my two cents.

Most people expect devices that have processors in them to function like a toaster: they just have to work.

As someone who started out with CP/M on the "personal computer" and using terminals on mainframes, I can say we've come a heck of a long way.

He is wrong about Linux; it is not a clone. It has glue that lets it act like UNIX, but it leaves UNIX in the dust. Its kernel is changing on a daily basis. UNIX is ossified.

Microsoft isn't a "problem". End user expectations are the problem. Microsoft has a monopoly in people's minds to some extent. My wife uses OS X at work and Mandriva 2006 Linux on her computer at home. It took almost zero time for her to adapt to not having a Microsoft operating system when her Windows computer died. All the applications she used on her Windows computer are on her Linux computer. In some respects her computing experience has improved by the switch to a Linux computing system. But it really doesn't matter. It's just a tool. You use the system that works for you. If it wasn't working for her, I would know it. I would go buy an HP or Dell PC, load the apps she wanted, and that would be that.

Microsoft is just too expensive for me to run. Free software suits my needs better. I don't have thousands of dollars invested in applications. But if I need a Microsoft operating system and application, I will buy it. It's been a long time since I needed one. The days of me having a dual-booting system are just about over. I have an empty 300 gig drive here waiting for a Linux OS upgrade when I get in the mood; I will not be buying a copy of XP and making an MS file system partition on it. I just don't need a Microsoft OS any more.

My word for him: AT&T Labs are history, get over it.
Posted by SPoD 2006-03-02 15:42 [http://sockpuppetofdoom.blogspot.com/]

#4 The issue is that the OS has become almost irrelevant and will be completely irrelevant in another generation or so. It is just not where the action is anymore, because the OS has solved the problems it was created for. Time to move on, until a major paradigm shift occurs. Paradigm shifts cannot be predicted or engineered; they just happen, usually by accident. I don't think people are doing major research on how to make a better paper clip anymore, either.
Posted by Dave 2006-03-02 15:43

#5 Fred's and my in-line comments stepped on each other, but here's my take in summary. My perspective? I'm a researcher in Artificial Intelligence now, but was a practitioner for 25 years before coming over to the academic side.

DOD paid for the development of TCP/IP and part of the MULTICS project from which UNIX spun off. For over a decade they remained mostly research environments, as academics played with them and DOD used them, learning how to exploit them well. When telecomms and chip technology were ready, they exploded and spawned 15 years of applications technologies including the Web.

Even when this was written, in 2000, there was lots of long-term research in progress on most of the areas Pike mentions.

Operating environments? Consider functional languages like Haskell, which Nokia has been using in its cell and smart phones for several years now. As we learn more about scalability, expect functional environments to replace standard operating systems in more and more places.
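
To make that concrete: the property that matters for scalability is purity - no shared state - so work can be farmed out across cores or machines without locks. Here's a minimal sketch of the idea, in Python rather than Haskell so more readers can run it; the workload and numbers are invented for illustration:

    from multiprocessing import Pool

    def price_contract(contract):
        """Pure function: the result depends only on the argument.
        No shared state is read or written, so calls can run in any
        order, on any core, without locks."""
        principal, rate, years = contract
        return principal * (1 + rate) ** years

    if __name__ == "__main__":
        contracts = [(1000.0, 0.05, n) for n in range(1, 10001)]

        # Because the function is pure, going parallel is just
        # swapping map() for a process pool's map().
        serial = list(map(price_contract, contracts))
        with Pool() as pool:
            parallel = pool.map(price_contract, contracts)

        assert serial == parallel  # deterministic either way

That "it's just a map" property is what functional environments buy you as the hardware underneath gets more parallel.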

Component software? Pike missed the fact that those standards he bemoans have resulted in enterprise application integration. An insurance company client of mine in 1999-2000, the time of this talk, had already put their entire corporation on an integrated system in which whole application programs and databases were replaceable components. Not only the technologies, but even business processes have been changed out seamlessly since then.

GUIs? Check out the research in 2000 on adaptive user interfaces at this DOD project site. Here's another site, a bit heavy on the comp sci and decision theory and AI jargon, but the screen shots give you an idea of the kinds of things that are already working in research environments. He's right about Microsoft, tho, since they've been publishing about their work in adaptive interfaces since 1999, if not earlier.

Can't make money on software? Tell that to the guys in Redmond. Or talk to me - I've taken successful hardware and software products into niche markets quite profitably.

Bottom line: this talk reads like the lament of a guy who no longer gets plush 5-year budgets and who is out of touch with a technical and business environment in which his company no longer gets to set the technical terms of play. That's why Nokia is out there with Haskell-based devices while Lucent and Bell Labs are struggling.
Posted by lotp 2006-03-02 16:00

#6 What SPoD said.
Posted by lotp 2006-03-02 16:05

#7 Hardware has changed dramatically; software is stagnant.

What freakin' planet is he living on?

If you spend your life inside the theory of operating systems, OK, things probably look stagnant. Out here in the Real World, where we're solving Real Problems for Real People, new techniques and new ideas are coming faster than we can absorb them.

This'll probably sound odd, but the dotcom bust forced things to mature, and that opened up a hell of a lot of opportunities. What this guy's doing is the software equivalent of claiming architecture ended when the arch came along.

Besides, Linux’s cleverness is not in the software, but in the development model, hardly a triumph of academic CS (especially software engineering) by any measure.

The process of creating software is as critical as the design of the software. Maybe more so. This sounds like a snob sniffing that that kind of work isn't done by proper gentlemen!

What is Systems Research these days? Web caches, web servers, file systems, network packet delays, all that stuff. Performance, peripherals, and applications, but not kernels or even user-level applications.


Ya know why? Because that's where the problems are today.

You have an intra-network serving 5,000+ locations. Each location needs access to applications that are run in central locations, and any downtime stops money coming into your company; a project that's coming soon will have servers at each store communicating with the other sites as well.

How do you make that work well? What's the best architecture for serving the applications? For the network?

What are the implications of allowing the public some limited access to resources on those networks?
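
One common answer to the first question - hedged, because the right architecture depends on the apps - is to replicate the central servers and put simple failover logic in each store's client. A sketch, with made-up hostnames and a made-up endpoint:

    import random
    import urllib.request

    # Hypothetical replicated application servers at the central sites.
    SERVERS = ["app1.hq.example.com", "app2.hq.example.com",
               "app3.dr.example.com"]

    def fetch(path, timeout=5):
        """Try the replicas in random order: randomization spreads the
        5,000 stores' load, and one dead server costs a timeout per
        request instead of a store outage."""
        last_error = None
        for host in random.sample(SERVERS, len(SERVERS)):
            try:
                url = "http://" + host + path  # path starts with "/"
                with urllib.request.urlopen(url, timeout=timeout) as resp:
                    return resp.read()
            except OSError as e:  # timeouts, refused connections, DNS
                last_error = e
        raise RuntimeError("all replicas failed: %s" % last_error)

Bread-and-butter distributed-systems work, and it's being done in industry, not in kernel papers.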

This guy just doesn't like that his chosen specialization has reached a plateau. His name's familiar, though I can't remember where from.

(Oh, and he's wrong about big changes in hardware. The essential architecture is largely the same; what has changed is the manufacturing process, which allowed for higher speeds. Heck, outside the CPU, bus speeds are still below 100MHz.)

To be a viable computer system, one must honor a huge list of large, and often changing, standards: TCP/IP, HTTP, HTML, XML, CORBA, Unicode, POSIX, NFS, SMB, MIME, POP, IMAP, X, ... A huge amount of work, but if you don't honor the standards you're marginalized. Estimate that 90-95% of the work in Plan 9 was directly or indirectly to honor externally imposed standards.


Only two of those -- POSIX and TCP/IP -- are operating-system issues. The rest are applications. He's talking out of his ass here.
Posted by Robert Crawford 2006-03-02 16:08 [http://www.kloognome.com/]

#8 (Linux’s interface isn’t even as good as Windows!)

Pure BS. I'd rather use vi than 90% of the editors on Windows.

Heck, I often use vi on Windows.
Posted by Robert Crawford 2006-03-02 16:10 [http://www.kloognome.com/]

#9 Working in a mature technology isn't much fun. This guy should think about a career change. English? They're into deconstruction.
Posted by Nimble Spemble 2006-03-02 16:15

#10 Only two of those -- POSIX and TCP/IP -- are operating-system issues. The rest are applications. He's talking out of his ass here.

I'd disagree a little on that point, RC. What he's bemoaning is that the infrastructure for large-scale systems consists of these technologies being glued together, rather than engineered-from-the-ground-up systems software.

Much as I'm critical of him in my comment above, he's not entirely wrong. DOD and others are having to invest huge sums of money and manpower to figure out how to secure this mess of stuff, what requirements to set for procuring new (secure) systems, etc. Corporations are spending huge amounts of money on the data and apps side too. If there were breakthroughs in systems software design, they could conceivably make a huge difference.

I just think he was out of touch with what WAS going on in research at the very time he was speaking. Those efforts haven't yet gelled the way the technologies that launched the Web did, tho, so he doesn't see them.
Posted by lotp 2006-03-02 16:17

#11 With multi-CPU cores and cell processors, single computers will be turned into clusters that are part of networked clusters, and aspects of his original Plan 9 operating system will start migrating into the real world. Perhaps his research was ahead of the hardware?
BTW... the Plan 9 GUI is really sad.

Posted by 3dc 2006-03-02 16:38

#12 Even into the 1980s, much systems work revolved around new architectures (RISC, the iAPX 432, Lisp machines).

In the 1986-1988 timeframe DOD put together a panel of academics and industry people and asked them to recommend the level at which they should set standards for computing. Should they pick a CPU architecture? A language and CPU? or ???

DARPA was pushing the MIPS, Inc. RISC chip, which they had funded heavily, and envisioned virtual machines on top of the chip for various applications. Fairchild was pushing a CISC chip, but they had just got themselves sold to Japan.

The eventual recommendation was to standardize on the high-level language (Ada) and let the rest all change underneath it, since so many advances were occurring in chip technologies. DARPA was pretty disappointed that a RISC chip with a virtual machine on top wasn't the choice, but those of us who were doing things in avionics, space-based systems, etc. breathed a sigh of relief...

And meanwhile, digital signal processors were making incredible strides and many systems now have both standard CPUs and DSPs as needed.
Posted by lotp 2006-03-02 16:40

#13 This is a funny discussion. I've been in IT for three decades and have been around the barn a few times. This sounds like the typical bit-head who can't quite make the connection between real people doing real work and all the pretty technical bells and whistles that float his boat. (How's that for mixing a metaphor?)

I've worked on PCs with 24K of memory, where the big thing (literally) in floppies was 8 inches and 1 MB. I worked on a PC that could hot-swap between the two competing OSes, DOS and CP/M; and on and on.

You know what? Not one user gave a damn. They wanted a computer to be like a fork. You pick it up, it does its job... end of discussion.

Ivory tower types have their place, just not in public. ;^)
Posted by AlanC 2006-03-02 16:44

#14 Well said, Alan.
Posted by Visitor 2006-03-02 16:50

#15 I worked for LSI Logic (formerly Symbios, Symbios Logic, NCR Microelectronics) as a software test engineering technician, running a software test team. Our basic function was to test software compatibility between a dozen or so different software manufacturers and our own in-house SCSI (Small Computer Systems Interface) hardware and software. Basically, we tested our interface hardware and supporting software with Microsoft (Windows 3.X/95/98/2000/ME/NT), OS/2, Unix, SCO, Linux, and a half-dozen other minor operating systems, network software and hardware, peripheral devices, and applications software packages. We also used as wide a variety of hardware as possible - different processors, different chipsets, different manufacturer labels, etc. There are hundreds of improvements, both internal and external, that the computer industry can make to increase performance, if it's willing. One of the first things the entire industry needs to learn is that "one size fits all" doesn't satisfy anybody. I use three different spreadsheets because there are things I can do with one that the other two won't do - mostly things the spreadsheet designer never considered. I use two different word processors and FOUR internet browsers, all for the same reason.

The biggest problem "software researchers" have is their own narrow-mindedness. They never expect people to use their products except in the way they were designed to work, while the user public wants things that make THEIR life easier. Business users want one thing, home users want something else, and professionals want still something different. Even within these rather loose divisions, there are layers - experienced users, semi-experienced, and total novices. Yet software is designed to meet a "general" user's needs. It doesn't work. Until the "software industry" learns that it needs to create products that address the different needs of each subset of its clients, it will continue to meet very few of the needs of anyone.
Posted by Old Patriot 2006-03-02 16:51 [http://oldpatriot.blogspot.com/]

#16 My 2c worth - in my time, I've worked on OSes, compilers, and distributed systems.

The problem in a nutshell is that 'systems research' is an oxymoron. Or put another way, he is trying to apply the scientific method as practiced by academics to the development of complex systems, and it doesn't apply.

The scientific method drives broad theoretical understanding through investigation and identification of facts (in the context of those theories). Whilst there are theoretical constructs that need to be elaborated concerning complex systems, they have no (or almost no) impact on the actual development of complex software systems, which are developed through an almost completely heuristic process.

It may sound counter-intuitive to some people, but as systems get more complex, they necessarily become more heuristic, because fewer and fewer people understand the system in its entirety. In fact, most modern software systems have already passed the point where any one person can understand them.

lotp is right in that engineered systems built on solid systems theories could exist (assuming the theories existed). However, in practice they don't exist.
Posted by phil_b 2006-03-02 17:00 [http://autonomousoperation.blogspot.com/]

#17 Phil_b and lotp have it just about right... but there's one thing that is very hard for the academic community to grasp.

Software AND THE USE THEREOF have a large subjective / artistic / creative component.

When teaching RDB to students I like to start with a simple physical exercise. I break the students into groups and give each group a set of blocks. There are 3 large square blocks, one red, one blue, and one green; then 3 medium-sized square blocks and 3 small ones the same way; there are also round and triangle blocks in the same combinations.

I ask the groups of students to sort the blocks into some rational order. The catch is that there is no correct answer, and virtually every group comes up with different answers.

Then I ask them: how does the user want them sorted?

The point of the ramble is that there is no way to scientifically solve the "problem".
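
The exercise replays easily in code, which makes the point sharper: every ordering below is internally consistent, and nothing in the data says which one is right. A sketch, with the block set matching the one described above:

    from itertools import product

    # 3 shapes x 3 sizes x 3 colors = 27 blocks, as handed to the groups.
    shapes = ["square", "round", "triangle"]
    sizes = ["large", "medium", "small"]
    colors = ["red", "blue", "green"]
    blocks = list(product(shapes, sizes, colors))

    # Three groups, three equally "rational" sort orders.
    by_color_size = sorted(blocks, key=lambda b: (colors.index(b[2]),
                                                  sizes.index(b[1])))
    by_shape_color = sorted(blocks, key=lambda b: (shapes.index(b[0]),
                                                   colors.index(b[2])))
    by_size = sorted(blocks, key=lambda b: sizes.index(b[1]))

    # All three are valid total orders over the same 27 blocks,
    # yet no two agree. The missing input is the user's intent.
    assert by_color_size != by_shape_color != by_size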
Posted by AlanC 2006-03-02 17:12

#18 Yesterday, having done both taxes and FAFSA (kid's student loan), I got a perfect example of what can go wrong with software.

The FAFSA site refuses to work with anything but IE or Netscape (not even Mozilla). On top of that, they don't do paper anymore, so if the kid wants a student loan... IT'S THEIR WAY OR THE HIGHWAY.

The IRS site is a tad better, but it will not allow you to e-file for free unless your gross is < $50K. If it's greater, you have to pay to use TurboTax or the like. TurboTax implies Windows - an M$ requirement to e-file.

So the web experience on these government sites can make one curse the software. (Rightly or wrongly, the web presence becomes software in the user's eyes.)

On another point... I have Xen 3.0 running on some old hardware downstairs. This recent operating environment with true virtualization solves the DARPA Ada-vs-MIPS issue lotp was describing... oh, the currently running virtual OSes on that basement junk box? CentOS and Debian...
Posted by 3dc 2006-03-02 17:48

#19 Lots of expertise and experience here at the Burg!

A couple of points.

3dc, when I ran a group that produced language tools (compilers, link editors, symbolic debuggers with embedded chip simulators) for the military real-time world (avionics, flight control systems, space-based systems), along with a hard real-time OS for embedded systems, there was no way on God's green earth those apps could execute successfully in a virtual, emulated environment.

We're talking microseconds for very complex calculations that do things like shoot down missiles or keep fly-by-wire fighters in the air. So unfortunately your box in the basement wouldn't have solved DOD's problem (then or now), although it suits nicely for some other application domains. Which makes the point that there is no such thing as a single solution ....

Alan, I put out a bunch of words, so you might not have noticed that I spent 25 years in the "real world" doing software and managing software engineering (and some hardware design) before I went academic.

I didn't miss the fact that "THE USE (of software has) a large subjective / artistic / creative component." And neither has the academic comp sci world. That's exactly what adaptive user interfaces are all about. When these move into general use, the users will focus on how they want to work and what they want the software to look like.

That said, a good portion of my practitioner career involved "hard" real-time applications, or at least softer real-time process control and/or communications systems. I've written or managed projects ranging from a couple hundred lines of elegant code to a system that had about 4 million lines of high level source code. In some of those systems, 'creativity' was not what we wanted, needed or could tolerate.

Elegant design? Absolutely. Tight code and innovative algorithms? You bet - if they were designed and written so as to make it possible to verify their behavior and validate that they met the requirement spec.

The pilots of planes whose avionics and flight control systems were built with our code were glad we weren't 'creative'. ;-)
Posted by lotp 2006-03-02 18:15

#20 I ask the groups of students to sort the blocks into some rational order. The catch is that there is no correct answer, and virtually every group comes up with different answers.

Then I ask them: how does the user want them sorted?

The point of the ramble is that there is no way to scientifically solve the "problem".


Sure - that's pretty obvious WRT user interfaces.

When you talk about the architecture underneath them, however, there ARE scientific ways to characterize better or poorer approaches. Mai-Hwa Chen's complexity metric for distributed enterprise systems is one.
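
I can't reconstruct Chen's formula from memory, so take this as a hypothetical illustration of the general idea only: score candidate architectures on measurable structure instead of taste. Here a crude coupling count distinguishes a full mesh from a hub-and-spoke design (component names invented):

    def coupling_score(dependencies):
        """Count directed component-to-component dependencies."""
        return sum(len(targets) for targets in dependencies.values())

    # Candidate A: every component talks to every other one.
    mesh = {c: [d for d in "ABCD" if d != c] for c in "ABCD"}

    # Candidate B: everything goes through one integration hub.
    hub = {"hub": ["A", "B", "C", "D"], "A": ["hub"], "B": ["hub"],
           "C": ["hub"], "D": ["hub"]}

    print(coupling_score(mesh))  # 12, and it grows as n*(n-1)
    print(coupling_score(hub))   # 8, and it grows as 2*n

A real metric weighs much more than edge counts, but even this toy version makes "better or poorer" a number you can argue about.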
Posted by lotp 2006-03-02 18:17

#21 Never meant to denigrate your experience, lotp. While I never did anything military, I have worked with some real-time control software for things like cutters and slitters for roll goods. The last few years I've been doing a lot more managing, like the current migration of two dozen systems with ~50 TB of data from Solaris to AIX.

The point I was trying to make was that there is almost always more than one way to skin the software cat, and the choices can rarely if ever be truly said to be scientific beyond a certain point. Elegant is a word more frequently used aesthetically than scientifically (I had one boss who named me Elegant Alan for my propensity to turn out tight code).

As far as those pilots of yours go, I bet they were quite appreciative of your creativity whether you were or not. ;^)

But, again, the main point is that Mr. Pike doesn't seem to know that the $$$$ are with the users for whom there is such a thing as good enough.

Posted by AlanC 2006-03-02 18:41

#22 Yup. When real $$ are at stake, users want good enough and not any fancier. Dead right, Alan!
Posted by lotp 2006-03-02 19:31

#23 One of the real big revolutions in software is the work on component architectures, the most visible being the Eclipse development environment.
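
The core of that component idea fits in a few lines: the platform publishes extension points, plugins register against them, and neither side knows the other at build time. A toy sketch of the pattern (names invented; Eclipse's real mechanism is Java plugins on the OSGi model):

    # Extension point name -> handlers contributed by plugins.
    registry = {}

    def contributes_to(point):
        """Decorator a plugin uses to hook an extension point."""
        def register(func):
            registry.setdefault(point, []).append(func)
            return func
        return register

    # A "plugin" adds behavior the host never knew about.
    @contributes_to("editor.save")
    def backup_on_save(path):
        print("backing up", path)

    # The host fires the extension point, running whatever plugged in.
    def fire(point, *args):
        for handler in registry.get(point, []):
            handler(*args)

    fire("editor.save", "notes.txt")  # -> backing up notes.txt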
Posted by Ptah 2006-03-02 20:45 [http://www.crusaderwarcollege.org]

#24 My word for him: AT&T Labs are history, get over it.

Heh. Rob Pike now works for Google. As does Guido van Rossum.

Rob's a great man; he was one of the key Unix developers at Bell Labs. You should read his book with Kernighan, "The Practice of Programming". It's a marvel of clarity.
Posted by KBK 2006-03-02 23:54
