Dec 99 Viewpoint
Volume Number: 15 (1999)
Issue Number: 12
Column Tag: Viewpoint
Viewpoint
by Kas Thomas
In the Mac world, it's taken as axiomatic that the Mac user experience is better - in every way that counts - than the Windows user experience, and that the Mac, as a platform, offers performance that is all-around superior to Wintel offerings. There is some truth, of course, to the latter statement. But, the former proposition (that the Mac user experience is superior to the next-best alternative) needs critical reexamination.
It's time for us to stand back and ask ourselves how much better the Mac user experience really is. If we're honest with ourselves, we'll say: "Not much."
A visitor from Mars who landed on earth right now would take a look at a Mac computer and a Wintel machine and pronounce them pretty much the same. They both show colorful icons on a desktop. They both coax weird noises out of little speakers. And they both have bloated operating systems that serve up frighteningly arcane error messages from time to time or allow the machine to freeze up altogether.
The user experience offered by a Mac was, at one time, significantly different from - and better than - the computing experience offered by The Other Guys. But, in the past decade, dedifferentiation has caught up with the Mac OS. Apple has missed some important opportunities to maintain its lead in OS design, and it's starting to hurt.
One area where the Mac could clearly redefine the computing experience is in "instant-on" booting. The Mac now takes a step in this direction by offering an enhanced "sleep" mode for desktop machines, to give non-PowerBook users the illusion of an instant-on/no-wait boot process. All well and good. Now Apple should go a step further and try to reduce - not by seconds, but by minutes - the time required for a cold boot of a worst-case machine (i.e., one loaded with the most popular four or five dozen extensions and drivers). Has no one in Cupertino found it ironic that one of the hottest shareware System Extensions on the World Wide Web right now is Marc Moini's Startup Doubler, which cuts Mac cold-boot times roughly in half?
The fact that Startup Doubler has gone double-platinum is telling. It means Apple has work to do.
A more important question, perhaps, is why anyone should ever have to reboot a machine in the first place. An operating system that lets a process (or a user) get so out of control that the host machine crashes, freezes, or goes off into deep space is - some would say - not much of an operating system. There are computer users in the Unix world who literally can't remember the last time their machines locked up, because it's such a rare event. And yet Apple still ships boxes that (let's be honest) regularly lock up or zone out. Isn't this (once again) genuinely Windows-like behavior? Does Apple really want to be associated with this sort of thing any longer?
But, even if we grant that occasionally - however rarely - a machine is bound to encounter a deep error condition of some kind, does that really mean we should have to reboot the machine? Shouldn't there be some kind of sophisticated shadow-caching of the machine's overall state so that when a supposedly unrecoverable condition is entered, the user can back out of it by escaping to a previously known-good state? What's so impossible about that?
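To make the rollback idea concrete, here is a toy sketch in C. It is purely illustrative - the names are invented, and a real implementation would have to snapshot far more than one struct - but it shows the basic pattern of keeping a shadow copy of known-good state plus a saved resume point, and restoring both when things go bad:

    #include <setjmp.h>
    #include <stdio.h>
    #include <string.h>

    /* Toy "known-good state": a shadow copy of some application data plus
       a jump buffer recording where it was last safe to resume execution. */
    typedef struct { char document[256]; } AppState;

    static AppState gLiveState, gShadowState;
    static jmp_buf  gSafePoint;

    static void Checkpoint(void)
    {
        gShadowState = gLiveState;            /* shadow-copy the good state */
    }

    static void FatalError(const char *what)
    {
        printf("Unrecoverable condition: %s - rolling back\n", what);
        gLiveState = gShadowState;            /* restore the shadow copy    */
        longjmp(gSafePoint, 1);               /* escape to the safe point   */
    }

    int main(void)
    {
        strcpy(gLiveState.document, "untitled");

        if (setjmp(gSafePoint) != 0) {        /* arrived here via rollback  */
            printf("Recovered; document is \"%s\" again\n", gLiveState.document);
            return 0;
        }
        Checkpoint();                         /* mark this state known-good */

        strcpy(gLiveState.document, "half-written garbage");
        FatalError("corrupted document");     /* simulate a deep error      */
        return 0;                             /* never reached in this toy  */
    }

Scaling that pattern from one struct up to an entire machine is the hard part, of course; the point is simply that the escape hatch itself is old, cheap technology.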
Snappier overall performance at the OS level is another issue that needs attention. The Mac OS has gotten sluggish over the years, a trend that began in 1991 with System 7. (Again, not coincidentally, this trend has floated a flotilla of bestselling Bandaidware utilities with "Doubler" in the name.) Much of the latency involved in launching Finder processes is, of course, due to hard disk action. Here's another area where Steve could rock people's perceptions - by getting rid of built-in hard disks.
Everything that's done now on hard disks can be done much faster on virtual (RAM) disks. For data protection, RAM-disk contents would be continuously (and automatically) written to the Internet (i.e., to personal or corporate FTP sites). This would require a constant 24/7 net hookup, of course. Those who don't yet have such amenities could buy inexpensive backup storage in the form of CD-R, DVD, or (yes) old-fashioned IDE drives.
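In rough outline, the write path might look like the C sketch below. This is a guess at the architecture, not a blueprint: the block size and names are made up, and a local file stands in for the FTP site (a real system would queue blocks and push them over the network in the background):

    #include <stdio.h>
    #include <string.h>

    /* Hypothetical write-through "RAM disk": every write lands in memory
       first (the fast path) and is then mirrored to a backup stream.
       A local file stands in here for the remote FTP site. */
    enum { kBlockSize = 512, kBlockCount = 64 };

    static char  gRamDisk[kBlockCount][kBlockSize];   /* the working copy */
    static FILE *gMirror;                             /* the slow backup  */

    static int WriteBlock(int blockNum, const char *data)
    {
        if (blockNum < 0 || blockNum >= kBlockCount) return -1;

        memcpy(gRamDisk[blockNum], data, kBlockSize);  /* instant, in RAM */

        /* Mirror the same block to durable storage. */
        fseek(gMirror, (long)blockNum * kBlockSize, SEEK_SET);
        fwrite(gRamDisk[blockNum], kBlockSize, 1, gMirror);
        return 0;
    }

    int main(void)
    {
        char block[kBlockSize] = "hello from the RAM disk";

        gMirror = fopen("mirror.img", "wb");
        if (gMirror == NULL) return 1;

        WriteBlock(3, block);      /* reads come straight from gRamDisk     */
        fclose(gMirror);           /* the mirror survives a crash or reboot */
        return 0;
    }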
The only legitimate technical objection, it seems to me, to using DRAM as default storage would be cost. But, think what a unilateral decision by Apple to use DRAM for data storage would do to DRAM prices! (Can anyone say 'USB'?)
Soon it may actually be feasible just to write one's backup data to a Cisco backbone, a "ping network," or a satellite uplink. Not to any given medium, mind you - not to the nodes at either end of the backbone, but to the backbone itself. Not to the satellite, but to the space between the earth and the satellite. Imagine that you could fire data skyward at 100 megabits per second (a 100 MHz signaling rate is certainly feasible; that's down in the low FM band). Suppose there is a one-second roundtrip time to the satellite. If you were to write continuously to the sky, there would always be 100 megabits of your data in transit - 100 megabits in storage. To retrieve any part of it, just wait for the part you want, and grab it. (The reader is left to prove that the average "seek time" would be 500 msec.)
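For the skeptical, the arithmetic is short enough to check in a few lines of C. The 100-megabit rate and the one-second roundtrip are the assumptions from the paragraph above, not measurements:

    #include <stdio.h>

    /* Back-of-the-envelope check of the "storage in transit" numbers:
       capacity = data rate x round-trip time, and the average wait for
       any particular bit to come back around is half the loop time. */
    int main(void)
    {
        const double rateBitsPerSec = 100e6;   /* 100 megabits per second */
        const double roundTripSec   = 1.0;     /* assumed satellite RTT   */

        double bitsInFlight = rateBitsPerSec * roundTripSec;   /* 1e8 bits */
        double avgSeekSec   = roundTripSec / 2.0;              /* 0.5 sec  */

        printf("In-flight storage: %.0f megabits\n", bitsInFlight / 1e6);
        printf("Average seek time: %.0f msec\n",     avgSeekSec * 1000.0);
        return 0;
    }

Halve the roundtrip and you halve both the capacity and the average seek time - which is the delay-line tradeoff in a nutshell.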
Apple has done a good job over the years - maybe a bit too good - of "polishing pixels" in the ever-precious Mac operating system, even going to the trouble of adding a Pixel Polish Manager (I believe the politically correct name is Appearance Manager) to the OS. No question about it, we have the shiniest pixels.
But, it's time to recognize that meaningful progress in operating systems no longer has the slightest thing to do with widget design, dialog appearances, contextual menus, personalization, Login by Voice Recognition, or (God help us) balloon help.
What the world could use right now is a personal supercomputer that comes on like a light bulb, has zero hard-disk latency, and never locks up.
Let's hope the Other Guys don't do it first.