Entelligence: Stains on the sleeve of my operating system

Entelligence is a column by technology strategist and author Michael Gartenberg, a man whose desire for a delicious cup of coffee and a quality New York bagel is dwarfed only by his passion for tech. In these articles, he'll explore where our industry is and where it's going -- on both micro and macro levels -- with the unique wit and insight only he can provide.

I originally started this column as my take on what an Apple tablet might be (I literally dreamed about it and started writing it down when I woke up). I was really into it, which explains why I didn't save as I wrote. I think you can see where this is going.

Like a cartoon character who notices he's no longer standing on solid ground and suddenly begins to fall, I reached over to save, but it was too late. My trusty XP install blue-screened. Muttering a few choice words, I rebooted, only to hit another blue screen. No problem, there's always "safe mode." Too bad safe mode blue-screened as well. With little hope of recovering anything, I gave up, fired up my Mac and started from scratch. It's not the first time I've lost work on my computer for one reason or another, and I suspect it's happened to a few of you out there too.

But this latest bad experience turned my thoughts from Apple tablets to what's wrong with the whole PC landscape and today's operating systems.


On the eve of the Windows 7 and Snow Leopard launches (which appear to be the best efforts yet from Microsoft and Apple, respectively), the basic platforms we all use remain largely unchanged from concepts introduced decades ago, for all their speed enhancements and evolutionary refinements. It doesn't matter whether you're running some flavor of Windows, UNIX, Mac OS X or Linux -- this isn't about OS wars. Every modern PC and operating system retains the flaws of initial designs that date back to a time when scarce system resources forced compromises. It's a little like the urban legend about a garment vendor who sent a designer coat overseas to be copied, only to get back 10,000 coats with an identical stain on the sleeve. Yes, early PCs had to be told to save, and it would have been ludicrous to try to track revisions on an 88KB floppy disk. But all that should be history by now.

There have been some pretty interesting attempts in the past, at least conceptually, to change things. PenPoint from Go used a unique set of system-wide gestures to control the device, and its architecture allowed documents nested within documents that could be zoomed in and out. Apple's Newton OS was one of the first operating systems to save everything created by default. The SwyftCard, created by the late Jef Raskin, could take an Apple IIe from cold boot to exactly where you left off in six seconds.

Isn't it past time to retire hierarchical file folders that attempt to re-create my filing cabinet? Even with universal search, there has to be a better way to store and retrieve information. Why, for example, does almost every program force users to save their work? Going from Ctrl-K-D as the save command in WordStar to Ctrl-S isn't much of an achievement. Why can't we build on a real-world model that keeps everything I create by default and makes discarding, not saving, the explicit act? You can even extend the metaphor and keep every revision, since modern hard drives hold hundreds of gigabytes. Yet with release after release, vendors add features that no one uses or cares about -- a problem that reached its nadir with the Microsoft Office Assistant, which was so universally reviled that Microsoft actually sent out press releases announcing Clippy's death when it was removed.
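
To make the idea concrete, here's a minimal sketch -- in Python, with hypothetical names, not any shipping API -- of what a "keep everything by default" document model could look like. Every edit is persisted as an immutable revision the moment it happens; the only explicit operation is discarding, never saving.

```python
import time
from dataclasses import dataclass, field

@dataclass
class Revision:
    timestamp: float
    content: str

@dataclass
class Document:
    """A document that persists every change by default.

    There is no explicit "save": each edit appends an immutable
    revision, so a crash can only cost the keystrokes typed since
    the last captured change.
    """
    revisions: list = field(default_factory=list)

    def edit(self, content: str) -> None:
        # Every edit is captured automatically -- no Ctrl-S required.
        self.revisions.append(Revision(time.time(), content))

    @property
    def current(self) -> str:
        return self.revisions[-1].content if self.revisions else ""

    def revert_to(self, index: int) -> None:
        # Rolling back is recorded as a new revision too,
        # so even the rollback can be undone.
        self.edit(self.revisions[index].content)

doc = Document()
doc.edit("Apple tablet draft, v1")
doc.edit("Apple tablet draft, v2 -- rewritten intro")
doc.revert_to(0)
print(doc.current)  # prints: Apple tablet draft, v1
```

An in-memory list stands in here for what a real OS would keep on disk; the point is the default behavior, not the storage.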

Yes, I know there are settings for things like auto-save, but why are the default settings of applications so often designed to punish users who don't follow the programmers' rules or understand the interface? Don't get me wrong -- we're making progress, but most of it has been restricted to mobile platforms. webOS, iPhone OS and Android have all driven some degree of innovation beyond the PC desktop. Even apps are trending that way: Tweetie, one of the most popular Twitter clients, debuted on the iPhone before it was available for the Mac. Now we need to bring some of that OS-level innovation back to the PC desktop and take personal computing to the next level as well.

One reason no one has been able to unseat Microsoft from desktop operating system dominance is that no one has offered a fundamentally different product that actually fixes these longstanding issues. There are arguably better products out there, but the stains on the sleeve remain, and it's time for some revolutionary innovation that truly changes the computing experience. What do you think -- are today's platforms really revolutionary, or are there still too many throwbacks to ideas that are thirty years old?


Michael Gartenberg is vice president of strategy and analysis at Interpret, LLC. His weblog can be found at gartenblog.net, and he can be emailed at gartenberg AT gmail DOT com. Views expressed here are his own.