Editorial: The future comes slowly, but revolutions are worth waiting for

During a trip to Switzerland, my family started off on a day hike to reach the nearby foothills of a mountain. It looked doable, but as time passed the range seemed to recede before our approach. After many hours we turned around, having apparently failed to close any distance.

Crossing from now to the future in technology can likewise seem illusory. When we scrutinize and celebrate each tiny incremental invention as if it were a milestone, we lose track of time as if we were counting grains of sand dropping through an hourglass. Game-changing inventions are rare, separated by epochs in which progress adds up to a lot of sameness. Futurism is an unforgiving business. But sometimes, as in the cases of cloud computing and media convergence, redemption comes with patience.

Convergence

In the 1990s, putting the internet on TV was a conceptual holy grail of technology mash-ups. The prediction was grounded in the obvious: both realms expressed themselves on screens. Bash the two screens into one screen, merge the two experiences, and you're in a gleaming new commerce-optimized future. Visionaries of this future imagined television shows enhanced with clickable product placements that would allow viewers to instantly purchase a pair of pants worn by their favorite sitcom character. It's not surprising that many convergence dreams are spun from commercial instincts, the idea that users want to buy merchandise at every hour of the day being a particularly cynical one.

The vision was also afflicted by key misunderstandings of consumer demand. Believers minimized or ignored the profound lifestyle conflicts between computer use and television use: pulled content vs. pushed content; leaning forward vs. sitting back; desk vs. couch; keyboard vs. remote; information vs. entertainment; home office vs. living room.

A more effective and benignly motivated roadmap for putting internet content on TV screens was WebTV, established by Steve Perlman in 1996. WebTV was the most attention-getting product launch of that year. (I was WebTV's media spokesperson before and during the launch.) Created for users who didn't own computers, either because of fear or economics, WebTV displayed email and specially formatted views of web sites on televisions hooked up to a set-top box. WebTV was acquired by Microsoft in 1997, and re-branded MSN TV. It served 800,000 subscribers after three years of operation, and is still available today.

MSN TV aside, the premise of blending the internet with television languished for years. Its recent revival has been powered by two developments that were unforeseen in the 90s. First, in a reversal of previous conceptions, television programming came to the computer screen via Hulu, network websites, Netflix, and other online distribution points. Media owners became newly willing to separate programming from both the broadcast schedule and the living room television screen, an accommodation forged in the heat created by DVRs, P2P sharing, and YouTube.

The second factor is the creation of WiFi-connected boxes which inject television shows back into the television from the internet. ("Still watching Netflix on your laptop? With Roku, you can stream it right to your TV. Where it was meant to be.") This ironic circularity is the rotary engine of TV / internet convergence. If systems like Roku, Apple TV, Boxee, and Google TV seem to resemble MSN TV, the usage scenarios are opposite. The new devices are entertainment systems wedged into the home theater; MSN TV is still an email / browser alternative to traditional computing. And when the new boxes are reduced to operating systems, the TV / internet union is sealed shut, contained within an internet-enabled television set.

Why did it take nearly 20 years? Partly because the early visionaries overvalued the relatively new computing experience and undervalued the old-school couch potato experience. It's also worth remembering that watching TV shows via Hulu on your TV is not the last word in convergence of media realms. Technology and user demand have created new distribution paths; now creative resources are treading those paths. As the dividing line between TV shows and web shows erodes, a deeper and truer media convergence will be accomplished. Ultimately there will be no such thing as an unconnected screen. In that golden era the source of content, now a stigmatic identifier, will be meaningless.

Cloud computing

If all screens become equal, and all are networked, what are they connected to? The internet is fundamentally bifocal, a client-server system. Our local devices -- computers, handhelds, game boxes -- serve as hinges between the global network and individuals. They are clients looking upward and servers looking downward. When we store music on a computer hard drive, that computer is a tune server. When we store our songs in the cloud, either by uploading files or subscribing to a service, our computers and mobile devices are equal-footed music terminals.

By definition, servers are smarter than clients, because they have the content. Terminals are dumb. The dumber, the better in a system optimized for mobility. Just as the perfect television is a simple screen connected to the internet's brain, so it goes for computers. That's the core idea of the Chromebook, a fast and cheap screen running a browser and lightweight OS optimized for a connected suite of services.

The famous Sun Microsystems motto, "The network is the computer," was much repeated during the web's infancy in the 90s. As an anticipation of the future, it contained two parts: web services (for productive computing) and storage (to replace local drives). Online pros projected what is now cloud computing during an era when a typical Windows machine contained a hard drive measured in megabytes, and processing speeds were painful. The internet was aboriginal too: small, undeveloped, and running primitive markup code. The race was on. At stake: where would the brain live? Dispersed among millions of terminals, or amorphously centralized in the network?

The balance of that race has been shifted by mobile. It's no longer only a question of whether the computer is a better local server than the internet. When normal individuals own many personal screens, they want random access to their content and services equally across them all, as if the screens were a multiplexed terminal attached to one computer. The network is the computer. That's where the brain should be for a mobile, connected citizenry.

It has taken a long time for cloud computing to get off the ground, and there is a long way to go. Low-cost, universal, cordless mobile connectivity would help, but political roadblocks are the most persistent. Development and adoption of application web services have been slow. How many companies are using Office 365? (Not mine.) The vast corporate footprint of legacy computer brain power anchors the potential loft of cloud computing.

We dropped a robot onto Mars last week, so it is churlish to complain of being mired in the present. The future, where the grass is always greener, attracts us to its gleaming fields. The wait, with its complications and twisting pathways, can feel interminable.


Brad Hill is the VP, Audience Development at AOL. He is the former Director and General Manager of Weblogs, Inc.