Good ideas are hard to predict, both before and after they are introduced as prototypes. The push-button (touch-tone) telephone was conceptually launched to the consumer market at the 1962 Seattle World's Fair, 21 years after the device was invented by Bell Telephone Laboratories. Another 20 years after the Seattle exposition, touch-tone phones finally penetrated 50 percent of American homes. A general lack of tech frenzy, and monopoly pricing control, slowed adoption. But it's also true that the new phones didn't solve a fundamental problem. They sped up dialing, which addressed a real but non-essential user-interface annoyance of rotary dial phones.
Each year at CES, tech enthusiasts get a chance to glimpse prototypical ideas and guess whether they will endure. In doing so, one question should remain central: "What problem is being solved?"
Identifying problems -- or, more precisely, anticipating which problems the marketplace will pay money to solve -- is nearly as difficult as inventing solutions. Plenty of good tech becomes roadkill when shipped to market. Will the Samsung T9000 refrigerator, with its built-in custom tablet, gain traction? It's a worthy idea: build Evernote right into the fridge, the shopping list's point of crisis. But we already have Evernote on our phones, so success depends on the perceived value of fine-tuning convenience. Hard to predict.
Big challenges should lead to big products. Three of the most important avenues leading us into the future of technology are mobile, cloud connectivity and big data. The first two are closely related as consumer concerns, while the third exists primarily in the enterprise market. This year's CES was soft in the realm of mobile innovation, but MWC could make up for that shortcoming next month.
Two broad trends did characterize the strengths of CES 2013. The first: the fringe invaded the main arena of product ideas. Spearheading that incursion was the Pebble smart watch, whose hive-raised capital of $10 million is as much a business story, and a futuristic tale of grass-roots venture funding, as it is a tech story. On both counts its utter coolness might outweigh its lack of essential problem-solving.
The second trend is old-school: TV. Any facile generalization of CES's complexity is a gloss, but for many dazzled observers CES 2013 was the OLED conference. OLED (Organic Light-Emitting Diode) and 4K (a separate technology that can be paired with OLED) are quickly becoming buzzwords representing the next evolutionary step in high-def.
Does a higher level of display beauty solve a problem? If so, cynics and reasonable people of all stripes would assert it is a first-world problem. More immersive home entertainment screens certainly aim at satisfying a quest as old as television itself: escaping worldly reality into virtual reality. As long as consumers buy into that mission, television display technology will evolve until it feels as real as a lucid dream. We all want a 55-inch OLED screen, but it's hard to imagine adopting it as a necessity the way people do their smartphones, even after the prices enter breathable atmosphere.
Regarding OLED's strange capacity to show multiple program streams on one screen, the jury is way out, gone for the day, left the state. This multi-dimensional feature doesn't pass my personal WTF test, when I observe a world in which families are contentedly fractioned into personal device bubbles. Why would anyone want to sob over Downton Abbey while sitting next to their teenaged, adrenalized Xbox junkie? Putting together that scenario doesn't solve any problem I'm aware of. Multiple screens have long been America's solution to diverse family programming.
In the compass of TV intelligence, prognosis is informed by history. For more than 15 years product imagineers have been questing for television screens to displace computers and regain supremacy in the living room. I've written before about the misguided, insistent belief of TV builders that people want a computer-like interactive relationship with their televisions. The quixotic pursuit of unwanted computer smarts in the TV both recycles old ideas and introduces more preposterous new ones.
Samsung's Smart Hub bundles social apps and photo display with standard TV content and on-demand programming. Perhaps the Twitter and Facebook functions are included with a "what the hell, might as well" attitude, but they poorly solve a non-problem that mobile devices have already solved superbly. Updating the minutiae of one's life is not happily accomplished on a gigantic screen tethered to the wall.
More nonsensically, Panasonic brings out Swipe and Share, which can pull photos from Android and iOS devices. The system provides a digital pen, for (put down any hot drink you might be holding right now) photo editing. Now listen. I'm willing to be wrong about any projection I make. Being eventually proved wrong is a reasonable price to pay for the satisfaction of calling out antic absurdity, and I'm here to say, as I have been saying for many years, that nobody wants their TV to be a productivity machine.
To emerge from my fevered indictment and end on an upbeat note, I have to like how LG's Magic Remote is evolving. Voice-recognized search for programs or networks is an "Of course!" idea that grows more solution-oriented as the problem of too much content becomes more intense. It's interesting: the original remote control solved a prickly TV-watching problem, which is that you had to get up and approach the thing to change channels. (Again, first world, but still.) As choices multiplied, the remote control's bristling mass of tiny buttons became the problem. LG's remote represents a kind of TV smarts I can believe in.
Brad Hill is a former Vice President at AOL, and the former Director and General Manager of Weblogs, Inc. He talks to his remote, which has no voice recognition.