Bill Atkinson, the developer of HyperCard and MacPaint (among others), spoke about interface design this morning at the Macworld Industry Conference.
Not so long ago, Atkinson noted, users were quite separate from the computing experience. UI design evolved to the now familiar desktop metaphor, then stopped. This is going to change with mobile devices.
Today, we edge closer to integrating the computer with ourselves, Atkinson said. "Pinch and zoom" is only a beginning; as physical contact with content increases, it will spawn both good and bad ideas. Augmented reality, for example, overlays boxes of information on your screen, and video goggles hint at future wearable computers.
We're still far, though, from the "floating displays" of Minority Report and Avatar.
Atkinson went on to discuss what he calls a "memory prosthesis": a wearable earpiece that communicates with, for instance, an iPhone. The earpiece could work with a Virtual Personal Assistant that accepts data from the user's conversational speech. The assistant would hear, see, speak, and record everything around you, storing that data in the cloud. You could later retrieve the information by asking the assistant questions. Of course, all of this hinges on natural language recognition, which has been a holy grail of computing for some time.
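To make the idea concrete, here is a minimal, purely hypothetical sketch of that memory-prosthesis loop: utterances are recorded with a timestamp (an in-memory list stands in for cloud storage), and questions retrieve the best-matching memory. The class name, the data shapes, and the naive keyword-overlap scoring are all illustrative assumptions; the real natural language recognition Atkinson describes is precisely the unsolved part.

```python
# Hypothetical sketch of the "memory prosthesis" idea: record everything
# heard, store it (here, an in-memory list standing in for the cloud),
# and answer questions later. Naive keyword overlap stands in for real
# natural language recognition, which is the actual hard problem.

from dataclasses import dataclass, field


@dataclass
class MemoryProsthesis:
    # Each memory is a (timestamp, utterance) pair.
    memories: list[tuple[str, str]] = field(default_factory=list)

    def record(self, timestamp: str, utterance: str) -> None:
        """Store everything heard, as an always-on earpiece would."""
        self.memories.append((timestamp, utterance))

    def ask(self, question: str) -> str:
        """Return the stored utterance sharing the most words with the question."""
        q_words = set(question.lower().split())
        best = max(
            self.memories,
            key=lambda m: len(q_words & set(m[1].lower().split())),
            default=(None, "no memories recorded"),
        )
        return f"{best[0]}: {best[1]}" if best[0] else best[1]


assistant = MemoryProsthesis()
assistant.record("Tue 10:15", "Dentist appointment moved to Friday at noon")
assistant.record("Tue 12:40", "Sarah's flight lands at 6 pm on Thursday")
print(assistant.ask("When does Sarah's flight land?"))
# → Tue 12:40: Sarah's flight lands at 6 pm on Thursday
```

The sketch also illustrates the thin-client split Atkinson anticipates: the `record` side is all a small device would do, while matching and retrieval belong server-side.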
We'll get to see a good test of a computer handling question-and-answer processing on February 24. That's when Watson, the computer built by IBM's DeepQA project (a successor to Deep Blue, the machine that beat Kasparov at chess), will compete against champions on the TV show Jeopardy. The match should spur imaginations and put natural language work in the spotlight. Atkinson believes this will be the nature of future mobile phone interfaces, probably between 2 and 10 years from now. He considers it inevitable, the natural progression of the mobile device market: the device itself does very little, while the heavy lifting happens in the cloud.
It sounds like the Knowledge Navigator is getting closer to reality, at least according to Atkinson.