Cambridge developing 'mind reading' computer interface with the countenance of Charles Babbage (video)

For years now, researchers have been exploring ways to create devices that understand the nonverbal cues we take for granted in human-to-human interaction. One of the more interesting projects we've seen of late is led by Professor Peter Robinson at the Computer Laboratory at the University of Cambridge, who is working on what he calls "mind-reading machines," which can infer people's mental states from their body language. By analyzing faces, gestures, and tone of voice, the hope is that machines can be made more helpful (hell, we'd settle for "less frustrating"). Peep the video after the break to see Robinson using a traditional (and annoying) satnav device, versus one that features both the Cambridge "mind-reading" interface and a humanoid head modeled on that of Charles Babbage. "The way that Charles and I can communicate," Robinson says, "shows us the future of how people will interact with machines." Next stop: uncanny valley!