IBM and MIT team up to help AI see and hear like humans

Watson should help computers understand the world as well as humans do.

Autonomous robots and other AI systems still don't do a great job of understanding the world around them, but IBM and MIT think they can do better. They've begun a "multi-year" partnership that aims to improve AI's ability to interpret sight and sound as well as humans do. IBM will supply the expertise and technology from its Watson cognitive computing platform, while MIT will conduct the research. It's still very early, but the two already have a sense of what they can accomplish.

One of the biggest challenges will be advancing pattern recognition and prediction. A human can easily describe what they saw happen in an event and predict what will happen next, but IBM says that's virtually "impossible" for current AI. The ability to quickly summarize and foresee events could be useful for everything from helping health care workers care for the elderly to repairing complicated machines.

There's no guarantee that IBM and MIT will crack a problem that has daunted Google, Facebook and countless academics. However, it's rare for academic researchers to get access to this kind of technology. You might just see breakthroughs that wouldn't be practical for teams with only limited access to AI-friendly hardware and code.