We've been traveling around the country over the past week, checking out some of the latest goings-on in the wide world of robotics. Among the most prevalent themes across these projects is real-world uncertainty -- which is to say that any number of things can go wrong when you take a robot outside its laboratory comfort zone (one roboticist told us about a prototype that malfunctioned thanks to reflections off a nearby building). That's certainly the case in manufacturing robotics, where it's hard to maintain any semblance of the sterile consistency afforded by a testing ground. MIT grad student Steve Levine showed us a project designed to help manufacturing robots built from unreliable parts operate in unreliable environments.
The Barrett WAM robot arm is controlled by voice and tasked, in this demo, with moving around a handful of brightly colored blocks. Using four off-the-shelf webcams, the system builds a 3D model of the workspace, visible for our purposes on a nearby projector. As Levine puts it, the lab is "trying to make robots that can automatically sense their environment" -- meaning, in the case of the demo, that you can move a block out of place and the arm will spot the discrepancy, pick the block off the top of the pile, and put things back where they should be. The project foresees a somewhat utopian world in which robots and humans work side by side on factory floors, helping one another out and correcting each other's mistakes.
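To give a rough flavor of the sense-and-correct behavior described above, here's a minimal sketch of how a system might compare where it believes the blocks should be against what its cameras report, and plan corrective moves. The block names, coordinates, and tolerance here are illustrative assumptions, not details of the actual MIT system.

```python
import math

# Tolerance (in meters) below which an offset is treated as camera noise
# rather than a real displacement. Value is an assumption for illustration.
TOLERANCE = 0.02

def find_discrepancies(goal, observed, tol=TOLERANCE):
    """Return blocks whose observed position has drifted from the goal state."""
    moved = {}
    for block, (gx, gy) in goal.items():
        ox, oy = observed.get(block, (gx, gy))
        if math.hypot(ox - gx, oy - gy) > tol:
            moved[block] = {"from": (ox, oy), "to": (gx, gy)}
    return moved

def correction_plan(goal, observed):
    """One pick-and-place action per displaced block."""
    return [
        f"move {block} from {d['from']} to {d['to']}"
        for block, d in find_discrepancies(goal, observed).items()
    ]

# Example: a human nudges the blue block; the planner notices and corrects.
goal = {"red": (0.10, 0.20), "blue": (0.30, 0.20)}
observed = {"red": (0.10, 0.20), "blue": (0.45, 0.05)}
print(correction_plan(goal, observed))
```

The real system, of course, fuses four camera views into a 3D scene and drives an actual arm; the point of the sketch is just the compare-then-correct loop at its core.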
Check out a video of Levine and his robot arm working in relative harmony after the break. More info on the project can be found in the source link below.