MIT's real-time indoor mapping system uses Kinect, lasers to aid rescue workers

We've seen the Kinect put to use to help you find your groceries, but the sensor's image processing capabilities have some more safety-minded applications as well. The fine minds at MIT combined the Kinect with a laser range finder and a laptop to create a real-time mapping rig for firefighters and other rescue workers. The prototype, called SLAM (for Simultaneous Localization and Mapping), received funding from the US Air Force and the Office of Naval Research, and it stands out among other indoor mapping systems for its focus on human (rather than robot) use and for its ability to produce maps without the aid of any outside information, thanks to an on-board processor.

In the SLAM prototype, the processor is a laptop in the user's backpack, though the final product will be more along the lines of a handheld unit. The on-board laser range finder scans a building in a 270-degree arc, and the information it collects is combined with depth and visual data gathered by the Kinect before it's sent to the laptop, which builds the map in real time. Because the setup is tailor-made for humans rather than wheeled robots, an inertial sensor is necessary to account for the wearer's gait. And because the system can detect a user's motion, it can create multi-floor maps when it senses activity on a staircase or elevator. The Kinect's camera is also used to determine whether a user has already been in a certain location, and to match up the data if it differs from the first walkthrough. Check out the automatic building mapping in action in the video below.
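For the curious, the core map-building step described above (turning each 270-degree range sweep into a floor plan) can be sketched in a few lines of code. This is a hypothetical, simplified illustration, not MIT's actual software: it rasterizes one laser sweep into a small local occupancy grid around the wearer, marking the cells each beam passes through as free space and each beam's endpoint as an obstacle. All names and parameters here are our own assumptions.

```python
import math

GRID = 21          # cells per side of the local map (hypothetical size)
CENTER = GRID // 2 # wearer sits at the grid center
CELL = 0.5         # metres per cell

def update_grid(ranges, fov_deg=270, max_range=5.0):
    """Rasterize one laser sweep into a local occupancy grid.

    ranges: distances in metres, evenly spaced across fov_deg.
    Returns a GRID x GRID list: 0 = unknown, 1 = free, 2 = occupied.
    """
    grid = [[0] * GRID for _ in range(GRID)]
    n = len(ranges)
    for i, r in enumerate(ranges):
        # Beam angle, swept from -fov/2 to +fov/2 across the arc.
        angle = math.radians(-fov_deg / 2 + fov_deg * i / (n - 1))
        hit = r < max_range  # readings at max range are treated as no-hit
        r = min(r, max_range)
        # Walk along the beam, marking traversed cells as free space.
        steps = int(r / CELL * 2)
        for s in range(steps + 1):
            d = r * s / max(steps, 1)
            x = CENTER + int(round(d * math.cos(angle) / CELL))
            y = CENTER + int(round(d * math.sin(angle) / CELL))
            if 0 <= x < GRID and 0 <= y < GRID and grid[y][x] == 0:
                grid[y][x] = 1
        if hit:
            # The beam endpoint is where the laser bounced off something.
            x = CENTER + int(round(r * math.cos(angle) / CELL))
            y = CENTER + int(round(r * math.sin(angle) / CELL))
            if 0 <= x < GRID and 0 <= y < GRID:
                grid[y][x] = 2
    return grid

# Example sweep: a wall 2 m ahead spanning the middle of the arc.
scan = [2.0 if 60 <= i <= 120 else 10.0 for i in range(181)]
grid = update_grid(scan)
```

A real SLAM pipeline repeats this for every sweep while simultaneously estimating how the wearer moved between sweeps (that's where the inertial sensor and the Kinect's depth data come in), stitching the local grids into one consistent building map.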