We've seen robots do some pretty heroic things in our time, but engineers from Georgia Tech, the University of Pennsylvania and Caltech have now developed an entire fleet of autonomous rescue vehicles capable of simultaneously mapping and exploring potentially dangerous buildings -- without letting their egos get in the way. Each wheeled bot measures just one square foot, carries a video camera capable of identifying doorways, and uses an onboard laser scanner to analyze walls. Once gathered, these data are processed using a technique known as simultaneous localization and mapping (SLAM), which allows each bot to create maps of both familiar and unknown environments while constantly recording and reporting its current location (independently of GPS). And, perhaps best of all, these rescue Roombas are pretty good team players. Georgia Tech professor Henrik Christensen explains:
"There is no lead robot, yet each unit is capable of recruiting other units to make sure the entire area is explored. When the first robot comes to an intersection, it says to a second robot, 'I'm going to go to the left if you go to the right.'"
This egalitarian robot army is the spawn of a research initiative known as the Micro Autonomous Systems and Technology (MAST) Collaborative Technology Alliance Program, sponsored by the US Army Research Laboratory. The ultimate goal is to shrink the bots down even further and to expand their capabilities. Engineers have already begun integrating infrared sensors into their design and are even developing small radar modules capable of seeing through walls. Roll past the break for a video of the vehicles in action, along with full PR.
Team Robot: Autonomous Vehicles Collaborate to Explore, Map Buildings
Atlanta, May 15, 2011
There isn't a radio-control handset in sight as several small robots roll briskly up the hallways of an office building. Working by themselves and communicating only with one another, the vehicles divide up a variety of exploration tasks -- and within minutes have transmitted a detailed floor map to humans nearby.
This isn't a future-tech scenario. This advanced autonomous capability has been developed by a team from the Georgia Institute of Technology, the University of Pennsylvania and the California Institute of Technology/Jet Propulsion Laboratory (JPL). A paper describing this capability and its present level of performance was presented in April at the SPIE Defense, Security and Sensing Conference in Orlando, Fla.
"When first responders -- whether it's a firefighter in downtown Atlanta or a soldier overseas -- confront an unfamiliar structure, it's very stressful and potentially dangerous because they have limited knowledge of what they're dealing with," said Henrik Christensen, a team member who is a professor in the Georgia Tech College of Computing and director of the Robotics and Intelligent Machines Center there. "If those first responders could send in robots that would quickly search the structure and send back a map, they'd have a much better sense of what to expect and they'd feel more confident."
The ability to map and explore simultaneously represents a milestone in the Micro Autonomous Systems and Technology (MAST) Collaborative Technology Alliance Program, a major research initiative sponsored by the U.S. Army Research Laboratory. The five-year program is led by BAE Systems and includes numerous principal and general members, most of them universities.
MAST's ultimate objective is to develop technologies that will enable palm-sized autonomous robots to help humans deal with civilian and military challenges in confined spaces. The program vision is for collaborative teams of tiny devices that could roll, hop, crawl or fly just about anywhere, carrying sensors that detect and send back information critical to human operators.
The wheeled platforms used in this experiment measure about one foot square. But MAST researchers are working toward platforms small enough to be held in the palm of one hand. Fully autonomous and collaborative, these tiny robots could swarm by the scores into hazardous situations.
The MAST program involves four principal research teams: integration, microelectronics, microsystems mechanics, and processing for autonomous operation. Georgia Tech researchers are participating in every area except microelectronics. In addition to the College of Computing, researchers from the Georgia Tech Research Institute (GTRI), the School of Aerospace Engineering and the School of Physics are involved in MAST work.
The experiment -- developed by the Georgia Tech MAST processing team -- combines navigation technology developed by Georgia Tech with vision-based techniques from JPL and network technology from the University of Pennsylvania.
In addition to Christensen, members of the Georgia Tech processing team involved in the demonstration include Professor Frank Dellaert of the College of Computing and graduate students Alex Cunningham, Manohar Paluri and John G. Rogers III. Regents professor Ronald C. Arkin of the College of Computing and Tom Collins of GTRI are also members of the Georgia Tech processing team.
In the experiment, the robots perform their mapping work using two types of sensors -- a video camera and a laser scanner. Supported by onboard computing capability, the camera locates doorways and windows, while the scanner measures walls. In addition, an inertial measurement unit helps stabilize the robot and provides information about its movement.
Data from the sensors are integrated into a local area map that is developed by each robot using a graph-based technique called simultaneous localization and mapping (SLAM). The SLAM approach allows an autonomous vehicle to develop a map of either known or unknown environments, while also monitoring and reporting on its own current location.
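At its core, graph-based SLAM treats each robot pose as a node and each measurement as an edge, then solves for the set of poses that best satisfies all measurements at once. Here is a deliberately tiny one-dimensional sketch of that idea (purely illustrative; the MAST robots fuse 2-D laser, camera and inertial data with far more capable solvers):

```python
# Toy 1-D pose-graph SLAM: poses are positions along a hallway.
# Odometry edges link consecutive poses; one "loop closure" edge
# (e.g. re-recognizing the starting doorway) ties the last pose
# back to the first, letting the optimizer correct drift.

def optimize(poses, edges, iters=500, lr=0.1):
    """Gradient descent on the sum of squared edge residuals
    (z - (x_j - x_i))**2, with pose 0 held fixed as the anchor."""
    poses = list(poses)
    for _ in range(iters):
        grad = [0.0] * len(poses)
        for i, j, z in edges:
            r = z - (poses[j] - poses[i])  # residual of this measurement
            grad[i] += 2 * r               # d/dx_i of r**2
            grad[j] -= 2 * r               # d/dx_j of r**2
        grad[0] = 0.0                      # anchor pose 0 at the origin
        poses = [x - lr * g for x, g in zip(poses, grad)]
    return poses

# Odometry reports 1.1 m per step (drifting), so dead reckoning puts
# pose 3 at 3.3 m; a loop-closure measurement says it is really 3.0 m out.
edges = [(0, 1, 1.1), (1, 2, 1.1), (2, 3, 1.1),  # drifting odometry
         (0, 3, 3.0)]                             # loop closure
initial = [0.0, 1.1, 2.2, 3.3]                    # dead-reckoned guess
result = optimize(initial, edges)
```

After optimization the loop-closure edge pulls the final pose back toward 3.0 m, distributing the correction evenly across the intermediate poses, which is exactly the behavior that keeps a SLAM map consistent after a robot revisits known territory.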
SLAM's flexibility is especially valuable in areas where global positioning system (GPS) service is blocked, such as inside buildings and in some combat zones, Christensen said. When GPS is active, human handlers can use it to see where their robots are. But in the absence of global location information, SLAM enables the robots to keep track of their own locations as they move.
"There is no lead robot, yet each unit is capable of recruiting other units to make sure the entire area is explored," Christensen explained. "When the first robot comes to an intersection, it says to a second robot, 'I'm going to go to the left if you go to the right.'"
Christensen expects the robots' abilities to expand beyond mapping soon. One capability under development by a MAST team involves tiny radar units that could see through walls and detect objects -- or humans -- behind them. Infrared sensors could also support the search mission by locating anything giving off heat. In addition, a MAST team is developing a highly flexible "whisker" to sense the proximity of walls, even in the dark.
The processing team is designing a more complex experiment for the coming year to include small autonomous aerial platforms for locating a particular building, finding likely entry points and then calling in robotic mapping teams. Demonstrating such a capability next year would culminate progress in small-scale autonomy during MAST's first five years, Christensen said.
In addition to the three universities, other MAST team participants are North Carolina A&T State University, the University of California, Berkeley, the University of Maryland, the University of Michigan, the University of New Mexico, Harvard University, the Massachusetts Institute of Technology, and two companies: BAE Systems and Daedalus Flight Systems.