AI helps drone swarms navigate through crowded, unfamiliar spaces

It could be key to self-driving cars, not to mention search and rescue.


Drone swarms frequently fly outside for a reason: it’s difficult for the robotic fliers to navigate in tight spaces without hitting each other. Caltech researchers may have found a way for those drones to fly indoors, however. They’ve developed a machine learning algorithm, Global-to-Local Safe Autonomy Synthesis (GLAS), that lets swarms navigate crowded, unmapped environments. The system works by giving each drone a degree of independence that lets it adapt to a changing environment.

Instead of relying on existing maps or the routes of every other drone in the swarm, GLAS has each machine learn to navigate a given space on its own, even as it coordinates with others. This decentralized model both helps the drones improvise and makes scaling the swarm easier, as the computing is spread across many robots.
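To make the decentralized idea concrete, here's a minimal sketch (not the actual GLAS network, which is learned from demonstrations): each drone computes its own velocity command from purely local information, its goal direction plus repulsion from neighbors inside a sensing radius, so no drone needs a global map or the full swarm's planned routes. The `SENSING_RADIUS` value and the policy itself are illustrative assumptions.

```python
import math

SENSING_RADIUS = 2.0  # hypothetical sensing range, in meters

def local_policy(position, goal, neighbor_positions):
    """Return a 2D velocity command from local observations only."""
    # Attraction toward this drone's own goal.
    gx, gy = goal[0] - position[0], goal[1] - position[1]
    dist = math.hypot(gx, gy) or 1.0
    vx, vy = gx / dist, gy / dist
    # Repulsion from each neighbor this drone can actually sense --
    # the only coordination is through what it observes nearby.
    for nx, ny in neighbor_positions:
        dx, dy = position[0] - nx, position[1] - ny
        d = math.hypot(dx, dy)
        if 0 < d < SENSING_RADIUS:
            push = (SENSING_RADIUS - d) / SENSING_RADIUS
            vx += push * dx / d
            vy += push * dy / d
    return vx, vy
```

Because every drone runs the same small function over its own observations, adding more drones adds more compute in parallel rather than inflating one central planning problem.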

An additional tracking controller, Neural-Swarm, helps the drones compensate for aerodynamic interactions, such as the downwash from a robot flying overhead. It’s already more reliable than a “commercial” controller that doesn’t account for aerodynamics, with far smaller tracking errors.
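The residual-control idea behind that can be sketched as follows. This is not the paper's actual learned network; `downwash_model` is a hypothetical stand-in for the learned aerodynamic-interaction predictor, added as a correction on top of a plain PD altitude controller that would otherwise ignore the disturbance.

```python
def pd_thrust(altitude, velocity, target, kp=4.0, kd=2.0):
    """Nominal PD thrust command, ignoring aerodynamic interactions."""
    return kp * (target - altitude) - kd * velocity

def downwash_model(vertical_gap):
    """Toy predictor: disturbance grows as the drone overhead gets closer.
    The 0.6 m interaction range is an illustrative assumption."""
    return max(0.0, 1.0 - vertical_gap / 0.6)

def augmented_thrust(altitude, velocity, target, vertical_gap):
    # Nominal command plus a feedforward term canceling predicted downwash.
    return pd_thrust(altitude, velocity, target) + downwash_model(vertical_gap)
```

The point of the split is that the nominal controller stays simple and predictable, while the learned term only has to capture the hard-to-model interaction effects, which is why tracking errors shrink under downwash.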

This could be useful for drone light shows, of course, but it could also help with more vital operations. Search and rescue drones could safely comb areas in packs, while self-driving cars could keep traffic jams and collisions to a minimum. It may take a while before there are implementations outside of the lab, but don’t be surprised if flocks of drones become relatively commonplace.