Although robots are getting better at adapting to the real world, they still tend to tackle challenges with a fixed set of alternatives that can quickly become impractical as objects (and more advanced robots) complicate the situation. Two MIT graduate students, Jennifer Barry and Annie Holladay, have developed fresh algorithms that could help robot arms improvise. Barry's method tells the robot about an object's nature, focusing its attention on the most effective interactions: sliding a plate to the edge of a table until it's more easily picked up, for example. Holladay, meanwhile, turns collision detection on its head, using deliberate contact to funnel an object into place, such as steadying a delicate object with the robot's free arm before setting it down. Both approaches currently require pre-supplied object data, but their creators ultimately want more flexible code that determines an object's qualities on the spot and reacts accordingly. Long-term development could nudge us closer to robots with truly general-purpose code, a welcome relief from the one-track minds the machines often have today.
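To make the first idea concrete, here is a purely illustrative Python sketch (not the researchers' actual code, and every name in it is hypothetical) of how annotating objects with physical properties can prune a planner's action set, so the robot only considers interactions that suit the object at hand:

```python
# Toy illustration of property-guided action selection.
# The real MIT algorithms are far more sophisticated; this only
# shows the general idea of letting an object's nature focus the
# planner on useful interactions.

OBJECT_PROPERTIES = {
    "plate": {"thin": True, "fragile": True},
    "mug":   {"thin": False, "fragile": False},
}

def candidate_actions(obj):
    """Return the interactions worth planning for a given object."""
    props = OBJECT_PROPERTIES[obj]
    actions = []
    if props["thin"]:
        # A thin object flush with the table is hard to grasp directly,
        # so sliding it to an overhang comes first.
        actions.append("slide_to_edge")
    actions.append("grasp")
    if props["fragile"]:
        # Steady a delicate object with the free arm before release.
        actions.append("support_with_free_arm")
    return actions

print(candidate_actions("plate"))  # -> ['slide_to_edge', 'grasp', 'support_with_free_arm']
print(candidate_actions("mug"))    # -> ['grasp']
```

Hard-coding the property table mirrors the article's point that the current code needs pre-supplied data; the more flexible version the students envision would instead infer those properties from perception at run time.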