Someday, after the uprising, when we're living in dungeons and toiling at the silicon mines, we'll look back with appreciation at the pioneering robotics work done by Dr. Lola Canamero and her colleagues. For you see, it's Dr. Canamero and a European consortium of scientists known as Feelix Growing that are taking the preliminary steps towards endowing robots with the ability to read human emotions, and consequently, the ability to know that you're screaming out of pain instead of joy when they drag you from dungeon to mine and back again. Since the majority of this three-year project focuses on software development, the team is installing its learning algorithms in rather simple hardware, which it's hoping to teach through a combination of code tweaks and direct feedback. To achieve the latter, the guinea bots are equipped with cameras, microphones, and tactile and distance sensors that let them see and hear their masters' reactions, along with feeling the occasional newspaper swat to the head when they've been naughty. The learning itself is achieved through the use of artificial neural networks, which are well suited to the varied and changing inputs that the bots are exposed to; the ultimate goal is a robot capable of adapting its own behavior based on the emotional state of surrounding humans, particularly happiness, anger, and loneliness. There's no way we can stop the inevitable takeover, folks, but we can at least try to make sure that our future overlords understand that we don't like it when they grip us around the throat with those powerful hydraulic claws.
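For the curious, the sensor-to-emotion learning described above can be loosely sketched as a toy classifier: sensor readings go in, a predicted emotional state comes out, and the weights are nudged after each labeled example. Everything here is invented for illustration; the feature names, the toy data, and the single-layer softmax model are assumptions, not the actual (and far more sophisticated) networks the Feelix Growing team is building.

```python
import math
import random

# Hypothetical sensor features: [smile_score, voice_pitch, touch_pressure].
# Labels: 0 = happiness, 1 = anger, 2 = loneliness.
# All values below are made-up toy data for illustration only.
DATA = [
    ([0.9, 0.3, 0.1], 0),
    ([0.8, 0.4, 0.2], 0),
    ([0.1, 0.9, 0.8], 1),
    ([0.2, 0.8, 0.9], 1),
    ([0.2, 0.2, 0.1], 2),
    ([0.1, 0.3, 0.2], 2),
]

random.seed(0)
# One weight vector (3 features + 1 bias) per emotion class.
W = [[random.uniform(-0.1, 0.1) for _ in range(4)] for _ in range(3)]

def scores(x):
    """Raw per-class scores for one sensor reading."""
    xb = x + [1.0]  # append bias input
    return [sum(w_j * x_j for w_j, x_j in zip(w, xb)) for w in W]

def softmax(z):
    """Turn raw scores into probabilities."""
    m = max(z)
    exps = [math.exp(v - m) for v in z]
    total = sum(exps)
    return [e / total for e in exps]

def train(epochs=500, lr=0.5):
    """Gradient-descent updates: the 'direct feedback' step."""
    for _ in range(epochs):
        for x, y in DATA:
            probs = softmax(scores(x))
            xb = x + [1.0]
            for c in range(3):
                grad = probs[c] - (1.0 if c == y else 0.0)
                for j in range(4):
                    W[c][j] -= lr * grad * xb[j]

def predict(x):
    """Most probable emotional state for a new sensor reading."""
    probs = softmax(scores(x))
    return probs.index(max(probs))

train()
```

After training, `predict` maps an unseen reading like `[0.85, 0.3, 0.1]` (big smile, calm voice, gentle touch) to the "happiness" class. The real project's networks would of course handle raw camera and microphone streams rather than three hand-picked numbers, but the adapt-from-feedback loop is the same basic idea.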