Cute robots make good filmmakers and maybe better companions

What is the last risk you took? Who do you love most in the world? If you died tomorrow, what would you regret the most? Posing those questions to your friends would probably net you a snide rejoinder, and a stranger would walk away in a huff (if you were lucky). Artist and engineer Alexander Reben wanted to explore what kinds of relationships could arise between humans and robots, so he did the only logical thing: he and his team built a legion of cute machines to ask those very questions.

Reben and his team unleashed 20 of the little guys (called Blabdroids) to collect people's reactions, and what they recorded was astounding. The end result was a collection of incredibly sincere, honest moments -- what people would be loath to admit in a probing conversation was offered freely to a tiny robot with wheels and a quirky smile. After confessing that she wanted to lose weight so her mother could see her healthy before she died, one woman's face contorted in surprise, and for a moment she seemed to forget who -- or what -- she was talking to.

"You asked!" she yelped, before lightly smacking the droid's head as if it could feel anything at all.

Robotic Relations

It's not a surprise to see people reacting so openly to a machine -- history has shown that it doesn't even have to be that cute. Created at MIT in 1964, ELIZA was one of the world's first chatbots, a bit of code that could mirror your statements back at you in the form of a question ("ELIZA, I don't think my dad likes me." "Why doesn't your dad like you?"). It sounds rudimentary, but students and staff eventually talked to her for hours on end, transfixed by the idea of something that would listen but couldn't judge. Blabdroids don't judge either, and their meticulously designed faces -- with wide heads, half smiles, and cutout ears -- were built to give people a sense of safety and control. Once that was established, they talked as if they couldn't help it.

Reben told Engadget editor-in-chief Michael Gorman on-stage at Expand NY 2014 that he's exploring where else a robot with the capacity to listen endlessly could help people. He's had a few conversations about getting those Blabdroids into places where people don't have anyone to talk to (like hospices), or into the hands of people who can't express themselves well (like those with autism). We've already seen the elderly take a shine to a seal-like robot in Japan, and an autistic child build a not-so-one-sided relationship with Siri. The value to these people seems clear, but the ethical questions aren't insignificant. After all, isn't it exploitative to use unfeeling machines that tug on very primal parts of the brain to emotionally manipulate people? To prod them into divulging uncomfortable truths, or trick them into thinking something cares about them? The jury's still out on that, but Reben offers one example of a selectively bred companion that no one bats an eyelash at.

"A lot of people say things like social robots and forming relationships with them is a net bad for society. As a species we've already created a companion -- the dog."