The only thing keeping robots down is you

The robots are coming. And I don't mean to the factory floor or your kid's toy box -- I mean to your living room, your office and your everyday life. It's no longer a question of if, but when. Some might even wonder why we don't already have a robot in every home. Designers will tell you they know how to build a successful home robot: the key is its ability to form social, if not emotional, relationships. And they have a whole bag of tricks and research to turn to for help. We haven't seen the level of artificial intelligence needed in consumer products yet, but we certainly seem to be getting close in the lab. So if it's not a question of technology or design, what's the holdup?

One robot, many jobs

[Video: SXSW: These Robots Want to Be Your Friend]

During SXSW this year, protesters marched through the streets of downtown Austin chanting anti-robot slogans and carrying signs that read "Stop the Robots." The truth is, the protest was just a publicity stunt for a dating app, but it accidentally tapped into some very real concerns about the future of robots. They're getting smarter and becoming more commonplace, and that's making people nervous. Even some of our brightest minds are worried about a world where robots eventually become smarter and stronger than humans. Hoax or not, the protest struck a chord with people who have been fed a steady diet of warnings from the likes of Elon Musk and Stephen Hawking.

That poses a very serious obstacle for someone like Cynthia Breazeal, director of the Personal Robots Group at the MIT Media Lab and founder of the home robot company Jibo. She is desperately trying to get us to welcome robots like Jibo into our lives, and she understands we have a long road to walk before there's a personal robot in every home.

Beyond public anxiety, there are obvious technical hurdles too. For example, one necessary quality, according to Breazeal, is the ability to perform multiple tasks. A Roomba is good at vacuuming, but its limited functionality means it's never going to become an integral part of the family unit. It frees people up to perform other, more "uniquely human" tasks, as filmmaker Tiffany Shlain would say, but you only interact with it when the floors need cleaning. That's not really developing a "social" relationship with a robot. Sure, you can force a personality on a Roomba by strapping an iPod dock to the top of it, but it's not capable of reading your emotional state or reacting to social cues. And those are the cornerstones of the "socio-emotive AI" that Breazeal has made the focus of her career.

[Image: Cynthia Breazeal's chart of social robots]

Toward the end of her SXSW presentation, "The Personal Side of Robots," Breazeal showed a chart plotting functionality on one axis and emotional engagement on the other. Basically every robot fell into one of three quadrants. In the bottom left were single-purpose, non-social robots like the Roomba; in the top left were multi-purpose industrial bots. The bottom right was home to cuddly toys like AIBO and Paro. But the top right -- where multi-purpose, social robots would go, if such things existed right now -- was completely empty. Breazeal is just one of many people who believe that niche will be filled soon, and Jibo is just the latest of her projects. Her previous work at MIT on the expressive Kismet and the Mogwai-like Leonardo has shown that the technology to make social robots a reality is within our grasp. (If it weren't, the protest at South by Southwest probably wouldn't have fooled anyone.)

We like them, but not like that

The bigger challenge will be getting people to accept Jibo as part of the home. But this is where something interesting happens. Breazeal insists that we need to form social bonds with our robots and stop viewing them as "stuff." But when asked if we need to form "emotional" bonds with a home robot, she hesitates. One of the biggest champions of putting a "social" robot in every home seems to draw the line at building an emotional relationship, because that word is "loaded."

Obviously, Breazeal doesn't expect people to love their robots the way they would a pet, but to build that essential social bond, we will need to relate to them in some way. There appears to be potential for that: humans have demonstrated a surprising ability to feel empathy for robots. Plenty of people have tackled the subject, including Richard Fisher, deputy editor at BBC Future, and Freedom Baird and Kate Darling of the MIT Media Lab.

Baird discussed the topic on the popular Radiolab podcast. She, along with hosts Jad Abumrad and Robert Krulwich, had a group of children play with a Barbie doll, a hamster and a Furby. Once the kids had gotten familiar with each, they were asked to hold it upside down for as long as they felt comfortable. They had no issue dangling the completely stoic Barbie upside down; most stopped only after about five minutes, when their arms began to hurt. The hamster lasted just eight seconds before the kids felt bad and turned the squirming creature back over. The Furby fell somewhere in between, but tracked closer to the hamster: when turned over, it cries and says it's scared. The kids knew it was just a toy and not really alive, but they still felt guilty. One even said, "I didn't want him to be scared."

Darling, a research specialist at the Media Lab, has been traveling the globe performing a similar experiment. In workshops, she's been asking people to torture a Pleo dinosaur toy. Not surprisingly, people had trouble bringing themselves to harm the adorable dino, even though they knew it couldn't actually feel pain. Dr. Astrid Rosenthal-von der Pütten of the University of Duisburg-Essen took this idea of torturing Pleo toys and decided to get some cold, hard data. She monitored her subjects' brains using fMRI while they watched videos of a woman in a green shirt, a green box and, of course, the green dino bot. In some of the videos, the person or object was treated with affection; in others, they were treated roughly or harmed. Rosenthal-von der Pütten found that the same areas of the brain lit up whether it was the Pleo or the woman being strangled and hit. But let's be clear: feeling bad for something or someone when it's harmed is a far cry from welcoming it into your home. I feel bad when I see kids trying to kick pigeons; that doesn't mean I want to make them my pets.

They're cute, when they're not being creepy

Part of the reason people reacted so negatively to the Furby and Pleo being hurt was certainly that they're kind of cuddly. Alex Reben, an engineer, documentary filmmaker and (yet another) MIT Media Lab graduate, would probably tell you that cuteness is sort of a shortcut to bonding with robots. His BlabDroid has convinced plenty of people to confess their fears, secrets and dreams thanks entirely to its disarmingly adorable design. Its squat, smiling cardboard body is plenty approachable on its own -- and then it starts asking questions in the voice of a 7-year-old boy with a mild lisp.

[Video: The Future Starts Here S2:E3 | Robots, Botox, And Google Glass]

While there's clearly value in anthropomorphizing a robot, there's a danger in going too far. Shlain has spent a good amount of time thinking about the uncanny valley and how things that are close to human, without being 100 percent convincing, tend to trigger alarms in our brains. She specifically suggested that "eyes are that crucial thing for knowing if something is real." And the more "realistic" the eyes are, the more uncomfortable they make us. Just take a look at Kodomoroid and Otonaroid, the robot newscasters from Japan. Even animators have to worry about crossing that line, so as not to make audiences uncomfortable.

[Image: Robot news anchor Kodomoroid]

It would seem that the key is to pick one thing (preferably not the eyes) and make it human- or pet-like enough to inspire someone to form a bond with it. That could mean giving it a human voice, a face or a soft, cuddly body. But we don't necessarily need our robots to talk to us. As Andra Keay, founder of the startup incubator Robot Launchpad, rightly points out, films like Star Wars have shown how much we can decipher from just a few beeps. Those films are our blueprint for socializing with robots.

We're going to Hollywood

[Image: A mock protest at SXSW featuring anti-robot signs]

Using films as our touchstone for relating to robots highlights a problem that is largely unique to the West: a lack of trust. One of the most popular robots in Japanese pop culture is the manga hero Astro Boy; in America, our robot icons are the Terminator and HAL 9000. In Japan, robots are often celebrated as heroes, and creations like Pepper have had an easier time finding acceptance. In the US, at best we think of robots as putting factory workers out of jobs; at worst, we see them as cold, emotionless killers.

This leads us to an important, and perhaps uncomfortable, requirement for welcoming a robot into our homes -- it needs to make us feel superior. And not just slightly superior, but completely in control. If there's any chance a robot could be perceived as a threat, either physically or intellectually, we'll never welcome it into our homes. So we need to intentionally handicap robots to make ourselves feel comfortable. And that artificial limitation makes it incredibly difficult to build something that is both smart enough to recognize and respond to our emotional state and capable of performing multiple physical tasks.

This is why Reben built Boxie (the precursor to BlabDroid) out of cardboard and made it small enough to hold in your hand. If you should suddenly perceive the adorable little bot as a threat, you could simply toss him to the floor and stomp on him. For the same reason, Keay says it's important for a robot to have an obvious kill switch. And while we said before that people are uncomfortable with the idea of "killing" a robot, they're clearly even more uncomfortable with the idea of a robot being "alive."

Some are so uncomfortable with the idea of an artificial intelligence advanced enough to conceivably be considered "alive" that they've suggested creating a third ontological category for robots. If we're going to develop social relationships with them, we can't think of them the same way we do a toaster. But we're also clearly uncomfortable thinking of them as alive, just like you and me. It would seem that even social robots' biggest advocates can't avoid using the language of "other" when talking about artificial intelligences and robots.

'The Other'

The film Her tackles this subject head-on. In it, Joaquin Phoenix's character is ridiculed and ostracized for his romantic relationship with an AI voiced by Scarlett Johansson. At one point, Theodore, played by Phoenix, sits down with his soon-to-be ex-wife, Catherine. When he reveals his relationship with his OS, Catherine is appalled and derisively refers to Samantha (Johansson) as a "computer."

Mark Stephen Meadows, the CSO of Geppetto Labs (which built a virtual doctor's assistant that can help diagnose simple ailments), echoes this sentiment. He believes we should think of artificial intelligence as "prosthetics." He even went so far as to tell an audience at SXSW that the idea of a robot is "a load of crap." According to him, they're just "weirdly shaped computers."

In general, the rhetoric around robots and artificial intelligence has taken on a vaguely xenophobic slant as they edge closer and closer to reality. If alarmists like Hawking are to be believed, robots will steal our jobs, threaten our way of life and come for our women (or men) next. They're a weaponized "other" to be feared, not something we should be welcoming into our homes. Even their biggest advocates feel the need to put up semantic and cultural barriers. And ultimately, that is the biggest obstacle: not technology or design or a lack of knowledge, but cultural bias and distrust. The closer those robots come to resembling humans (in either appearance or intellect), the more we fear them.

[Image Credits: Stop the Robots/Quiver (protest); Annapurna Pictures (still from Her)]