
Google Lens resurfaces questions about AI and human identity

It can identify boys as boys, but should it?


Today at the company's annual developer conference, Google CEO Sundar Pichai uttered a phrase that will no doubt be repeated in corporate boardrooms across the world for the foreseeable future: "AI first." It wasn't the first time we'd heard the formerly "mobile-first" company emphasize artificial intelligence, but Google I/O 2017 marked the first time we saw many of the tools that will back up that new catchphrase.

"When we started working on search, we wanted to do it at scale," Pichai said during the conference's opening-day keynote. "That's why we designed our data centers from the ground up and put a lot of effort into them. Now that we're evolving for this machine-learning and AI world, we're building what we think of as AI-first data centers."

But it was a quick, innocent-enough reference to the company's image-recognition software, preceding the announcement of Google Lens, that should raise some red flags. In trying to appeal to the family men and women in the audience, Pichai revealed that the platform could recognize and tag your "boy" as such when you snap a pic of him blowing out the candles on his birthday cake.

"Similar to speech, we're seeing great improvements in computer vision. So when we look at a picture like this we are able to understand the attributes behind the picture. We realize it's your boy, in a birthday party, there was cake and family involved and your boy was happy. So we can understand all of that better now."

He went on to say that Google's image-recognition systems are now "even better than human." The accompanying screenshot, showing what the image-recognition system identified in the photo, included a flurry of words: people, children, child, family, arm, hand, joy, surprise, birthday, party, laughter, women and, yes, "boy."
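Google hasn't published the internals of Lens, but its public Cloud Vision API exposes a comparable label-detection feature, which gives a rough sense of what this kind of tagging looks like from a developer's side. The Python sketch below is illustrative only: the filename is hypothetical, and it assumes the google-cloud-vision package is installed and credentials are configured.

# Sketch of label tagging akin to what Pichai described, using Google's
# public Cloud Vision API (Lens's actual pipeline is not public).
from google.cloud import vision

client = vision.ImageAnnotatorClient()

# "birthday.jpg" is a hypothetical stand-in for the keynote photo.
with open("birthday.jpg", "rb") as f:
    image = vision.Image(content=f.read())

response = client.label_detection(image=image)

# Each annotation pairs a label ("Birthday", "Child", "Smile", ...) with a
# confidence score; this is where a term like "boy" would surface.
for label in response.label_annotations:
    print(f"{label.description}: {label.score:.2f}")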

What could possibly be wrong with your camera identifying and tagging your "boy" at this "birthday" and the "woman's" "hands" holding him in front of his "cake"? Well, nothing, if that's really who is in the photo and what is going on. But as Google and other tech giants have proved, AI isn't always as enlightened as the people who created it, and gender identity isn't always visually recognizable. Tagging an image of a boy as "boy" isn't an issue, but what if that child is transgender, gender-fluid or gender-nonconforming? Gender identity, as we've come to learn, is a divisive issue, and a very personal one.

How Google's AI will account for those of us who don't fit stereotypical gender norms is still unclear, but here's hoping it has learned from its mistakes. Google is no stranger to the slippery slope of AI image recognition and the very complex subject of human identity. Back in 2015, the company's photo app tagged and organized at least one developer's photos of black friends as "gorillas." The incident shed light on the limits of image-recognition algorithms and the people who program them. Google followed with an apology in a widely published statement.

"We're appalled and genuinely sorry that this happened. There is still clearly a lot of work to do with automatic image-labeling, and we're looking at how we can prevent these types of mistakes from happening in the future."

There's a difference between labeling a black person as a gorilla and misgendering an individual, but both come with their own set of historical, political and societal problems.

In a look at the future of Google Photos on The Verge, Anil Sabharwal, the head of Google Photos, focused on the diversity of individual organizational habits.

"Suggestions, patterns, the people that are important in your life — how we bring those to you at the appropriate time," Sabharwal says. "Everyone is different."

Let's just hope Google's AI recognizes that.
