Google neural network tells you where photos were taken

PlaNet doesn't need obvious landmarks to locate an image.

It's easy to identify where a photo was taken if there's an obvious landmark, but what about landscapes and street scenes with no dead giveaways? Google believes artificial intelligence could help. It just took the wraps off of PlaNet, a neural network that relies on image recognition to locate photos. The system looks for telltale visual cues such as building styles, languages and plant life, and matches them against a database of 126 million geotagged photos divided into about 26,000 grid cells covering the globe. It could tell that you took a photo in Brazil based on the lush vegetation and Portuguese signage, for instance. It can even guess the locations of indoor photos by using other, more recognizable images from the same album as a starting point.
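To make the grid-cell idea concrete, here's a minimal sketch of geolocation treated as classification over geographic cells, assuming a hypothetical classifier that scores every cell for a given image. The uniform 10-degree grid, the `DummyModel` class and the `locate`/`cell_center` helpers are illustrative stand-ins, not Google's actual code or PlaNet's real partitioning scheme.

```python
import numpy as np

# Toy grid: divide the globe into fixed-size latitude/longitude cells.
# (PlaNet's real partition is adaptive, with roughly 26,000 cells sized
# by photo density; a uniform 10-degree grid keeps this sketch short.)
LAT_STEP, LON_STEP = 10.0, 10.0
lat_edges = np.arange(-90.0, 90.0, LAT_STEP)
lon_edges = np.arange(-180.0, 180.0, LON_STEP)
cells = [(lat, lon) for lat in lat_edges for lon in lon_edges]

def cell_center(cell_index):
    """Return the (lat, lon) center of a grid cell."""
    lat, lon = cells[cell_index]
    return lat + LAT_STEP / 2, lon + LON_STEP / 2

def locate(image, model):
    """Geolocate an image by treating the task as classification:
    the model scores every cell, and the top-scoring cell's center
    becomes the predicted location."""
    probs = model.predict(image)          # shape: (len(cells),)
    best = int(np.argmax(probs))
    return cell_center(best), float(probs[best])

# Stand-in "model" that returns random scores, just to run the sketch.
class DummyModel:
    def predict(self, image):
        scores = np.random.rand(len(cells))
        return scores / scores.sum()

(lat, lon), confidence = locate(image=None, model=DummyModel())
print(f"Predicted location: ({lat:.1f}, {lon:.1f}), confidence {confidence:.2%}")
```

Framing the problem this way means the network never has to regress exact coordinates; it only has to pick the most plausible cell, and the probabilities across cells double as a measure of how sure it is.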

PlaNet isn't a foolproof system. It's only as good as the data it's fed, and a photo only reveals so much on its own -- the network could pinpoint just 10 percent of test images at the city level. Early tests suggest it's already much better than humans, however, since it has a far wider view of the world than even the best globetrotters. If the researchers keep refining the technology, you could see photography apps that locate images even when GPS is turned off, or AI that asks about your vacation without being prompted.