Google uses computer vision and machine learning to index your photos

Tags are so 2008. Google doesn't want you to waste time tagging your photos, except for the people in them. The web giant wants to recognize more abstract concepts like "sunset" or "beach" automatically and attach that metadata without further input. In yet another post-I/O update, Google+ Photos now uses computer vision and machine learning to identify objects and settings in your uploaded snapshots. You can simply search for "my photos of trees" or "Tim's photos of bikes" and get surprisingly accurate results, with nary a manually added tag in sight. You can perform the searches in Google+, obviously, but you can also execute your query from the standard Google search page. It's pretty neat, but sadly Mountain View seems to have forgotten what cats look like.