Google is celebrating the twentieth birthday of its search engine, and is continuing to make changes to the way we find information. Searches are getting more visual, and the results that Google delivers need to cater to what we're looking for -- like a stunning gown Jennifer Lopez wore on the red carpet, for example. Google will launch Featured Videos and further emphasize its existing AMP Stories in search. It'll also bring Google Lens to its image results so you can do more with the photos you find on the search engine.
Featured Videos will show animated previews of relevant results within the feed. If you're looking up attractions in New York, for example, a carousel of videos will show up about a third of the way down the page, so you can get a clearer idea of how a physical landmark looks. Cathy Edwards, head of Google Images, told Engadget that the system will scan high-quality videos to identify landmarks or things of interest, highlighting just what's relevant to your search.
Say you're looking for the Eiffel Tower, for example, and Google knows there's a really good, high-quality video featuring attractions in France. The system can pick out just the footage of the tower itself, leaving out other bits like the Arc de Triomphe. Before, Google would simply show you a supercut preview of the whole video without focusing on your specific search terms.
Starting today, AMP Stories will also be more deeply integrated into the results, showing up near the top of the page. In an example during the keynote, an AMP Story about Awkwafina appeared at the bottom of her biographical card, after a Wikipedia summary. After tapping the card for her story, you'll see a set of full-screen pictures (or videos) with text overlaid -- just like existing AMP Stories. Currently, these will only show up on mobile browsers and in results about "notable people."
Google also overhauled the ranking algorithm for Image results to take into consideration not just the quality and relevance of the picture itself, but also those of the page where the photo lives. Edwards said this is meant to prevent situations where users find a picture of something they're looking for, like a pair of shoes, but then can't find the photo when they click through to a crowded host page. Now, Google Images will also consider the prominence of the picture on the page, as well as factors like the authority of the website it's on.
There will also be tags on each image result, indicating if it's a product that you can buy. If it is, Google will also pull information like the price and store name into a card for each item. You can also create Collections of pictures you've found, and Google will prompt you to set them up. Then, when you search again for something similar, the system will remind you that you have an existing collection you might want to add to or revisit.
On Google Images results, a new Lens button will appear at the bottom of each picture. Tapping it will show you what Lens thinks are interesting parts of the photo, and surface similar products. In an image of a nursery, for example, Lens will highlight the crib and suggest other options that you might like -- just like Lens already does on mobile cameras. But you can also draw on another part of the image, like a bookcase in the background, to tell Lens to find similar products. Sadly, Lens for Google Images is only available on mobile devices for now, but hopefully it'll come to desktop browsers soon.