Here's all the new stuff Google's Pixel 3 phone cameras can do

Group selfies, no blinking and improved low-light capability.

The Pixel 2 had arguably the best smartphone camera on the market, and Google wants to make sure it stays that way. During today's Pixel event, the company introduced a raft of new camera features for the Pixel 3 and Pixel 3 XL, including improved zoom, a wider-angle selfie camera, smile and blink detection, bokeh control and more -- all with just a single lens on the back. The quality is apparently good enough for Terrence Malick, who shot a video that was featured at the event, so it might be good enough for the rest of us, too.

Some of the features are enabled by fresh hardware, to be sure. There's a brand-new 12.2-megapixel sensor on the back, with a sharper wide-angle lens that helps with zooming. On the front, you get a pair of cameras in that large notch -- one standard, one wide-angle -- letting you adjust how wide the shot is via a slider.

However, most of the heavy lifting is done by Google's AI, which handles HDR+ chores, merges multiple photos for zooming and low light, and picks the best photo from a sequence. That processing is faster than before thanks to the Pixel Visual Core co-processor Google developed in conjunction with Intel. The chip powers more camera features than it did on the Pixel 2, and it's capable enough that you don't need to be online to get the benefits.

Google has incorporated its famous AI into the Pixel 3's cameras more comprehensively than ever before. Working alongside HDR+, a feature called Top Shot captures a sequence of images, analyzes them all and recommends the one in which your subjects are smiling, facing the camera and not blinking.
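
To make that concrete, here's a toy sketch (not Google's actual pipeline) of how a burst selector might rank frames: pretend each frame already carries per-face scores for smiling, open eyes and facing the camera, and the frame with the best weighted combination wins. The scores, weights and FrameScores structure are all made up for illustration.

```python
from dataclasses import dataclass

@dataclass
class FrameScores:
    index: int
    smile: float          # 0..1, higher means a bigger smile
    eyes_open: float      # 0..1, higher means eyes fully open
    facing_camera: float  # 0..1, higher means a more frontal face

def pick_top_shot(frames, weights=(0.4, 0.4, 0.2)):
    """Return the index of the frame with the highest weighted score."""
    w_smile, w_eyes, w_face = weights
    def score(f):
        return w_smile * f.smile + w_eyes * f.eyes_open + w_face * f.facing_camera
    return max(frames, key=score).index

burst = [
    FrameScores(0, smile=0.9, eyes_open=0.2, facing_camera=0.8),  # blinking
    FrameScores(1, smile=0.8, eyes_open=0.9, facing_camera=0.9),  # the keeper
    FrameScores(2, smile=0.3, eyes_open=1.0, facing_camera=0.7),  # not smiling
]
print(pick_top_shot(burst))  # -> 1
```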

If you're more into self-portraits, there's another feature called Photobooth. Again, the AI can determine when you're smiling or making a face. It then snaps the photo on its own so you don't have to do that awkward shutter button reach and possibly drop your device into a canyon.
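
The difference from Top Shot is that Photobooth works as a trigger rather than a picker. Here's a minimal sketch, with entirely made-up smile scores, of an auto-shutter that fires once each time the score crosses a threshold and then re-arms after the smile fades -- one grin, one photo.

```python
def photobooth(smile_scores, threshold=0.8):
    """Return the indices of frames where the auto-shutter would fire."""
    shots = []
    armed = True
    for i, score in enumerate(smile_scores):
        if armed and score >= threshold:
            shots.append(i)   # stand-in for actually capturing frame i
            armed = False     # don't fire again until the smile goes away
        elif score < threshold:
            armed = True
    return shots

print(photobooth([0.1, 0.3, 0.9, 0.95, 0.4, 0.85]))  # -> [2, 5]
```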

Group Selfie gives you a field of view that's 184 percent wider than the normal mode. You can activate it by double-tapping the power button, switching to selfie mode (by double-flicking your wrist) and then double-tapping, or you can simply ask the Google Assistant to "take a group selfie." At that point, you can fit yourself, your friends and/or a scenic backdrop into the frame.

If you want to go the other way, there's Super Res Zoom. It's still a digital rather than optical zoom, but the Pixel 3 takes multiple photos and merges them in order to reduce the grain. So in the end, you get a sharp, non-pixelated telephoto-style picture without needing a second lens. Because of the multiple exposures, however, you'll have to keep your phone steady when you shoot that way.
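
As a rough illustration of why merging frames reduces grain (the real Super Res Zoom also exploits tiny sub-pixel shifts from your hand shaking to recover extra detail, which this toy version skips), averaging several noisy exposures of the same scene cuts the noise by roughly the square root of the frame count:

```python
import numpy as np

rng = np.random.default_rng(0)
scene = rng.uniform(0.2, 0.8, size=(64, 64))                           # the "true" scene
frames = [scene + rng.normal(0, 0.1, scene.shape) for _ in range(8)]   # noisy burst

single = frames[0]
merged = np.mean(frames, axis=0)   # alignment step omitted in this toy version

def noise_level(img):
    return np.std(img - scene)

print(f"single frame noise: {noise_level(single):.3f}")
print(f"8-frame merge noise: {noise_level(merged):.3f}")  # roughly 1/sqrt(8) of the above
```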

Oftentimes you must make a difficult choice when shooting photos around a fire, or at a bar or concert. If you don't use the flash, the shot will be unusably grainy, but if you do, it can look artificial and unflattering. The Pixel 3's solution is Night Sight (yes, Google has a fancy name for all of these features).

Cameras like Sony's A7 III can easily handle low light nowadays, but how does the Pixel manage with a tiny smartphone sensor? As with the zoom feature, Night Sight takes multiple exposures in order to reduce the grain that would normally make your photos look like garbage. That gives you bright, evenly toned shots that better capture that evening mood. The only drawback, again, is that you have to hold the phone still while shooting and your subjects can't move around much either. Note that this feature won't be available right away -- it'll come along in a future update.
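
A hypothetical numbers-only sketch of why that works better than simply cranking up the gain on a single dark frame: brightening one underexposed shot amplifies its noise right along with the signal, while averaging a burst first keeps the noise in check.

```python
import numpy as np

rng = np.random.default_rng(1)
scene = rng.uniform(0.02, 0.08, size=(64, 64))        # a very dark "true" scene

def noisy_frame():
    return scene + rng.normal(0, 0.02, scene.shape)   # sensor noise swamps the signal

ideal = scene * 10                                    # a bright, clean version of the scene
boosted_single = np.clip(noisy_frame() * 10, 0, 1)    # 10x gain on one frame
boosted_merge = np.clip(np.mean([noisy_frame() for _ in range(15)], axis=0) * 10, 0, 1)

print(f"single-frame error: {np.abs(boosted_single - ideal).mean():.3f}")
print(f"15-frame error:     {np.abs(boosted_merge - ideal).mean():.3f}")
```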

Apple's latest iPhones, the XS and XS Max, use multiple cameras to let shooters blur out the background and make their subjects stand out, but Google does the same thing with a single lens. You can adjust the level of bokeh-flavored blur after you take the shot simply by moving a slider. Again, this is done using AI and multiple exposures.
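
Here's a small sketch of what that slider could be doing under the hood, assuming you already have a subject mask (the real phone estimates the mask and depth with machine learning; the rectangle mask and the 8-pixel maximum blur below are made up for illustration):

```python
import numpy as np
from scipy.ndimage import gaussian_filter

def apply_bokeh(image, subject_mask, slider):
    """slider in 0..1: 0 means no blur, 1 means maximum background blur."""
    blurred = gaussian_filter(image, sigma=8 * slider)
    return subject_mask * image + (1 - subject_mask) * blurred

img = np.random.default_rng(2).uniform(0, 1, size=(128, 128))
mask = np.zeros_like(img)
mask[32:96, 48:80] = 1.0                     # pretend this rectangle is the subject
subtle = apply_bokeh(img, mask, slider=0.3)  # a hint of background blur
strong = apply_bokeh(img, mask, slider=1.0)  # full-strength blur
```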

My jaw dropped when I saw that the normally reclusive filmmaker Terrence Malick had produced a short video for Google. It seemed the company wanted to flaunt its Motion Auto Focus feature that ensures your subjects stay as sharp as possible. Malick's video shows a lot of close-ups and the camera never seems to stop moving, yet everything does indeed stay sharp.

That's thanks in large part to Motion Auto Focus, which figures out what your subject is and keeps tracking it. On top of that, the Pixel 3 adds stabilization for front-facing video, keeping things steady if you're walking or moving around while recording yourself.
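
As a loose illustration of the "keeps tracking it" part, here's a toy autofocus follower: given the subject's detected position in each frame (made-up coordinates below), it keeps the focus point glued to a smoothed version of that position so focus doesn't jitter as the subject and the camera move.

```python
def follow_subject(detections, smoothing=0.6):
    """Return a smoothed focus point for each frame's detected subject position."""
    focus_points = []
    fx, fy = detections[0]
    for x, y in detections:
        fx = smoothing * fx + (1 - smoothing) * x   # exponential smoothing
        fy = smoothing * fy + (1 - smoothing) * y
        focus_points.append((round(fx), round(fy)))
    return focus_points

# A subject drifting across the frame from left to right:
print(follow_subject([(100, 240), (140, 238), (185, 243), (230, 245)]))
```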

Google's AI also makes it a bit easier to share photos of specific people via the new Live Albums. If you're constantly sending baby pictures to Grandma, you can create a Live Album for that person. Then, whenever you snap a shot of them, Google will add it automatically, with no manual labor required.
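
A toy sketch of the matching step: compare a face embedding from each new photo against a reference embedding for the chosen person, and add the photo to the album when the two are similar enough. The embedding vectors, photo IDs and 0.8 threshold here are stand-ins; any face-recognition model would supply the real numbers.

```python
import numpy as np

def cosine_similarity(a, b):
    return float(np.dot(a, b) / (np.linalg.norm(a) * np.linalg.norm(b)))

def update_live_album(album, new_photos, reference_embedding, threshold=0.8):
    """Append the IDs of photos whose face embedding matches the reference."""
    for photo_id, embedding in new_photos:
        if cosine_similarity(embedding, reference_embedding) >= threshold:
            album.append(photo_id)
    return album

rng = np.random.default_rng(3)
grandkid = rng.normal(size=128)                                  # reference face
photos = [("IMG_001", grandkid + rng.normal(0, 0.1, size=128)),  # same person
          ("IMG_002", rng.normal(size=128))]                     # someone else
print(update_live_album([], photos, grandkid))  # -> ['IMG_001']
```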

Finally, there's the new Playground, which lets you incorporate animated stickers of weather, pets and other things into your selfies, photos or videos. Google has added characters like Hulk and Iron Man from the Marvel Studios universe, and later this year you'll get a Childish Gambino character to dance with thanks to a collaboration between Google and Donald Glover.

