We ranked the Pixel 3 XL as the number one smartphone out there for camera tech, and with the launch of the Pixel 4, Google aims to keep that lead. Both the Pixel 4 and Pixel 4 XL now pack multiple cameras, with 12.2-megapixel f/1.7 main, 16-megapixel f/2.4 telephoto and 8-megapixel selfie cameras. That puts them nearly on par with the iPhone 11 and Huawei's P30 Pro -- apart from the unfortunate lack of an ultra wide-angle lens. As for video, the cameras can handle 4K at up to 30 fps, and 1080p at 120 fps.
The Pixel 4 will use a combination of optical zoom via the second camera, and the "Super Res Zoom" software introduced on the Pixel 3 to deliver sharp images. All you have to do is pinch to zoom in or out on the image, and you'll get what Google calls "optical quality" for every photo.
The Pixel 4 is a "software defined camera," according to Google, that puts computation on par with the lens and sensor. It's powered by the "Pixel Neural Core" that does motion, face and other kinds of tracking. It also calculates exposure and focus to deliver the best image whether in bright sunlight or dim nighttime shooting. For instance, Google showed that the Pixel 4 can render skin tones naturally, even in difficult situations like a very blue-hued ice cave.
One of the most notable camera features on the Pixel 4 is what Google calls "dual exposure." That mode helps you handle one of the trickiest lighting situations: shooting directly into sunlight. With a normal smartphone camera, your subject would be much too dark compared to the sky. The Pixel 4, however, has separate slider adjustments for the bright and dark parts of the image, so you can lighten your subject and darken the sky to create a well-balanced composition. If you'd rather have your subject silhouetted, you can emphasize that instead.
The Pixel 4 can also handle the other extreme: low-light shooting. You can boost the brightness of subjects shot at night, while also increasing detail in a starry sky. In fact, Google has an astrophotography option for the Pixel 4 that lets you use your phone instead of a DSLR to capture "stars, planets, and galaxies," it said.
The Pixel 4 also has a new portrait style of photography, with software that recognizes the outlines of things like hair and fur that are normally hard to distinguish. That technology also delivers SLR-style blurred backgrounds, commonly known as "bokeh." It looks impressive, but judging by sample photos I saw, portraits can have a slightly artificial look, with subjects standing out almost too sharply against the background.
The Pixel 4's HDR+ processing reveals detail in both the bright and dark parts of an image. Thanks to Google's new Live HDR+ feature, you can actually see how that will look before you take the shot, rather than guessing. Other features include "Social Share," which takes you quickly to social networks, and "Frequent Faces," which remembers your family members and friends to ensure they look their best.
Overall, this is an impressive display of cutting-edge camera tech from Google. I'm slightly worried that the AI tech might be over-processing images, but Google has a good track record so far with its previous Pixel phones.
Update 10/15/2019 5:23 PM ET: The article originally stated that the Pixel 4 phones didn't have wide angle lenses, but they're actually missing ultra wide-angle lenses. The article has been updated with the correct information. (Thanks, @mlebold!)