For the same shot, for instance, you can create different effects like "Natural Light," "Contour Light" and "Stage Light" (shown left to right, below). "You compose a photo, the dual cameras and the ISP sense the scene, they create a depth map, and they actually change the lighting contours over the face," said Apple iPhone chief Phil Schiller. "These aren't filters, this is real-time analysis."
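The pipeline Schiller describes — build a depth map, then adjust lighting per pixel based on proximity — can be sketched in a few lines. This is purely an illustration of the idea, not Apple's actual algorithm; the `relight` function and its `strength` parameter are made up for this example.

```python
import numpy as np

def relight(image, depth, strength=0.4):
    """Toy depth-based relighting: brighten pixels nearer the camera.

    image: HxWx3 float array in [0, 1].
    depth: HxW float array, smaller values = closer to the camera.
    Illustrative only; real Portrait Lighting models face contours,
    not just raw distance.
    """
    # Normalize depth to [0, 1], then invert so near pixels get weight 1.
    span = depth.max() - depth.min()
    d = (depth - depth.min()) / (span + 1e-8)
    weight = 1.0 - d
    # Boost brightness in proportion to how close each pixel is.
    lit = image * (1.0 + strength * weight[..., None])
    return np.clip(lit, 0.0, 1.0)
```

With a flat gray image and a depth map where one corner is nearest, that corner comes out noticeably brighter while the background is left alone, which is the basic intuition behind effects like "Stage Light" isolating a subject from its surroundings.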
Apple had promised to shrink its AI systems down onto a chip for use with ARKit, its augmented reality platform, among other things, and pairing that silicon with the camera seems a smart choice. Assuming it doesn't make shots look artificial in real-world use (it looked great in the demo), it should be a handy feature.
The new camera shoots a lot quicker too, thanks to a new image signal processor that's part of the A11 Bionic chip. It improves pixel processing, captures more colors, speeds up autofocus and adds a new feature called "hardware-enabled multi-band noise reduction" that can eliminate much of the grain you'd see in low light. "iPhone 8 takes fantastic Portrait mode shots, and now you're going to get more detail and even more natural bokeh," said Schiller.
Video gets a boost too, with 4K recording at 60fps and 240fps slow motion at 1080p. There's also a new, third video category called "augmented reality." Since video is a key part of Apple's ARKit platform, it should help developers build even more of the wild experiments going on right now.