Image credit: Chris Velazco / Engadget

These days, Apple is content to follow trends, not set them

Computational photography was cool when Google did it.

Watching this week's Apple event gave me a sense of déjà vu. With every new feature the iPhone maker announced, I felt like shouting something along the lines of, "The Simpsons already did it!" It felt as if everything Apple was doing was a riff on something another company had tried and tested before. Sure, Apple might be taking what others did and (possibly) making it better (maybe). But the company is also letting others take risks and innovate in its place, particularly when it comes to photography -- an area where it used to shine.

Take, for example, the new Deep Fusion computational photography feature that Phil Schiller described as "way cool." It's an image-processing system that taps the A13 Bionic's neural engine and uses machine learning. According to Apple, it will "do pixel-by-pixel processing of photos, optimizing for texture, details and noise in every part of the photo." Deep Fusion won't arrive until later this fall, so we don't know yet how effective it will be. Apple did, however, show sample shots from its new Night mode, which improves low-light photography, and those results looked impressive.

That latter feature is the most obvious example of Apple's attempts to outdo its competitors. If you recall, Google's Night Sight launched last November and made it possible to take relatively clear photos in near-total darkness. And Google wasn't even the first to try this; it was just the most effective. Huawei, LG and Samsung have all offered their own takes on the feature in previous flagship phones, with varying degrees of success. Apple's Night mode promises to do pretty much the same, though how well it works remains to be seen.

It wasn't always this way, though. In the middle of the megapixel race, when smartphone makers were focused on cramming ever-sharper sensors into their phones, Apple did something truly different and thoughtful. It stopped at 12 megapixels and turned its attention to features like autofocus and low-light performance, increasing pixel size for better image quality. The iPhone 5s and iPhone 6 ranked as the best phone cameras of their time thanks to these improvements. Apple was even ahead of the game when it introduced features like Portrait mode.

iPhone 11 Pro

When it adopted dual cameras with the iPhone 7 Plus, Apple also opted for a more compelling setup than the competition at the time. It went with a telephoto lens as the secondary camera rather than the monochrome detail sensor of the Huawei P9 or the wide-angle option of the LG G5. Apple's approach soon became the most popular pairing in the industry. Nowadays, though, Apple is seen as lagging behind Samsung, Huawei and even LG in picking up on trends, not to mention setting them.

On the hardware front, Apple belatedly jumped on the ultrawide-angle trend this year. It added cameras with a 120-degree field of view to all three new iPhones. LG was one of the earliest to test out this concept when it added a super wide lens to the G5 in 2016.

It seemed gimmicky at first, but when people (myself included) started seeing the versatility it brought to smartphone photography, LG's rivals followed suit. Now the Galaxy S10, S10+ and Note 10 as well as the Huawei P30 Pro all have ultrawide options as well. Apple is just the latest to get on board. (It's worth noting that for all the praise thrown at Google for its prowess in photo processing, the Pixels still don't have ultrawide-angle lenses.)

iPhone 11 Pro camera

Now, Apple's updated camera interface, which lets you see the wide-angle view while framing up a shot with the main camera, is unique. But in simply adding a third, ultrawide camera, Apple isn't doing anything that other phones aren't already doing.

It's also not just the smartphone industry that Apple borrows ideas from. With the Apple Watch Series 5, the company also introduced a new Always On Display that means the wearable will tell time, well, all the time. Yeah, pretty much all other smartwatches with color touchscreens have had this for a while now. Apple's new women's health-tracking feature also follows in the footsteps of Fitbit and Garmin. Sure, Samsung and Google have yet to integrate this, so Apple isn't the slowest in this race, but it certainly isn't breaking new ground.

Innovation comes with a measure of risk, and it's understandable that Apple wants to play it safe. The company's wait-and-see attitude isn't news -- plenty have called out how far behind it is compared to its rivals. And frankly, it's been a long time since Apple's surprised the industry with a fresh idea that's made us all go, "Wow, why didn't anyone think of this before?" Sometimes you almost forget that the iPhone was once the leader of the pack rather than just a member of it.

Follow all the latest news from Apple's 2019 iPhone event here!
