Google's AR tools make it easier for apps to apply face filters

Even on phones that don't have iPhone X-like depth-sensing hardware.

Augmented reality experiences are still in their relative infancy, but because Android runs on so many devices, it can't always assume they'll have dedicated hardware to create extra effects. While Apple is already pushing ahead with AR and packed an entire Kinect-like camera into the front of its iPhone X family to support it, Google is enhancing its software to work even without capabilities like the LG G8's depth-sensing hardware.

The latest release of ARCore, version 1.7, can create a 468-point 3D mesh of a user's face from the front-facing camera alone, detailed enough to apply slick filter effects like the ones seen in this GIF. The key is making sure apps can track where to put everything and avoid the weird artifacting you sometimes see with things like Snapchat filters. Also, 9to5Google points out that the list of devices that can run ARCore is expanding, with new additions like the Moto G7 family and the Vivo Nex Dual Display Edition.
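For developers, the face mesh surfaces through ARCore's Augmented Faces API. A minimal sketch of how an app might enable it and read the mesh each frame, assuming an already-created front-camera `Session` (this is an illustrative fragment, not a complete Android app):

```java
// Sketch of ARCore's Augmented Faces API (ARCore SDK 1.7+).
// Assumes a Session created for the front-facing camera elsewhere in the app.
import com.google.ar.core.AugmentedFace;
import com.google.ar.core.Config;
import com.google.ar.core.Pose;
import com.google.ar.core.Session;
import java.nio.FloatBuffer;
import java.util.Collection;

public class FaceMeshSketch {
    // Turn on 3D face-mesh generation for the session.
    static void enableFaceMesh(Session session) {
        Config config = new Config(session);
        config.setAugmentedFaceMode(Config.AugmentedFaceMode.MESH3D);
        session.configure(config);
    }

    // Called once per rendered frame, after session.update().
    static void onFrame(Session session) {
        Collection<AugmentedFace> faces =
                session.getAllTrackables(AugmentedFace.class);
        for (AugmentedFace face : faces) {
            // The 468-point mesh: packed (x, y, z) vertices in face-local space.
            FloatBuffer vertices = face.getMeshVertices();
            // Named region poses make it easy to pin props like glasses or hats.
            Pose noseTip = face.getRegionPose(AugmentedFace.RegionType.NOSE_TIP);
            // ...hand vertices and noseTip to the renderer to draw the filter.
        }
    }
}
```

Because the mesh is inferred from a regular 2D camera feed, the same code runs on phones with no depth-sensing hardware at all.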

Beyond the Augmented Faces API, version 1.7 also brings "ARCore Elements," a set of common AR interface components. Those allow developers to easily implement technology that helps software detect surfaces to place virtual objects on, and then use gestures to manipulate or resize them. So far, many AR-powered experiences in apps have launched on iPhone first -- AR+ mode for Pokémon Go took almost a year to make the jump -- but with improving software development tools, hopefully we'll see more features available to owners of varying Android devices.
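The surface-detection-plus-gestures pattern those components demonstrate is what Google's Sceneform library already exposes. A rough sketch of tap-to-place with built-in drag, pinch-to-scale, and twist-to-rotate gestures, assuming an Android Activity hosting an `ArFragment` and a loaded `ModelRenderable` (again an illustrative fragment, not a full app):

```java
// Sketch of placing a gesture-manipulable object with Sceneform + ARCore,
// the kind of interaction the ARCore Elements components illustrate.
import android.view.MotionEvent;
import com.google.ar.core.Anchor;
import com.google.ar.core.HitResult;
import com.google.ar.core.Plane;
import com.google.ar.sceneform.AnchorNode;
import com.google.ar.sceneform.rendering.ModelRenderable;
import com.google.ar.sceneform.ux.ArFragment;
import com.google.ar.sceneform.ux.TransformableNode;

public class PlacementSketch {
    void setUpTapToPlace(ArFragment arFragment, ModelRenderable renderable) {
        // ArFragment handles plane detection and visualizes detected surfaces.
        arFragment.setOnTapArPlaneListener(
                (HitResult hit, Plane plane, MotionEvent event) -> {
            // Anchor the model where the user tapped on a detected plane.
            Anchor anchor = hit.createAnchor();
            AnchorNode anchorNode = new AnchorNode(anchor);
            anchorNode.setParent(arFragment.getArSceneView().getScene());

            // TransformableNode adds drag, pinch-to-scale and rotate gestures.
            TransformableNode node =
                    new TransformableNode(arFragment.getTransformationSystem());
            node.setRenderable(renderable);
            node.setParent(anchorNode);
            node.select();
        });
    }
}
```

Packaging these interactions into ready-made components is what lowers the bar for Android developers who previously had to build placement and gesture handling themselves.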