ARKit artist transforms the world into a Cubist nightmare

    Every new form of technology gets repurposed for art, and nobody is doing that better with augmented reality right now than educator and "code poetry" researcher Zach Lieberman. In a recent experiment called "exploded camera," he used Apple's augmented reality framework ARKit to map objects and textures in a room. "I kept thinking about ways of pushing it in different directions," he told Engadget. "The idea was that the image would look fine from one vantage point, but as you rotate the camera, you'd see the pieces floating in space."

    The technique reminds his Instagram fans of Cubism, but ARKit's sensor and camera capabilities take it in another direction. At first you believe you're looking at a 3D plant, but when the camera pushes in, it turns into a bunch of 2D planes stacked to look like a plant. "This threshold of perception seems interesting," Lieberman notes. "I am really fascinated with how we can create images that look fine from one direction but totally broken from another."

    It's also an apt metaphor for how we're fooled by pop culture: Popular comic-based and other VFX-heavy movies, for example, simulate realism by adding animated characters, crowds and dramatic backgrounds in 2D layers. In the VFX breakdown of a film like Logan, however, the illusion is destroyed when each element is seen in isolation.

    [Embedded Instagram post by Zach Lieberman (@zach.lieberman): "More exploded camera - fixed some depth issues. #openframeworks"]

    Virtual and augmented reality are themselves forms of trickery, messing with your depth and visual cues to create the illusion of virtual objects inside of reality. Lieberman's "exploded pictures" deconstruct that illusion, while another of his mind-bending projects enhances it, leaving a trail of 3D audio waveforms through an iPhone-shot video.

    Apple's ARKit, which we'll likely get a good look at during its big iPhone X reveal, uses something called SLAM, or simultaneous localization and mapping. In effect, it combines data from the iPhone's camera and motion sensors to map a room's boundaries and objects while simultaneously determining exactly where the device is. Developers can then insert game characters and other virtual objects, much as Microsoft does with HoloLens.
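    The "simultaneous" part is the key idea: device poses and the map are solved together rather than one after the other. Here's a toy 1D sketch of that principle (emphatically not ARKit's actual pipeline): two unknown device positions and one unknown landmark are recovered jointly from relative measurements alone, via a small least-squares solve.

```python
# Toy 1D SLAM sketch: jointly solve for device poses x1, x2 and a
# landmark position m from relative measurements only.
# Measurements (noise-free for clarity):
#   odometry: x2 - x1 = 1.0
#   range 1:  m  - x1 = 3.0
#   range 2:  m  - x2 = 2.0
# Plus an anchor x1 = 0 to pin down the global frame.
import numpy as np

# Rows are [x1, x2, m] coefficients, one per measurement.
A = np.array([
    [-1.0, 1.0, 0.0],   # odometry: x2 - x1
    [-1.0, 0.0, 1.0],   # range 1:  m - x1
    [0.0, -1.0, 1.0],   # range 2:  m - x2
    [1.0,  0.0, 0.0],   # anchor:   x1
])
b = np.array([1.0, 3.0, 2.0, 0.0])

x1, x2, m = np.linalg.lstsq(A, b, rcond=None)[0]
print(f"x1={x1:.2f}  x2={x2:.2f}  m={m:.2f}")
```

Real systems solve for thousands of poses and map points at once, with noisy camera and inertial data, but the joint-estimation structure is the same.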

    Lieberman's technique is nearly seamless, except when the scene is first broken into elements. "I may work to thread the segmentation since the app freezes on taking the picture," he points out. However, the roughness is part of what makes it fun and separates it from slick commercial projects by companies like IKEA. "I love the whimsical, ridiculous side of AR," he adds. "I know a lot of companies and brands will explore how to use this stuff practically but artists are great for imagining different possible futures."

