Computational Photography

Latest

  • Google flips on Pixel 2's HDR+ feature for your go-to photo apps

    by Mallory Locklear
    02.05.2018

    The Pixel 2 and Pixel 2 XL include Pixel Visual Core, Google's first custom imaging chip that allows for HDR+ quality pictures in third-party apps like Instagram and Snapchat. But Pixel 2 users haven't been able to take advantage of those capabilities because the co-processor hasn't been enabled. Well, they'll be able to soon because Google announced today that it's turning on Pixel Visual Core, bringing the HDR+ technology that's been available through the Pixel 2's main camera app to other photography, social media or camera apps.

  • New 3D camera chip design might put Adobe on guard

    by Joshua Topolsky
    02.21.2008

    You'd better watch your back, Adobe, because it looks like you've got company in the 3D picture game. Stanford University researchers have recently hit upon a method of image sensing that can judge the distance of subjects within a shot. By using a 3-megapixel sensor broken into multiple, overlapping 16 x 16-pixel squares (referred to as subarrays), a camera can capture a variety of angles in a single frame. When the images taken by the multi-aperture device are processed by proprietary software, the positional differences between each mini-lens view are measured and combined into a photograph containing a depth map. This lets the same image be viewed from slightly different angles, provided the subject has depth to begin with (i.e., isn't a flat surface). Here's hoping this technology makes it into consumer products pronto, ASAP, and forthwith. [Via Wired]
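    The post doesn't include implementation details, but the underlying idea (measure how far a subject shifts between neighboring sub-aperture views, then convert that shift into distance) is the same disparity-to-depth relationship used in stereo vision. Here's a minimal Python sketch assuming two grayscale views from adjacent subarrays; the function names and parameters are illustrative, not Stanford's actual pipeline:

    ```python
    import numpy as np

    def disparity_map(left, right, block=8, max_shift=16):
        """Rough block-matching disparity between two sub-aperture views.

        left, right: 2D grayscale arrays from neighboring subarrays.
        Returns the per-block horizontal shift (in pixels) of the best match.
        """
        h, w = left.shape
        disp = np.zeros((h // block, w // block))
        for by in range(h // block):
            for bx in range(w // block):
                y, x = by * block, bx * block
                patch = left[y:y + block, x:x + block].astype(float)
                best, best_err = 0, np.inf
                for d in range(max_shift):
                    if x + d + block > w:
                        break
                    cand = right[y:y + block, x + d:x + d + block].astype(float)
                    err = np.sum((patch - cand) ** 2)
                    if err < best_err:
                        best, best_err = d, err
                disp[by, bx] = best
        return disp

    def depth_from_disparity(disp, focal_px, baseline_mm):
        """Triangulate: depth is inversely proportional to disparity."""
        return (focal_px * baseline_mm) / np.maximum(disp, 1e-3)
    ```

    A real multi-aperture chip would fuse estimates from many overlapping subarrays rather than a single pair, but the inverse relationship between shift and distance is the same.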

  • Adobe develops 3D camera technology, dubs it computational photography

    by Darren Murph
    10.09.2007

    At a recent event in France, Adobe showcased a prototype 3D lens that can capture a scene from 19 slightly different angles simultaneously, giving photographers a lot more to work with when they return home for post-processing. The firm boasted that by pairing this lens with software designed to understand the 3D nature of the image, individuals could use newfangled tools such as a "3D healing brush" and make perspective shifts based on the different viewpoints originally captured. Dave Story, vice president of digital imaging product development at Adobe, called the technology "computational photography" and suggested that it could open up an entirely new window of image transformation opportunities. As always, these types of things are better explained in motion, so be sure to hit the read link to check out the video. [Via CNET]
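    Adobe hasn't published how its software handles those 19 viewpoints, but a toy version of a perspective shift could simply blend the captured views nearest to a requested virtual camera position. The Python sketch below assumes the views come with known lens offsets; the layout, names, and weighting scheme are illustrative guesses, and a real implementation would reproject each view using scene depth rather than naively averaging (which ghosts on anything with parallax):

    ```python
    import numpy as np

    def shift_perspective(views, offsets, target, k=4):
        """Crude perspective shift: blend the k captured views whose lens
        offsets lie closest to the requested virtual viewpoint.

        views:   list of HxWx3 arrays, one per captured angle.
        offsets: list of (u, v) lens positions, same order as views.
        target:  (u, v) virtual viewpoint to approximate.
        """
        offsets = np.asarray(offsets, dtype=float)
        d = np.linalg.norm(offsets - np.asarray(target, dtype=float), axis=1)
        nearest = np.argsort(d)[:k]
        # Inverse-distance weights; small epsilon avoids division by zero
        # when the target coincides with a captured viewpoint.
        w = 1.0 / (d[nearest] + 1e-6)
        w /= w.sum()
        out = np.zeros_like(views[0], dtype=float)
        for weight, idx in zip(w, nearest):
            out += weight * views[idx].astype(float)
        return out.astype(views[0].dtype)
    ```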