When it comes to the iPhone, Augmented Reality refers to applications that integrate live camera feeds with data generated either directly from those camera images or from related data linked to the user's real-world location. Over the past year, we've seen a number of augmented reality applications appear on the App Store and in the Cydia store, offering a way to connect real-world visuals with enhanced data presentations.
On the recent Augmented Reality front, Sudoku Grab [iTunes Link] developer Chris Greening has been inspired by Apple's recent decision to allow calls to UIGetScreenImage(), a UIKit function that lets iPhone developers copy an image directly from the iPhone screen. By relenting on this issue, Apple has allowed programmers to pull live data from the iPhone camera and process that data in real time. That opens the door to immediate image processing and visual presentation of data on top of that image stream.
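To make the mechanics concrete, here's a rough sketch (not taken from Greening's app) of how a developer might grab frames this way. Since UIGetScreenImage() never appeared in the public headers, its declaration has to be supplied by hand, and the `processFrame` callback below is a hypothetical placeholder for whatever image-processing routine an app plugs in. This only compiles against the iPhone SDK.

```c
// Illustrative sketch only. UIGetScreenImage() is a private UIKit
// function that Apple began tolerating for App Store apps; declare
// it manually since it is absent from the public headers.
#import <UIKit/UIKit.h>

CGImageRef UIGetScreenImage(void);

// processFrame is a hypothetical callback supplied by the app,
// e.g. a Sudoku-board detector.
void captureAndProcessFrame(void (*processFrame)(CGImageRef)) {
    // Grab whatever is currently on screen -- including the live
    // camera preview -- as a CGImage.
    CGImageRef screen = UIGetScreenImage();
    if (screen == NULL) return;

    // Hand the pixels to the image-processing routine.
    processFrame(screen);

    // The caller owns the returned image and must release it.
    CGImageRelease(screen);
}
```

In practice, an app of this era would show the camera preview (typically via UIImagePickerController), then call something like this repeatedly from a timer to approximate a live feed, since there was no public per-frame camera API.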
The above video demonstrates this ability by scanning for Sudoku boards. When it detects one, the numbers in question turn green. So how useful is this new SDK feature in general? Chris says, "It's a bit horrible to do anything really useful, you haven't got a direct feed from the camera so you have to do a bit of jiggery pokery if you want to draw on top of the camera preview and still have something usable." As you can tell from the video, his "jiggery pokery" is pretty well done. His real-time scanning and enhancement of raw image data allows his detection routine to work with the camera's live feed to acquire new Sudoku boards.
It's still early days on the augmented reality front. Greening's work represents just the start of where this technology can go. With faster processors and better screen-access routines (UIGetScreenImage is a very slow call compared to the iPhone's non-public CoreSurface routines), real-world integration is just going to get better and better.