UIGetScreenImage

Latest

  • devsugar: Accessing an iPhone camera capture session

    by Erica Sadun
    07.22.2010

    With the imminent demise of UIGetScreenImage(), a number of readers have asked me how they can use the newer AVFoundation approach to access live camera data in their iPhone applications. I went ahead and built some sample code that, when I'm finished messing with it, will be part of chapter 7 of my revised cookbook. I have uploaded the current version to GitHub.

    It consists of a simple helper class that allows you to start and stop a capture session. You can request an image from this helper (namely, the last captured frame from the buffer), which in this example is loaded into a central image view at the end user's request. You can also ask it for a view with an embedded preview of the current session; the example project adds that preview to the navigation bar. I threw this example together pretty quickly, and as always, I welcome suggestions and improvements.
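    Erica's actual sample code lives on GitHub; as a rough sketch of the same idea, here is what such a helper can look like. The CameraHelper class name and its method names are illustrative assumptions, not the code from the post, though the AVFoundation calls themselves are the standard iOS 4 capture pipeline.

        #import <UIKit/UIKit.h>
        #import <AVFoundation/AVFoundation.h>
        #import <CoreMedia/CoreMedia.h>
        #import <CoreVideo/CoreVideo.h>

        // Illustrative helper (not the post's actual class): wraps an
        // AVCaptureSession, keeps the most recent frame as a UIImage,
        // and vends a live preview view.
        @interface CameraHelper : NSObject <AVCaptureVideoDataOutputSampleBufferDelegate>
        @property (nonatomic, retain) AVCaptureSession *session;
        @property (retain) UIImage *lastImage; // atomic: set on the frame queue, read anywhere
        - (void)startSession;
        - (void)stopSession;
        - (UIView *)previewViewWithFrame:(CGRect)frame;
        @end

        @implementation CameraHelper
        @synthesize session, lastImage;

        - (void)startSession
        {
            self.session = [[[AVCaptureSession alloc] init] autorelease];
            session.sessionPreset = AVCaptureSessionPresetMedium;

            // Feed the session from the default camera.
            AVCaptureDevice *camera = [AVCaptureDevice defaultDeviceWithMediaType:AVMediaTypeVideo];
            NSError *error = nil;
            AVCaptureDeviceInput *input = [AVCaptureDeviceInput deviceInputWithDevice:camera error:&error];
            if (input) [session addInput:input];

            // Ask for BGRA frames, delivered to the delegate callback below
            // on a serial queue (retain and release the queue in real code).
            AVCaptureVideoDataOutput *output = [[[AVCaptureVideoDataOutput alloc] init] autorelease];
            output.videoSettings = [NSDictionary dictionaryWithObject:
                [NSNumber numberWithInt:kCVPixelFormatType_32BGRA]
                forKey:(id)kCVPixelBufferPixelFormatTypeKey];
            [output setSampleBufferDelegate:self queue:dispatch_queue_create("camera.frames", NULL)];
            [session addOutput:output];

            [session startRunning];
        }

        - (void)stopSession
        {
            [session stopRunning];
        }

        // Convert each BGRA frame to a UIImage and stash it, so callers can
        // grab the last captured image on demand.
        - (void)captureOutput:(AVCaptureOutput *)captureOutput
            didOutputSampleBuffer:(CMSampleBufferRef)sampleBuffer
            fromConnection:(AVCaptureConnection *)connection
        {
            CVImageBufferRef buffer = CMSampleBufferGetImageBuffer(sampleBuffer);
            CVPixelBufferLockBaseAddress(buffer, 0);
            CGColorSpaceRef rgb = CGColorSpaceCreateDeviceRGB();
            CGContextRef context = CGBitmapContextCreate(
                CVPixelBufferGetBaseAddress(buffer),
                CVPixelBufferGetWidth(buffer), CVPixelBufferGetHeight(buffer), 8,
                CVPixelBufferGetBytesPerRow(buffer), rgb,
                kCGBitmapByteOrder32Little | kCGImageAlphaPremultipliedFirst);
            CGImageRef cgImage = CGBitmapContextCreateImage(context);
            self.lastImage = [UIImage imageWithCGImage:cgImage];
            CGImageRelease(cgImage);
            CGContextRelease(context);
            CGColorSpaceRelease(rgb);
            CVPixelBufferUnlockBaseAddress(buffer, 0);
        }

        // A plain UIView whose layer hosts the live camera preview.
        - (UIView *)previewViewWithFrame:(CGRect)frame
        {
            UIView *view = [[[UIView alloc] initWithFrame:frame] autorelease];
            AVCaptureVideoPreviewLayer *preview = [AVCaptureVideoPreviewLayer layerWithSession:session];
            preview.frame = view.bounds;
            [view.layer addSublayer:preview];
            return view;
        }
        @end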

  • devsugar: Farewell, UIGetScreenImage()

    by Erica Sadun
    07.21.2010

    Last December, Apple relented on the use of the otherwise private UIGetScreenImage() API. This function allows iOS developers to capture a screenshot on a device, regardless of the contents of the screen. In contrast, the standard SDK-safe approach to screen captures ([theView.layer renderInContext:context]) does not allow applications to access video layers, camera previews, or some OpenGL content.

    iOS developer Manfred Nerurkar writes on his blog that he was just called by his Apple Developer Relations contact and told that Apple has made an about-face on this issue. Apps using UIGetScreenImage() will no longer be greenlit for the App Store. Instead, developers will need to use standard Quartz methods (as mentioned above) or migrate their camera capture code to AVFoundation.

    As Nerurkar points out, this decision will force developers to refactor their code and, in doing so, limit screen capture to devices running iOS 4.0 or later. Original iPhone (2G) users will not be able to use camera-centered utilities, since iOS 4's AVFoundation functionality cannot run on that earlier platform, and any 3G and 3GS users who have not upgraded from iOS 3.x will also be affected. That means Nerurkar's iCamcorder and iWebcamera will lose a large part of their audience.

    Nerurkar's firm, Drahtwerk, is not the only one affected. Popular scanner apps such as Occipital's RedLaser (now bought out by eBay) have a large user base of early-model iPhones and slow adopters, and updating those apps to the newer methods will leave those users behind. More discussion is ongoing at the Apple developer forums (behind the paid dev firewall).
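    For reference, the SDK-safe Quartz capture mentioned above boils down to a few lines. This is a minimal sketch (the ImageFromView name is mine, not from the post); because it renders the view's layer tree into a bitmap context, live camera and video layers come out blank, which is exactly the limitation described.

        #import <UIKit/UIKit.h>
        #import <QuartzCore/QuartzCore.h>

        // Render a view's layer tree into an image context. Camera previews,
        // video layers, and some OpenGL content will not appear in the result.
        UIImage *ImageFromView(UIView *theView)
        {
            UIGraphicsBeginImageContext(theView.bounds.size);
            [theView.layer renderInContext:UIGraphicsGetCurrentContext()];
            UIImage *capture = UIGraphicsGetImageFromCurrentImageContext();
            UIGraphicsEndImageContext();
            return capture;
        }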

  • Found Footage: Sudoku Grab goes Augmented Reality

    by Erica Sadun
    12.30.2009

    When it comes to the iPhone, Augmented Reality refers to applications that integrate live camera feeds with data generated either directly from those camera images or from related data linked to the user's real-world location. Over the past year, we've seen a number of augmented reality applications appear in the App Store and in the Cydia store, offering a way to connect real-world visuals with enhanced data presentations.

    On the recent Augmented Reality front, Sudoku Grab [iTunes link] developer Chris Greening has been inspired by Apple's recent decision to allow calls to UIGetScreenImage(), the function that lets iPhone developers copy an image directly from the iPhone screen. By relenting on this issue, Apple has allowed programmers to pull live data from the iPhone camera and process that data in real time. That opens the door to immediate image processing and visual presentation of data on top of that image stream. The above video demonstrates this ability by scanning for Sudoku boards; when the app detects one, the numbers in question turn green.

    So how useful is this new SDK feature in general? Chris says, "It's a bit horrible to do anything really useful, you haven't got a direct feed from the camera so you have to do a bit of jiggery pokery if you want to draw on top of the camera preview and still have something usable." As you can tell from the video, his "jiggery pokery" is pretty well done. His real-time scanning and enhancement of raw image data allows his detection routine to work with the camera's live feed to acquire new Sudoku boards.

    It's still early days on the augmented reality front, and Greening's work represents just the start of where this technology can go. With faster processors and better screen-access routines (UIGetScreenImage() is a very slow call compared to the iPhone's non-public CoreSurface routines), real-world integration is just going to get better and better.
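    Because UIGetScreenImage() hands back a one-off snapshot rather than a camera feed, the usual trick is to show a live camera preview on screen and repeatedly screenshot it. Here is a minimal sketch of that polling loop; the startScanning and scanFrame: methods and the processFrame: detection hook are hypothetical stand-ins for a recognizer like Greening's, not his actual code.

        // UIGetScreenImage() is private, so there is no public header;
        // declare the prototype yourself before calling it.
        CGImageRef UIGetScreenImage(void);

        // Inside your view controller: grab a few snapshots per second
        // and hand each one to the detector.
        - (void)startScanning
        {
            [NSTimer scheduledTimerWithTimeInterval:0.25
                                             target:self
                                           selector:@selector(scanFrame:)
                                           userInfo:nil
                                            repeats:YES];
        }

        - (void)scanFrame:(NSTimer *)timer
        {
            CGImageRef frame = UIGetScreenImage();
            [self processFrame:frame];  // hypothetical detection and overlay hook
            CGImageRelease(frame);      // release the snapshot when done
        }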

  • Developers now can use private API for screen capture on iPhone, says Apple

    by Joachim Bean
    12.15.2009

    As Apple seems to be lightening up and accepting more applications that use private APIs (including Ustream and others that stream video from the iPhone 3G), word comes that the review team is now officially allowing the UIGetScreenImage() function to be used in applications distributed in the App Store. An Apple moderator stated in the developer forums: "After carefully considering the issue, Apple is now allowing applications to use the function UIGetScreenImage() to programmatically capture the current screen contents." The function prototype is as follows: CGImageRef UIGetScreenImage();

    Apple also states "that a future release of iPhone OS may provide a public API equivalent of this functionality." It's also noted that "At such time, all applications using UIGetScreenImage() will be required to adopt the public API."

    This function, which returns a Core Graphics image (CGImageRef), gives an application access to what's currently being displayed on the screen. It's useful for things like capturing a screenshot, as our own Erica Sadun's BETAkit does to let testers send screenshots back to a developer. It also allows streaming video from the iPhone camera: an application can capture the camera preview being displayed on the screen, then record it or send it somewhere.

    What other features are devs hoping to see opened up? There are things like general calendar access, Core Surface, XMPP, and app-settable timers that developers would like to take advantage of in their SDK apps. I hope this is a sign of what's to come for the iPhone SDK, and that we'll see more things like this opened up soon for App Store distribution. [via the Apple Developer Forums, dev membership required]
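    Putting that prototype to work takes only a few lines. Here is a minimal sketch (the CurrentScreenImage wrapper is mine, not from the forum post), with the caveat that the private symbol has no public header, and releasing the returned image follows common usage rather than any documented contract.

        // No public header exists for the private symbol, so declare it.
        CGImageRef UIGetScreenImage(void);

        // Capture the full screen, including camera and video layers,
        // and wrap it in a UIImage (e.g. to send off as a tester's screenshot).
        UIImage *CurrentScreenImage(void)
        {
            CGImageRef screen = UIGetScreenImage();
            UIImage *shot = [UIImage imageWithCGImage:screen];
            CGImageRelease(screen);  // assumed caller-owned, per common usage
            return shot;
        }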