"With every choice you make, ask yourself: is this a good choice, or is this a bad choice?" That's the sage advice we were constantly given as tykes -- and it's advice that replays in our feeble brains every day as we write news. Turns out it's also a piece of wisdom Apple's latest round of MacBook Pros
would be wise to heed, because currently, they're making some awful decisions about when to turn on that power-sapping NVIDIA GeForce GT 330M discrete chipset. Read on to see what we mean.
The idea, of course, is that the computer is supposed to automatically manage when it moves between the relatively meek integrated Intel graphics and the more powerful discrete silicon based on need, but "need" is a relative term. Intel's graphics sip power, but they're actually less capable than the integrated GeForce 9400M from the last generation, so there's real demand for the discrete chip in graphically intense situations. Apple's switching technology looks at how an application is built, picking up on which "Core" OS X technologies (Core Image, Core Video, Core Animation, OpenGL) it plans to use, and switches on the discrete GPU accordingly. NVIDIA's Optimus technology on the PC, by contrast, is based on an application whitelist that NVIDIA maintains -- a scheme that may not be as attractive or elegant as Apple's, but one that's user-customizable on an app-by-app basis. Optimus also lets you easily monitor when the card is on or off and switch it off manually, two things Apple decided it didn't want its own users to do.
Lucky for us, a MacRumors
forum member noticed that you can figure out which chipset is currently in use by looking at the display list in the Graphics/Displays section of your MacBook's System Profiler, so we did a quick survey -- and it ain't pretty. Photoshop and iMovie understandably put you into discrete territory, but Lightroom 3 beta 2, strangely, does not. Oh, and iPhoto '09 is more than happy to sip upon that sweet NVIDIA nectar. Core Image is probably to blame, but it's still an odd requirement for such a "consumery" app.
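If you'd rather not click through System Profiler every time, the same information is available from the Terminal -- here's a quick sketch using the `system_profiler` command (the exact labels in the output vary a bit between OS X releases, so treat this as a starting point rather than gospel):

```shell
# List the graphics chipsets System Profiler knows about.
# On a switching MacBook Pro both GPUs appear; the entry that shows the
# built-in display attached under it is the one currently driving the screen.
system_profiler SPDisplaysDataType | grep -E "Chipset Model|Resolution"
```

Run it once on battery with everything closed, then again with a suspect app open, and compare which chipset the display shows up under.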
But it gets worse. Viewing QuickTime movie trailers on Apple's site in Chrome (a buggy experience, by the way) bumps you up to discrete, but doesn't bump you back down after you're done -- only closing the browser and opening it up again seems to reset it. Firefox and Safari keep you on integrated graphics the whole time -- as does playing downloaded 480p or 720p content in the local QuickTime player -- but pulling up 1080p video locally kicks you into high gear (this sounds closer to the correct behavior, at least).
The most egregious thing we've seen, though, has to be Tweetie. Yes, little ol' Tweetie, that innocuous applet that stays out of your hair and shoots you a Growl notification every once in a while: as long as it's open, it's going to be rockin' the discrete graphics. The bottom line is that this is a great opportunity to underscore something we've said before: Apple needs to bring back an ironclad disable option for the discrete graphics, like the one it offered on the older unibody models -- especially when battery life is supposedly Cupertino's single biggest push with these things. Of course, failing that, we've got to hope that some third-party dev out there has the wit, courage, and spitfire to craft a simple toggle utility.