Look and feel
Still, we have a few gripes about the design in actual use that we wouldn't have minded Apple addressing in this round. First off, the sharp aluminum edges scream classy, but they also cut into the wrist when we're typing at an ergonomic disadvantage. We don't need pillows, but some mild concession to our human flesh would be nice. There's also the frustration of the too-few, too-close-together USB ports on the left side. It's one thing to have only two USB ports, but when plugging a thumb drive into one obscures the other, you have a real problem. A standard HDMI port would also be nice, but we know Apple has its principles.
The other major problem we've had with previous generations of the unibody MacBook Pros is the use of the bottom plate as a secondary heat sink of sorts -- which turns our lap into a tertiary heat sink in the process. Happily we can say the situation has been much improved in the new version, at least in average use, though it's still possible to get the machine to uncomfortable temperatures with a little bit of effort. Sure, it's nice that there's hardly any fan noise ever, but at some point the laptop becomes hot on top as well, causing our left palm and wrist to sweat -- we'd say that's as good a time as any for the fan to kick into gear.
Keyboard, touchpad and screen
One of the biggest changes to the new models, as silly as it sounds, is the "inertial scrolling" Apple has added to the touchpad. This is very much like the motion on the iPhone (though of course you still use two fingers to scroll), or the motion available with some free-spinning scroll wheel mice, allowing the page to coast a little before slowing to a stop. It's completely intuitive, comfortable, and helpful, but if you loathe it for some reason you can turn it off in System Preferences. According to Apple it's only a software change, but as to whether it will ever show up on existing systems with glass trackpads, Apple's lips are sealed.
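For the curious, the "coasting" effect boils down to the scroll velocity decaying a little each frame after your fingers lift. Here's a minimal sketch of the idea in Python -- the friction constant, frame rate, and cutoff are our own made-up stand-ins, not anything Apple has disclosed:

```python
# Rough sketch of inertial ("coasting") scroll decay -- illustrative only,
# not Apple's actual implementation. Constants below are invented.
def coast(initial_velocity_px_per_s, friction=0.95, frame_dt=1 / 60, cutoff=1.0):
    """Yield per-frame scroll offsets until velocity decays below the cutoff."""
    v = initial_velocity_px_per_s
    while abs(v) > cutoff:
        yield v * frame_dt   # pixels scrolled this frame
        v *= friction        # exponential slowdown each frame

# A quick flick at 2000 px/s coasts roughly 660 more pixels before stopping.
total = sum(coast(2000.0))
```

Each frame the offset shrinks by the same factor, which is why the page glides smoothly to a stop rather than halting the instant your fingers leave the glass.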
Sadly, the review model we were provided doesn't have the new high-resolution display option -- 1680 x 1050 instead of the standard 1440 x 900, which seems like a no-brainer upgrade at $100 -- so we can't speak to that panel's quality. Even so, our boring old pixel-poor display is a pleasure, with 100 percent brightness often a bit much indoors, great color, and wide viewing angles. Next to our six-month-old previous-gen MacBook Pro, we'd say the colors are just a tad warmer and the blacks just a tad deeper, though Apple claims the LCDs are specced exactly the same.
Performance, graphics and battery life
This is really where everything is at for these new machines. Apple has finally upgraded to Intel's 2010 Core processors, which bring with them cores galore, along with Turbo Boost tech for automatic overclocking of the chip based on demand. Software utilization of multiple cores has come a long way, but it's still not perfect, so Turbo Boost switches off idle cores to make thermal headroom for overclocking the remaining ones -- to pretty dramatic effect at times. We're testing the top-of-the-line 2.66GHz Core i7 machine with NVIDIA GeForce GT 330M 512MB graphics, 4GB of RAM, and a 500GB HDD, which retails for $2,199.
We aren't what you would call power users in a rendering-Pixar-movies sort of way, but we can still tax a machine just fine. We're usually running a couple browsers at once, frequently batch process piles of photos, edit videos the quick and dirty way in iMovie or QuickTime, and dabble with GarageBand from time to time. Slowdowns and hiccups are the norm on even the best machine with what we've got going on. Unsurprisingly, Core i7 hasn't made all of this go away; it's just made it happen less. It's obvious that the machine can juggle a bit more at once, launch apps a bit faster, pop open dialogs just a bit quicker, and so forth. Of course, this is all hard to quantify and rather subjective, but we feel it.
To really stress the processor specifically, we fired up some Flash video, pitting our new Core i7 Pro against an "old" MacBook Pro with a 2.66GHz Core 2 Duo processor and 4GB of RAM. Both machines are actually pretty strong when running a single bit of Flash at a time; it's when a couple dozen tabs are open, all slamming the processor at once, that things get difficult. We fired up Hulu in 480p (the new Glee episode) and a 1080p Avatar trailer on YouTube, with both machines managing to keep both videos playing smoothly. Once we added a second 1080p YouTube trailer on each machine, however, the Core 2 Duo machine began to choke, while the Core i7 juggled all three videos successfully.
On a more empirical front we ran some HTML and Flash tests using GUIMark. With Firefox 3.6.3 and Flash 10.0.45.2, we managed 21.31 FPS on HTML and 21.04 FPS on Flash on the new machine, while the old MBP only managed 15.99 FPS and 16.1 FPS, respectively.
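For context, here's a quick back-of-the-envelope calculation of those gains, with the figures copied from our runs above:

```python
# Relative GUIMark frame-rate gains: 2010 Core i7 MBP vs. 2009 Core 2 Duo MBP.
new = {"HTML": 21.31, "Flash": 21.04}  # new machine, FPS
old = {"HTML": 15.99, "Flash": 16.10}  # old machine, FPS

# Percentage improvement of the new machine over the old on each test.
gains = {test: (new[test] / old[test] - 1) * 100 for test in new}
# Works out to roughly a 33 percent gain on HTML and 31 percent on Flash.
```

In other words, a consistent improvement of nearly a third on both rendering paths.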
We also tried a 720p video export in iMovie. The GPU gets called in for previewing live effects and the like, and we found the entire UI very responsive, but video exports are still a CPU affair, and the new machine thrashed the old one, finishing in five minutes versus nine.
Here are some more standard benchmarks:
| MacBook Pro 15 - 2010 (2.66GHz Core i7, NVIDIA GT 330M) | 5395 | 228.22 | 218.96 | 486.60 |
| MacBook Pro 15 - 2009 (2.66GHz Core 2 Duo, NVIDIA 9600M) | | | | |
Now to the issue of GPU switching. We had a long talk with Apple in which they explained how this technology differs from Optimus (at least in the software implementation; it's obviously the same card underneath), and we're pretty impressed with what Apple has pulled off. Basically, Optimus turns on the GPU if it's needed, then runs both the Intel graphics and the discrete card simultaneously, pushing the GPU-produced imagery through the Intel chip before it hits your screen. Apple's solution switches fully between the cards, seamlessly, with the Intel graphics dropping into a power-sipping mode and doing no rendering at all while the NVIDIA GPU is in play. The other big difference is that Optimus decides when it's needed based on a cloud-stored whitelist of apps that NVIDIA maintains, which could become out of date or at least have difficulty keeping up with app releases (though users get the flexibility of manually enabling apps). Apple's solution, meanwhile, works at a deeper OS level: OS X figures out what sorts of technologies an app is going to call on (OpenGL, for instance) and turns on the GPU accordingly.
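To make the contrast concrete, here's a toy Python sketch of the two decision strategies as we understand them -- the app names, framework list, and function names are all our own illustrative stand-ins, not NVIDIA's or Apple's actual internals:

```python
# Toy contrast of the two GPU-switching strategies. Everything here is a
# hypothetical stand-in for illustration, not real driver or OS code.

OPTIMUS_WHITELIST = {"Photoshop", "iMovie"}       # cloud-maintained app list
GPU_FRAMEWORKS = {"OpenGL", "Core Animation"}     # stand-ins for GPU-class APIs


def optimus_wants_gpu(app_name):
    """NVIDIA's approach: look the app up by name in a whitelist."""
    return app_name in OPTIMUS_WHITELIST


def osx_wants_gpu(frameworks_linked):
    """Apple's approach: check which technologies the app actually calls on."""
    return bool(GPU_FRAMEWORKS & set(frameworks_linked))
```

The practical difference falls out immediately: a brand-new app that links against OpenGL gets the GPU under the capability check, but a name-based whitelist has never heard of it until the list is updated.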
Still, there are drawbacks to even Apple's approach. For instance, a heavy hitter like Photoshop will turn on the GPU even if you just leave it idling in the background while you work with some text. If you really want to sip power, you'll have to quit any GPU-using applications when they're not needed. The problem is that Apple isn't providing any way to tell whether an application is activating the GPU. We're sure a third-party utility will come along soon enough (Apple even agreed with our assumption), and we can even understand why Apple might want to hide this info from Joe User, but we know plenty of power users who wouldn't mind having it surfaced.
On a similar front, Apple has really outdone itself in restricting your GPU flexibility. There are only two options for automatic graphics switching: on and off. If it's on, it acts as we've described; if it's off, the discrete GPU runs at all times. Apple says this only knocks the battery life down from nine hours to eight, but since in real life we're not getting anywhere close to nine hours of use, we're pretty sure we'd rather hang on to that "bonus" hour of juice at times and run integrated graphics only. Part of Apple's reasoning is that Intel's integrated graphics are no match for last generation's GeForce 9400M chip, but when you need to squeeze every last minute out of your battery, flexibility is key.
So, how does all this added number crunching play out in battery life? Well, it's confusing, that's for sure. Since we don't yet have a reliable way of knowing when we're tapping into the GPU, it's hard to tell exactly whether what we're doing is helping or hindering battery life. Still, in regular use we're certainly not bumping past that magical six hour mark, and we'd have to really work for Apple's quoted 8-9 hours. Through a day of "regular use" -- some benchmarking and some iMovie, but mostly web browsing and typing, with screen brightness hovering around 60-75 percent, WiFi on, and an hour of Bluetooth -- we managed four hours and 34 minutes of juice. The video rundown test actually fared better, with five hours and 18 minutes of SD video at 65 percent brightness with WiFi and Bluetooth on. Those Intel graphics sure do sip power! Obviously what we assumed was "casual" use isn't so casual, and we'll be tweaking our usage of the laptop accordingly to figure out how much juice we can get -- until someone comes up with a hack to switch to integrated only, of course.
| MacBook Pro 15 - 2010 (2.66GHz Core i7, NVIDIA GT 330M) | 5:18 |
| Sony VAIO Z (2.53GHz Core i5, NVIDIA GT 330M) | 4:25 |
| HP Envy 15 (1.6GHz Core i7-720QM, ATI HD 4830) | 2:00 |
* Standard definition video rundown test, brightness 65 percent
It must be said: Apple's battery life is genuinely industry-leading for a Core i5 or Core i7 machine with discrete graphics, but compared to the already high benchmark Apple set last generation, it seems less impressive.