Doing the Math

Latest

  • Doing the Math: WWDC capacity

    by TUAW Blogger
    05.07.2012

    Apple says there are 248,000 registered iOS developers in the US alone on its job creation page. This number leaves out developers registered in other countries, Mac developers who may not have iOS dev accounts, and development team members who share a single account. Assume those omissions represent just a conservative 20 percent undercount; that would put the number of devs worldwide (as in "Apple Worldwide Developer Relations") at 300,000, more or less. Number of seats available at WWDC: 5,000. You do the math. WWDC ticket holders are, roughly speaking, the 1%.
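    As a quick back-of-the-envelope check (a sketch in Python; the 20 percent undercount is the post's own assumption, not a published figure):

    ```python
    # Back-of-the-envelope check of the WWDC capacity numbers.
    registered_us = 248_000   # Apple's published US figure
    undercount = 0.20         # the post's conservative assumption
    worldwide = registered_us * (1 + undercount)   # ~297,600 -- call it 300,000

    wwdc_seats = 5_000
    share = wwdc_seats / worldwide

    print(f"Estimated worldwide developers: {worldwide:,.0f}")
    print(f"WWDC ticket holders: {share:.1%} of them")  # ~1.7% -- "the 1%", roughly
    ```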

  • Retina display Macs, iPads, and HiDPI: Doing the Math (updated)

    by Richard Gaywood
    03.01.2012

    Love Apple gear? Like math? TUAW's Doing the Math series examines the numbers and the science behind the hardware and software.

    The rumour mill has been busy lately with claims that we might get "Retina display" Macs soon -- and of course, a Retina display iPad 3 on March 7, probably, maybe, definitely. For an example of the sort of speculation, consider Bjango developer Marc Edwards, who tweeted: "Retina 27" Thunderbolt display: 5120×2880 = 14,745,600 pixels. 4K film: 4096×2160 = 8,847,360 pixels. Retina iPad 3: 2048×1536 = 3,145,728 pixels". This prompted me to dust off my Retina display iPad post from a year ago and revisit the mathematics I applied there, to dig a little deeper into what a Retina display Mac might entail. Is Edwards right -- would a Retina Thunderbolt display really need almost 15 megapixels? Isn't this all just marketing?

    Before I launch into a long-winded diatribe ("surely not!" -- everyone who's ever read any of my other TUAW posts), I need to address a surprisingly common point of view. Some people say that as "Retina display" is a term Apple made up, it can mean whatever Apple wants it to mean. If Apple wanted to, the theory goes, it could just declare the current iPad to be a Retina display and be done with it. I think this argument is asinine. Firstly, although Apple invented the term out of whole cloth, it does offer a definition: "the Retina display's pixel density is so high, your eye is unable to distinguish individual pixels." That has meaning, and if Apple were to weaselly dilute the definition for the sake of marketing some future product, I think we should absolutely hold its feet to the fire. Secondly, this isn't just about Apple. High-DPI screens are starting to appear on other devices, like this Android tablet from Asus. The precise phrase "Retina display" might belong to Apple, but the advantages of high-resolution screens do not. As this is an emerging trend across the whole industry, it behooves us to strip away the marketing pixie dust and take an objective look at what this technology can offer.

    Defining "Retina display"

    So what does it mean to say that a screen's individual pixels are indistinguishable? The launch of the iPhone 4 and the first Retina display was, of course, accompanied by a jump in screen resolution from 480×320 to 960×640 -- from 163 pixels per inch (ppi) to 326 ppi. This in turn led many people to label some arbitrary resolution as "Retina display" -- typically 326 ppi itself, or 300 ppi. The latter number is a common rule-of-thumb baseline in the print industry for "photo resolution."

    It's not that simple, however. Hold a small-print book at arm's length. Notice how it's hard to read the text. Now bring the book up to a few inches from your nose. Notice how much easier it is to read. Clearly, if Apple is defining a "Retina display" as one where users can't see the pixels, then any discussion of whether a given display qualifies needs to take into account the distance between the screen and the user -- and that differs according to the device. An iMac on a desk, a MacBook in your lap, and a hand-held iPhone all have different viewing distances.

    So, how do we determine how small a pixel has to be to border on invisible? To answer this we need to think about subtended angles. Consider an eye viewing a screen from a distance d, looking at two adjacent pixels separated by the inter-pixel spacing s. The viewing angle a (illustrated in the original post's diagram) is called the angle subtended by the inter-pixel spacing.
    Whether or not a given detail is too small to be discerned by the eye comes down to the size of this angle. This is how the size of an object is related to the viewing distance -- as you move an object of a given size closer to or further from the eye, the size of this angle changes. Conversely, at a given distance, a larger object subtends a bigger angle. The angle is linked to the object size and the viewing distance by this formula: a = 2·arctan(s / 2d), where s is the object size (here, the inter-pixel spacing) and d is the viewing distance.

    So what subtended angle is too small to see? The average person has 20/20 vision. This was historically defined as the ability to read letters on a standard eye chart that subtend 5 arcminutes of angle (an arcminute is 1/60th of a degree). What does that mean in pixel terms? Consider that just about the smallest legible fonts, Tinyfont by Ken Perlin and Tiny by Matthew Welch, use five pixels of height (including descenders, for Tiny) for each letter. This suggests the smallest resolvable detail for an average eye is around one arcminute. Indeed, one arcminute is an accepted value amongst academics for the resolution limit of a typical human retina.

    Retina-ness of Apple's current displays

    With the data above in mind, and applying the mathematics from my previous post, we can take some typical viewing distances for different Apple devices, combine them with the screen size and resolution, and calculate how close each screen comes to the definition of a Retina display we arrived at above. You can view a Google spreadsheet that shows the details of how this data is calculated.

    Update, 2012-03-02: I've had quite a bit of feedback that many people sit closer to their devices than I do. I'm not sure if it's personal preference, or because I've used multiple monitors for many years (my 27" iMac is flanked by a 26" Samsung monitor, so I have to sit a little way back to fit it all in my vision). Either way, I've added a few rows to that spreadsheet, not shown in the table above, to reflect these scenarios.

    Update 2, 2012-03-04: The original version of the table contained an error; I had forgotten that the screen sizes of the MacBook Air 11" and 13" and the MacBook Pro 15" are actually 11.6", 13.3", and 15.4", respectively, and I used the wrong values in the calculation. This has now been fixed, which changes some of the pixels-per-inch figures slightly.

    Just for fun, I threw in a couple of non-Apple devices for comparison -- a 50" TV at a distance of six feet, playing back a Blu-ray and a DVD, and the announced Asus Transformer Prime Android tablet, which has a 1920×1200 display.

    The table shows some things that surprised me. Firstly, it shows that Apple's definition of Retina display aligns quite closely with my mathematical derivation. The iPhone 4 screen at a typical distance of 11" is just barely above the threshold for a Retina display; I believe this justifies my methodology. Secondly, it repeats my previous conclusion that a pixel-doubled iPad running at 2048×1536 easily has enough definition to count as a Retina display -- even at a 16" viewing distance, which is on the close side, based on my experimentation with an iPad and a tape measure. Similarly, that Asus tablet is a Retina display too. It also shows that many current Mac displays are a lot closer to Retina display levels than you might have thought. The 27" iMac at a distance of 28", a 17" MacBook Pro at 26", an 11" MacBook Air at 22" -- these screens all have pixels small enough to border on invisible.
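    As a rough sketch of how those numbers fall out (my own Python, not the linked spreadsheet; the viewing distances are the assumptions discussed above), you can compute the angle each pixel subtends and express it as a percentage of the one-arcminute limit:

    ```python
    import math

    ARCMINUTE = math.radians(1 / 60)  # ~0.000291 rad: our visual acuity limit

    def retina_percentage(diagonal_in, px_w, px_h, distance_in):
        """How close a screen comes to the one-arcminute criterion (100% = Retina)."""
        ppi = math.hypot(px_w, px_h) / diagonal_in          # pixels per inch
        spacing = 1 / ppi                                   # inter-pixel spacing s, inches
        angle = 2 * math.atan(spacing / (2 * distance_in))  # a = 2*arctan(s / 2d)
        return 100 * ARCMINUTE / angle

    # (name, diagonal in inches, pixels, assumed viewing distance in inches)
    screens = [
        ("iPhone 4",        3.5,  960,  640, 11),
        ("iPad 2048x1536",  9.7, 2048, 1536, 16),
        ('27" iMac',       27.0, 2560, 1440, 28),
    ]
    for name, diag, w, h, dist in screens:
        print(f"{name}: {retina_percentage(diag, w, h, dist):.0f}% of Retina")
    # The iPhone 4 lands just above 100%; the 2048x1536 iPad comfortably above it.
    ```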
    Furthermore, the 480×320 iPhone screen is notably worse than everything else Apple makes today, at 53% of a Retina display. Even the second-worst screen, the 1024×768 iPad, has finer detail at 61%. The worst Mac display is the 24" iMac at a distance of 28", at which distance its pixels are one-third too large to be individually indistinguishable. Finally, this also shows why Blu-ray looks so good: on a largish TV at a shortish distance (50" at 6'), a 1080p image is at 92% of Retina level, whereas a DVD is a downright poor 36%.

    There are two very important points here. The first is that in order to achieve, or even handily exceed, the threshold for a Retina display, Apple does not need to double resolutions on most of its displays. Far from it. It would suffice to boost a 27" Thunderbolt Display from 2560×1440 to something around 2912×1638. (But note that there could be image quality issues from this -- see "The pixel doubling argument" below.) The second point is that people shouldn't get their hopes up about how much better a Retina display Mac would be compared to the current offerings. The iPhone 4 was a huge step forward from the iPhone 3GS mostly because the 3GS's screen was comparatively poor. Existing Macs have much better screens to start with, so any improvement will be much more modest.
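    To see where a figure like 2912×1638 comes from, here is a minimal sketch that inverts the one-arcminute criterion (my arithmetic, assuming the same 28" viewing distance used above; rounding and the exact distance chosen explain the small gap from the figure quoted):

    ```python
    import math

    ARCMINUTE = math.radians(1 / 60)

    def required_ppi(distance_in):
        """Lowest pixel density whose pixels subtend <= 1 arcminute at this distance."""
        max_spacing = 2 * distance_in * math.tan(ARCMINUTE / 2)  # largest allowed pitch
        return 1 / max_spacing

    target = required_ppi(28)                # ~123 ppi at a 28" viewing distance
    current = math.hypot(2560, 1440) / 27    # ~109 ppi on today's 27" panel
    scale = target / current                 # ~1.13x in each direction
    print(f"{target:.0f} ppi needed -> roughly {2560 * scale:.0f}x{1440 * scale:.0f}")
    ```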
    Looking beyond one arcminute

    From the above, you might think there is hardly any reason for Apple to change anything, because the benefits of higher-resolution screens are so modest. But clearly HiDPI mode exists, and specialist medical imaging screens run between 508 and 750 ppi. What's the benefit of these high pixel densities? The answer is that our definition of the limits of human vision -- details that subtend an angle of one arcminute -- is rather simplistic. There's a lot more to think about when considering how real human vision interacts with computer display technology, including atypical viewing distances, different sorts of patterns, and so forth. Reading words, for example, is possible at smaller sizes than reading random letters, because your brain has more context with which to guess at the characters. Your brain is a sophisticated pattern-matching tool, and it will use information from the surroundings to try to interpret details your eyes can't quite make out clearly.

    Here are a number of test patterns for you to try this out on your own display. If you want to try this on an iOS device, you need to get the appropriate file for your device -- iPhone or iPad -- and save it to the Camera Roll. This is because iOS will helpfully try to zoom and pan images, but we want to ensure that one pixel in the test image takes up one pixel on your display. Once you have them in the Camera Roll, view them full screen through the Photos app with your device in the portrait orientation. If you compare your Mac, iPad, and iPhone, you should see quite a difference in how well each screen performs.

    The pixel doubling argument

    Rene Ritchie at iMore makes a solid argument for why an iPad Retina display must be pixel-doubled -- i.e., 2048×1536 -- and not some intermediate resolution (just as was the case for the iPhone 4 before it). Anything else means every single existing app either has to re-scale art assets -- resulting in a fuzzy display -- or let them appear at a different size on-screen -- resulting in usability problems as the tap targets are resized. This is because every single existing iPad app is hard-coded to run full screen at 1024×768.

    The situation is fuzzier on the desktop, however. Apple's current displays already vary between 92 and 135 pixels per inch, and users are more tolerant of UI element resizing, within reason. Consider the 109 ppi 2560×1440 27" Thunderbolt Display, and let's suppose Apple wanted to Retina it up. It could up the resolution to 4192×2358 -- which works out to 178 ppi -- and achieve a display with finer detail than the iPhone 4. This is one-third fewer pixels than the native pixel-doubled resolution (which would be 5120×2880). UI elements would look proportionally larger -- but no larger than they do on the 24" iMac display today, so it wouldn't look clumsy or odd.

    Update, 2012-03-02: David Barnard of App Cubby wrote a great follow-up post with some mockups comparing a 27" 168 ppi screen in HiDPI mode (at a resolution of 3840×2400) and the current 109 ppi one. He also makes an interesting point: he finds Apple's denser modern displays harder to use. "What you should notice is that the text and UI elements are physically smaller on the current 109ppi iMac than they'd be on the hypothetical 84/168ppi 27″ iMac. This may be frustrating to some users, but I actually prefer my old 94ppi 24″ Cinema Display to any of Apple's higher PPI displays. I like that the system default 12pt text is larger. The sacrifice is in the usable workspace, and that's a matter of taste. I've been hearing from more and more people on Twitter that the 11″ MacBook Air is surprisingly usable with OS X Lion, even though the workspace is a scant 1366×768 pixels."

    Wrapping up

    Hopefully, I have convinced you of several things in this post: "Retina display" carries more meaning than pure marketing; the definition of what is, and what isn't, a Retina display must consider viewing distance; the improvement you'd see from a Retina display Mac is significant, but less than the improvement the iPhone 4 offered over the 3GS; and a 2048×1536 iPad would be a Retina display and would look quite a bit better than the current model (but, again, be less of an improvement than the iPhone 4 was). Still not convinced? Sound off in our comments!

    I'd like to thank fellow TUAWers Brett Terpstra and Erica Sadun for helping me with the Retina Tester graphic.

  • AT&T's iPhone "sales" versus "activations": Doing the Math

    by Richard Gaywood
    01.27.2012

    Love Apple gear? Like math? TUAW's Doing the Math series examines the numbers and the science behind the hardware and software.

    Several sites -- including TUAW -- reported yesterday that 80% of all smartphones AT&T sold in Q4 2011 were iPhones, based on AT&T's quarterly earnings report. On closer inspection, however, there's a subtle but important detail that we overlooked in AT&T's wording. It reported "9.4 million smartphone sales" but "7.6 million iPhone activations" (emphasis mine).

    So, consider the following series of events. Alice buys an iPhone 3GS back in 2009 on a two-year contract. In late 2011, she treats herself to a new iPhone 4S -- that's both a sale and an activation for AT&T. She gives the 3GS to her husband, the long-suffering Bob, who can finally ditch his flip phone. Bob needs service, though. His "new" 3GS is locked to AT&T -- unlike in many other countries around the world, most American carriers won't voluntarily unlock even out-of-contract handsets. Even if it were unlocked, it's compatible with neither the CDMA networks used by Verizon and Sprint nor the oddball 3G frequencies used by T-Mobile USA. Finally, AT&T refuses to support iPhones on its pay-as-you-go GoPhone plan (although if Bob read TUAW he'd know he could work around this). So, with no other choices, Bob rings up AT&T and starts an iPhone contract so he can use the old handset as more than just an oddly-shaped iPod touch. At the end of this process, AT&T has closed one new sale -- but counted two activations: one for Alice's new iPhone 4S and one for Alice's old iPhone 3GS in Bob's name. This means the 7.6 million activations include some double counting and can't be compared directly to sales.

    We reached out to AT&T's Seth Bloom to confirm whether our reasoning held up. He said, "You're right that activations are a bit different than sales -- and activations includes things like gifted iPhones as you suggest." However, he also added: "In this quarter, the number of activations from things like gifted iPhones doesn't change the math much. We aren't sharing a number, but gifted phones is a relatively small portion of total activations."

    How much might a "relatively small portion" be? Let's revisit those numbers: 9.4 million smartphone sales and 7.6 million iPhone activations. Suppose that 10% of all those iPhone activations were of used handsets. In other words, out of all of the brand-new iPhone sales AT&T made in the last three months, about one in ten (a virtual cookie to any commenter who spots why I had to say "about" there) were made to a person who a) already had an older iPhone and b) then proceeded to sell or give that handset away to someone else, who reconnected it to AT&T's network. That would mean that AT&T activated 6.84 million new iPhones and 760,000 old ones. In turn, that means the iPhone took 73% of AT&T's overall smartphone sales, with the other 27% split between Android, BlackBerry, and Windows Phone 7. More generally, we can plot how the iPhone's market share changes as a function of the recycle factor -- the proportion of activations that went to reused handsets (see the sketch below).

    If we revisit AT&T's statement, we can also see that "82 percent of postpaid sales were smartphones." This means, remarkably, that unless more than 25% of iPhone activations went to reused handsets (which seems unlikely in light of Bloom's comments), over half of all contract phones AT&T sold were iPhones. This story is repeated at Verizon, too.
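    As a quick sketch of that calculation (my own Python; the recycle factor is exactly the number AT&T wouldn't share):

    ```python
    SMARTPHONE_SALES = 9_400_000    # AT&T Q4 2011 smartphone sales
    IPHONE_ACTIVATIONS = 7_600_000  # iPhone activations, including reused handsets

    def iphone_share(recycle_factor):
        """iPhone share of smartphone sales, given the fraction of
        activations that were reused (gifted or resold) handsets."""
        new_iphones = IPHONE_ACTIVATIONS * (1 - recycle_factor)
        return new_iphones / SMARTPHONE_SALES

    for r in (0.0, 0.05, 0.10, 0.15, 0.20, 0.25):
        print(f"recycle factor {r:>4.0%}: iPhone share {iphone_share(r):.0%}")
    # At 10% the share is ~73%; even at 25% it is still ~61%.
    ```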
The bottom line is: Apple kicked all kinds of posterior in the smartphone market during the last three months of 2011. Can it continue to do so in 2012? It might not be able to maintain quite this stupendous a lead. The timing of the iPhone 4S launch (in autumn, versus the previous summer iPhone introductions) likely boosted sales by causing some greater-than-usual pent-up demand. Supporting this hypothesis, Tim Cook admitted that sales of the older models waned between July and September. It'll be fascinating to see what this massive quarter does to the overall smartphone market share of iOS versus Android in the coming months.

  • Doing the Math: At $29.99, Mac OS X Lion was WWDC's most expensive product

    by Chris Rawson
    06.06.2011

    Our own Dave Caolo pointed out something that took the rest of the TUAW team aback: at US$29.99, Mac OS X Lion was the most expensive product discussed at WWDC today. It's not as though the next version of the Mac's operating system had a lot of pricing competition at the keynote. iOS 5 will be a free upgrade for users with supported hardware, and iCloud's services -- which used to cost $99/year under MobileMe -- are all completely free. In fact, other than Lion itself, the only thing Apple announced at WWDC that costs anything at all was iTunes Match, at $25 a year.

    One of the major anti-Apple memes over the lifetime of the Mac has been that Apple's products are far more expensive than those of its competitors. While there are arguments both for and against that line of thinking for Macs and equivalently-configured PCs, the iPad's pricing compared to other tablets' blows that argument out of the water, and Apple's software prices undercut those of Windows by an astonishing margin, as demonstrated in the graphic above. Windows 7 comes in a spread of flavors, while Mac OS X Lion comes in only two: the standard $29.99 user edition and an upgraded server edition that costs $50 more. Both will be downloads from the Mac App Store, and while there's no official word yet, a cursory reading of the current terms and conditions suggests that both Lion and Lion Server Edition will be installable on up to 10 machines associated with a user's iTunes account.

    So our graphic is wrong in one sense: while you could buy multiple copies of Lion for the same price as the equivalent Windows software, you don't actually have to. If anything, this makes Lion an even more economical prospect than Windows. Even if you want to make the argument that it'd take a Server Edition upgrade to put Lion's feature set on parity with Windows 7 Ultimate Edition (an assessment with which we'd politely disagree), Windows 7 is still only installable on one machine. Therefore, even with "Lion Server Edition" costing a total of $80, that's $80 for a 10-machine license under the current terms and conditions versus $220 to install Windows 7 Ultimate Edition on one machine. Put another way: for the amount of money you'd pay for a single-machine license for Windows 7 Ultimate Edition, you could install Mac OS X Lion and its server tools on 20 machines and still have 60 bucks left over. If you're like us and you think Lion doesn't need the server tools to be on parity with Windows 7 Ultimate, you could install Lion on 70 machines and buy yourself a six-pack for the same price as one Windows 7 Ultimate license.

    Apple charged $129 for Mac OS X Leopard and older iterations of its operating system, which were still considered bargains against the pricing of equivalent Windows packages. But Lion's incredibly low cost compared to that of Windows merely demonstrates what we've known all along: Apple is, at its heart, a hardware company. It makes money off its hardware; the only purpose of the software is to make the hardware sing. iTunes? Free. iCloud? Free. iOS? Free. Mac OS X? 30 bucks. Microsoft, on the other hand, is primarily a software company dependent on hardware makers to run its software. Xbox 360 and some minor pilot projects aside, Microsoft makes the overwhelming majority of its money off licenses of Windows and Office. With that in mind, it's little wonder that Microsoft's software costs so much more... or that Apple is currently cleaning Microsoft's clock financially.
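    Here's a toy comparison in Python using the prices quoted above (note that the 10-machine allowance is this post's reading of the App Store terms, not an official figure):

    ```python
    LION = 29.99               # Mac OS X Lion, per Mac App Store purchase
    LION_SERVER = 29.99 + 50   # Lion plus the server upgrade
    WIN7_ULTIMATE = 220.0      # Windows 7 Ultimate, single-machine license
    MACHINES_PER_LICENSE = 10  # the post's reading of the App Store terms

    def cost_per_machine(price, machines=1):
        return price / machines

    print(f"Lion:        ${cost_per_machine(LION, MACHINES_PER_LICENSE):.2f}/machine")
    print(f"Lion Server: ${cost_per_machine(LION_SERVER, MACHINES_PER_LICENSE):.2f}/machine")
    print(f"Windows 7:   ${cost_per_machine(WIN7_ULTIMATE):.2f}/machine")

    # For the price of one Windows 7 Ultimate license, you could cover:
    licenses = int(WIN7_ULTIMATE // LION)   # 7 Lion purchases...
    print(f"{licenses * MACHINES_PER_LICENSE} machines, "
          f"${WIN7_ULTIMATE - licenses * LION:.2f} left over")  # ...70 machines, ~$10
    ```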

  • Malware, Macs, and crying wolf: Doing the math

    by Richard Gaywood
    05.14.2011

    Love Apple gear? Like math? TUAW's Doing the Math series examines the numbers and the science that lie behind the hardware.

    The contentious subject of Mac security has been back in the news in recent weeks, following the emergence of a fake antivirus package called MacDefender (also known as Mac Security and Mac Protector) that managed to steal a number of users' credit card details, and a new piece of "crimeware" called Weyland-Yutani BOT, which allows non-technical hackers to easily create password-grabbing webpages that specifically target Mac browsers. This prompted a fresh round of "the Mac is under attack! Malware will drown us all! Exclamation!" blog posts, followed by the usual backlash against them.

    On the alarmist side, Ed Bott wrote "Coming soon to a Mac near you: serious malware", predicting doom, gloom, and dogs and cats living together. The case for the defence was eloquently made in an article entitled "Wolf!" by Mac uber-blogger John Gruber, in which he simply collected assorted "Mac malware is inevitable" quotes from prominent analysts... going back to 2004, and all clearly unfulfilled in the sense of widespread attacks or exploits in the wild. Bott responded with a thoughtful post making a more reasoned case that malware for Macs really is inevitable in the long run, regardless of how inaccurate previous predictions have been.

    So who's right, and who's wrong? Is it time to run to the hills, or are people just sounding the gong of panic unnecessarily? In this post I'm going to try to dive a little deeper into the issues surrounding Mac malware, hypothetical and real, and separate the headlines from the facts.

  • Keeping SSDs in TRIM: doing the math

    by Richard Gaywood
    03.27.2011

    Love Apple gear? Like math? TUAW's Doing the Math series examines the numbers and the science that lie behind the hardware.

    One of the new features we first saw in the developer beta of Mac OS X Lion back in February is, in this correspondent's humble opinion, long overdue: it finally supports TRIM on solid-state drives. TRIM (which, despite the capital letters, isn't an acronym) is a way to speed up SSD access by performing important housekeeping tasks in the background, or when files are deleted, rather than leaving them until the user is writing data to the drive. Since then, TRIM has also appeared in 10.6.6 for new Macs with Apple-supplied SSDs only, and with third-party tools it's now possible to get TRIM running on any SSD under 10.6.7.

    This raises the question: what exactly is TRIM, and why does it matter? I'm here with my best Science Hat on to remove all that wonder (as we scientists so often do) and replace it with cold hard fact.
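    To make the housekeeping idea concrete, here's a toy model in Python (a pure illustration of the scheduling difference; real SSD firmware is far more complex than this):

    ```python
    # Toy model of why TRIM helps: flash blocks must be erased before they can
    # be rewritten, and erasing is slow. Without TRIM, the drive only learns a
    # block is free when the OS overwrites it, so the erase lands on the write
    # path; with TRIM, the OS signals deletes and the erase can happen early.

    ERASE_COST, WRITE_COST = 10, 1  # made-up time units

    class ToySSD:
        def __init__(self, trim_enabled):
            self.trim_enabled = trim_enabled
            self.dirty = set()          # blocks still holding stale deleted data

        def delete(self, block):
            if self.trim_enabled:
                return ERASE_COST       # erase now, during idle time
            self.dirty.add(block)       # drive doesn't know the block is free
            return 0

        def write(self, block):
            if block in self.dirty:     # must erase-before-write: slow!
                self.dirty.discard(block)
                return ERASE_COST + WRITE_COST
            return WRITE_COST

    for trim in (False, True):
        ssd = ToySSD(trim)
        ssd.delete(42)                  # user deletes a file
        latency = ssd.write(42)         # later, user writes new data there
        print(f"TRIM {'on ' if trim else 'off'}: write latency = {latency}")
    # TRIM off: write latency = 11; TRIM on: write latency = 1
    ```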

  • The 4-inch iPhone display: Doing the math

    by Chris Rawson
    02.16.2011

    Love iPhones? Like math? More in our Doing the Math series, examining the numbers behind the hardware.

    Recent rumors suggest the iPhone 5 might have a 4-inch screen, slightly larger than the 3.5-inch screen in the iPhone 4. I was skeptical of this rumor at first, because I thought boosting the screen size would require Apple to increase the overall size of the handset, too. I fiddled with some numbers and determined that while it's theoretically possible for Apple to put a 4-inch screen in the iPhone 5 while retaining its current 3:2 aspect ratio, it's unlikely to happen, for a few reasons.

    If Apple increases the iPhone 5's screen size to 4 inches but keeps the 3:2 aspect ratio all iPhones have had thus far, the overall dimensions work out to 3.32 x 2.22 inches. But the iPhone 4 handset's overall width is only 2.31 inches; if Apple wants to keep the iPhone 5 around the same overall size as the iPhone 4, that leaves only 0.045 inches (about 1 millimeter) on either side of the display. That's not a lot of clearance between the screen's edge and the edge of the handset itself; in fact, it essentially means the screen would cover the entire width of the front faceplate.

    Apple could work around that issue by slightly increasing the iPhone 5's width, but there's another problem. If Apple increases the screen size to 4" but retains the same 960 x 640 pixel dimensions, the PPI (pixels per inch) value drops to about 289 -- well below the iPhone 4 Retina Display's 326 ppi, and just below the 300 ppi rule-of-thumb threshold for "Retina Display" quality. To maintain 326 ppi, the pixel dimensions of a hypothetical 4-inch, 3:2 screen must increase to the neighborhood of 1080 x 720, plus or minus a few pixels. App developers would then have three sets of resolutions to support for the iPhone instead of two, and scaling from 960 x 640 to 1080 x 720 wouldn't be anywhere near as simple as the pixel-doubling that got developers by in the early days, before they were able to scale apps up from 480 x 320. Worse, any apps kept at 960 x 640 and "zoomed" to fill the new pixel dimensions would probably look pretty terrible; instead of doubling the pixels, as happened in the early Retina Display era, the scale works out to a 1 1/8 "zoom."

    What if Apple didn't stick with a 3:2 aspect ratio for the iPhone 5's screen and switched to something different? Read on to find out.
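    Those figures are easy to check. Here's a short sketch (my own Python; the rounding differs by a hair from the figures in the post):

    ```python
    import math

    def screen_dims(diagonal_in, tall, wide):
        """Height and width in inches for a given diagonal and aspect ratio."""
        unit = diagonal_in / math.hypot(tall, wide)
        return tall * unit, wide * unit

    def ppi(px_w, px_h, diagonal_in):
        return math.hypot(px_w, px_h) / diagonal_in

    h, w = screen_dims(4.0, 3, 2)                 # portrait 3:2 screen
    print(f'4" 3:2 screen: {h:.2f} x {w:.2f} in')        # ~3.33 x 2.22 inches
    print(f'960x640 at 4":  {ppi(960, 640, 4):.0f} ppi')   # ~288 (the post says ~289)
    print(f'1080x720 at 4": {ppi(1080, 720, 4):.0f} ppi')  # ~325, back near 326
    print(f"scale: {1080 / 960}")                 # 1.125 -- the awkward 1 1/8 zoom
    ```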

  • iPads and Retina Displays: Doing the math

    by Richard Gaywood
    01.18.2011

    Over the weekend, we saw the swirling rumors around the specs for the (presumably inevitable) iPad 2 start to come together. One of the most intriguing suggestions, which Engadget claims to have a reliable source for (and MacRumors some corroborating evidence to boot), is a higher-resolution screen to match the iPhone 4's Retina Display -- specifically, doubling in both directions, from 1024x768 to 2048x1536. This has prompted some discussion about exactly what Retina Display means, and whether this would count. The iPhone 4's screen is a mammoth 326 pixels per inch (ppi), whereas this rumored new iPad resolution works out to a somewhat lesser 264 ppi. However, I believe it's just as valid for Apple to call this a Retina Display as it was for the iPhone 4's screen, and after the break I will explain why, with some hopefully convincing mathematics.

    Firstly, though, it's important to stress that these are only rumors, and that 2048x1536 is an incredible number of pixels -- 3,145,728 of them, in fact. That's only about 15 percent fewer than the 27" iMac or 27" Cinema Display, and 52 percent more than a 50" 1080p television screen! This makes the screen expensive to make; it places greater strain on the graphics chipset needed to drive the screen, making that more expensive, too; and it won't do the battery life any favors either. All of this, to my mind, suggests this is one rumor that might come down to wishful thinking. As John Gruber said: "I'll believe it when I see it."
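    Here's a quick sketch of those comparisons (my own arithmetic in Python, using the nominal 9.7" and 3.5" screen diagonals):

    ```python
    import math

    ipad_retina = 2048 * 1536   # 3,145,728 pixels
    imac_27     = 2560 * 1440   # 3,686,400 pixels
    tv_1080p    = 1920 * 1080   # 2,073,600 pixels

    print(f'vs 27" iMac: {ipad_retina / imac_27 - 1:+.0%}')   # about -15%
    print(f"vs 1080p TV: {ipad_retina / tv_1080p - 1:+.0%}")  # about +52%

    print(f"iPad ppi:     {math.hypot(2048, 1536) / 9.7:.0f}")  # ~264 ppi
    print(f"iPhone 4 ppi: {math.hypot(960, 640) / 3.5:.0f}")    # ~330 (quoted as 326;
                                                                # the 3.5" is rounded)
    ```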