Dr. Charlie Miller -- a man who has been covered extensively here at Engadget -- snagged a doctorate in mathematics from the University of Notre Dame. He spent five years working on cryptography for the National Security Agency. And, after heading into the wilds of security analysis, he was the first to find a bug in the battery of the first MacBook Air, various bugs within Mac OS X and the Safari web browser, and assorted bugs within iOS itself, all while racking up thousands of dollars in hacking contest prize money.
Last week, this came to a head, as Miller created a controversial proof-of-concept application that both proved the existence of an iOS security hole and got him expelled from the App Store's developer program. Given that he's driven Apple Inc. somewhat nuts over the past few years, we sat down with the good doctor to see how he felt about Apple, iOS, security, technology, sandboxing, the pros and cons of modern security, and the ups and downs of one of the weirdest career paths for any aspiring technologist today. Join us after the break for the full interview in both textual and audio form.
Your profile says that you completed a PhD in mathematics at Notre Dame and then went to go work in the bowels of the government, specifically the NSA for five years. What got you into software security and how did this lead to hacking?
So when I got to the NSA, I started out as a cryptographer, and my PhD is in math, so that was a good match. But as it turns out, when I got there, I didn't really like that. One of the cool things about the NSA is that they'll train you in whatever they have a need for and that you want to do. Computer security sounded cool to me, so they trained me up. I worked for them, and when I came out, I got a more traditional software security job -- one that I could actually talk about.
What prompted you to first enter the Pwn2Own contest back in '08 and essentially take on the first-generation MacBook Air?
These are just computers, they have bugs, and they have exploits too.
Believe it or not, back then people didn't believe Macs were vulnerable to anything. I would say to anyone who would listen, "Hey, these are just computers, they have bugs, and they have exploits, too," but no one would really believe me. One of the main things I wanted to do was show that Macs were vulnerable -- and at the time, actually quite a bit more vulnerable than, say, a comparable Windows system. So that's why I entered... to sort of prove a point by finding flaws in that security. Unfortunately, even that didn't really work: if you read the comments people were posting on articles about it, no one believed it. They were like, "Oh, well sure, if you give him physical access," and of course I didn't have physical access. Basically, I did it to prove that Macs were as vulnerable as anything else, and actually more so at the time, and no one really believed me. And that's why I went back the second year.
Where were you finding the vulnerabilities back then?
They were all in the browser, Safari. I think the first one was one of a bunch of bugs, and then the next year, in 2009, it was a bug in the way they parsed fonts, which again was accessible through the browser. The good news was that after the second time, people kind of started to believe that I wasn't just full of it -- that these things really were as vulnerable as other devices, and actually more so.
A lot of people say that your efforts are intentionally focused on Apple. Is there something inherent about the company's products or philosophy, or are these just the pieces of your work that tend to garner more notice?
It's a direct bonus for me when I find bugs and they get fixed.
One of the main reasons is that I use Apple products. I'm talking to you on an iPhone right now, and my computer is a MacBook. I'm familiar with them, and it's a direct bonus for me when I find bugs and they get fixed... then the stuff that I use is more secure. That's part of it. The other thing is that back when I started in 2007, flaws were a lot easier to find in Apple products than in Windows, and easier to exploit, because they didn't have the same anti-exploit technology that Windows had. Now that's not really the case anymore -- it's sort of an even playing field -- but back then it was quite a bit easier, so, bang for your buck, I could write three or four exploits for OS X in the time it'd take me to write one for Windows. It was a better investment of my time back then, but not anymore.
As far as where the security was back when you first started, how much more open and vulnerable would you describe OS X as being compared to the Windows operating systems of that time?
They were quite a bit more vulnerable. If you look at something like Vista or Server 2003, those already had the main anti-exploitation technologies; for the equivalent on OS X, you had to wait for Lion, just a few months ago. As an example, when I won the very first one in 2008, there was no DEP (Data Execution Prevention), so you could just write your exploit code to memory and jump right to it -- very trivial to exploit. The actual exploit I used would not have worked with DEP in place. I didn't have to do anything fancy at all to write the exploit; the fact that I could just write to memory and then execute it allowed me to do it, and neither of those things was possible on the Windows side. So, the good news is that things have changed. If you look at Lion, it's as hard to write an exploit for as Windows 7, so they're very much caught up. Back then, they were very easy pickings.
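The DEP behavior Miller describes can be sketched as a toy model. This is purely illustrative -- real DEP is enforced by the CPU's memory-management hardware and the OS, not by application code -- but it shows the policy: a data page you can write to is not a page you can jump to.

```python
# Toy model of DEP (W^X): a page may be writable or executable, never both
# at once. Illustrative sketch only; the class and functions are invented
# here to model the policy, not any real OS API.

class Page:
    def __init__(self):
        self.data = b""
        self.writable = True
        self.executable = False  # DEP default: data pages are not executable

def write(page, payload):
    if not page.writable:
        raise PermissionError("page is not writable")
    page.data = payload

def execute(page):
    # Under DEP, jumping into a freshly written data page faults.
    if not page.executable:
        raise PermissionError("DEP violation: page is not executable")
    return f"ran {len(page.data)} bytes"

page = Page()
write(page, b"\x90\x90\xc3")   # attacker writes "shellcode" into a data page
try:
    execute(page)              # pre-DEP OS X (2008): this would just run
except PermissionError as e:
    print(e)                   # with DEP: the jump is blocked
```

With the pre-2008 behavior Miller describes, `execute` would simply succeed on the attacker-written page; DEP turns that one-step exploit into a fault.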
Based on what you've seen in the Mac OS X 10.6 and 10.7 versions, what has Apple done security wise that's been a step in the right direction?
They fixed a lot of bugs, for one thing, but that's hard to measure when you're comparing progress against other operating systems. But some good things they've done on the engineering side that are measurable are the things I've talked about. They've added anti-exploitation technologies: ASLR, which randomizes the memory layout, and DEP, which separates code from data. That's basically the industry standard for making exploits hard. Lion has also introduced sandboxing in Safari. So now, the thing I usually break into is harder to exploit, and when you do break in, you typically end up in a process within the browser that's sandboxed, which makes it at least twice the work because it requires two exploits as opposed to one.
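The value of the ASLR technique Miller mentions can be sketched with a toy loader. Everything here is invented for illustration (the offsets, the address range, the "library"); real ASLR randomizes the actual layout of a process's address space, but the effect on an attacker is the same: a hardcoded absolute address almost never lands.

```python
import random

# Toy model of ASLR: each "run" loads the library at a random page-aligned
# base, so an exploit that hardcodes an absolute address almost always
# misses. Illustrative sketch only.

FUNC_OFFSET = 0x1234             # known offset of a useful routine in the library

def load_library():
    """Pick a random page-aligned base address for this run."""
    return random.randrange(0x10000, 0x80000000, 0x1000)

# The address the attacker observed on *their* machine (base 0x20000000):
hardcoded_target = 0x20001234

hits = 0
for _ in range(10_000):
    base = load_library()        # every run gets a fresh random base
    if base + FUNC_OFFSET == hardcoded_target:
        hits += 1

print(f"exploit landed {hits} times out of 10000")
```

Without randomization the attacker's address works every run; with it, the exploit has to guess or leak the base first, which is exactly the extra work ASLR is meant to impose.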
Your iOS findings have been noted most prominently. Do you believe Android has similar weaknesses to iOS, and are there ways in which Google's operating system is more prone to attack?
I pick on iOS a lot, but Android OS is actually in worse shape than Apple is in terms of mobile security.
I pick on iOS a lot, but Android OS is actually in worse shape than Apple is in terms of mobile security. They've only just caught up in terms of anti-exploitation technologies. But the thing that really makes Apple different is the way that they guard against malware, and that's what my most recent research is about. The Android side started with a different philosophy: you can download anything you want and run it. The problem with that is that malware exists for it, and you can download and run that malware just as easily.
And then you're sort of in the same boat Windows was in for years -- you have to be careful about what you download. On the Apple side, they went with the approach that Apple has all the control. What that means is you can't download just any program and run it; you can only download from the App Store. For apps to get there, Apple has to take a look at them, approve them, and sign them. The bad news is that you can't download anything you want, but the good news is that you can feel pretty safe downloading and running programs, knowing they won't do something bad. While you've seen some malware for Android, you haven't really seen any for iOS, for exactly this reason.
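The gating logic Miller describes -- only signed, store-approved code runs -- can be sketched in a few lines. Note the big simplification: Apple actually uses public-key signatures, so the device only holds a verification key; the HMAC with a shared secret below is just a stand-in to keep the sketch short, and all the names are invented for illustration.

```python
import hashlib
import hmac

# Toy sketch of code signing on a locked-down platform. An HMAC with a
# hypothetical secret stands in for Apple's real public-key signatures;
# the point is only the gate: unsigned or tampered code never runs.

SIGNING_KEY = b"store-signing-key"   # hypothetical; only the "store" holds it

def sign(app_bytes):
    """The store reviews an app, then signs its bytes."""
    return hmac.new(SIGNING_KEY, app_bytes, hashlib.sha256).hexdigest()

def run_app(app_bytes, signature):
    """The OS refuses to execute anything whose signature doesn't verify."""
    expected = hmac.new(SIGNING_KEY, app_bytes, hashlib.sha256).hexdigest()
    if not hmac.compare_digest(signature, expected):
        raise PermissionError("unsigned or tampered code: refusing to run")
    return "app ran"

app = b"legit app"
sig = sign(app)                      # store review + signing
print(run_app(app, sig))             # signed code runs

try:
    run_app(b"malware", sig)         # modified/unsigned code is rejected
except PermissionError as e:
    print(e)
```

The bug Miller demonstrated matters precisely because it let code slip past this gate after review, which is why he calls it a break in "the way they protect from malware."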
Getting back to Apple, what sort of security weaknesses do you think are really out there for iOS, even with iOS 5.0 out and 5.0.1 in its second beta at this point in time?
There's obviously the one that I'm talking about, which has to do with running unsigned code, and that basically breaks what I was describing before -- the way they protect from malware. If we jump ahead a month, or however long it's going to take them to patch that, iOS is actually really solid. As I mentioned, it has the anti-exploitation technologies lined up; it has some encryption, although it's not that great; and at least there are ways to set it up for remote wipe and remote lock, which are important for mobile devices, because you're way more likely to lose your phone in the backseat of a taxi than to actually get attacked by someone.
In your opinion, is it worth jailbreaking either an iOS or an Android device, or do you feel that the risks outweigh the benefits that you are going to get from this?
Jailbreaking doesn't just turn on good things; it turns off a lot of the security, so you really have to want whatever it is that you need the jailbreak for, in order to give up all this security.
Definitely, when you jailbreak an iOS device, you've really affected the security of the device. You turn off code signing, which means you can download anything -- but it also means that anything you download can be bad code, too. That's the one thing everyone knows about, but jailbreaking affects a lot more, too. It adds a bunch more code, and it starts running things as root: for example, you might have an SSH server running as root, or other programs you install running as root, which is of course a higher privilege level, and a stock iOS device is very careful not to let you do that.
So you've got new programs running, and they aren't running in the sandbox anymore. Jailbreaking also turns off a lot of the memory protection, so the thing I was talking about before -- what the flaw I found this time gives you -- you get automatically on a jailbroken device. It doesn't just turn on good things; it turns off a lot of the security, so you really have to want whatever it is that you need the jailbreak for, in order to give up all this security. For most people, I can't imagine it's worth it, but if there's something you just have to have, then I guess it is. For me, for research purposes, I usually have to jailbreak my devices, but right now my phone is not jailbroken.
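The sandbox cost Miller keeps returning to -- one exploit gets you into a confined process, and only a second, separate bug gets you out -- can be modeled as a toy. The class and function names are invented for illustration; real sandboxes are enforced by the kernel's policy layer, not by the application.

```python
# Toy model of why sandboxing roughly doubles an attacker's work: popping
# the browser's renderer only yields sandboxed privileges, and a second
# (sandbox-escape) bug is needed to touch the rest of the system.
# Illustrative sketch only.

class Process:
    def __init__(self, sandboxed):
        self.sandboxed = sandboxed

def read_user_files(proc):
    """Stand-in for any sensitive operation the sandbox policy denies."""
    if proc.sandboxed:
        raise PermissionError("sandbox: filesystem access denied")
    return "user data"

renderer = Process(sandboxed=True)

# Exploit #1: the attacker now runs code inside the renderer...
try:
    read_user_files(renderer)      # ...but the sandbox still blocks it
except PermissionError as e:
    print(e)

# Only exploit #2 (a separate sandbox-escape bug) would reach an
# unconfined process like this one:
unconfined = Process(sandboxed=False)
print(read_user_files(unconfined))
```

Jailbreaking, as Miller notes, hands out that second step for free: new programs simply run outside the sandbox.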
In terms of just overall security, Mac or desktop, if you had to pick any company that's doing it the right way in your opinion -- it could even be webOS -- who do you feel is doing it right?
Hrmmm. That's a good question. It used to be that there were enough differences to let you differentiate. When I got into the game, OS X was clearly behind, and you could prove that even to someone who didn't want to listen. But now, most OSes are close enough in their level of security that it's really nit-picky to try to separate them. They're all pretty good. I hate to pick one over the others, because the differences you see now are so small compared to what they were that they're all pretty decent at this point.
Getting to the stickiest point at this moment in time: do you feel that Apple's removal of your App Store developer account was excessive? What would you have done if you were in their shoes?
So regardless of the consequences, I thought I needed to let people know about this, just so they can make informed decisions about the security of their devices.
Whether it was excessive, I don't know. I certainly broke the terms of service, and they had the right to remove my account, but I didn't really consider -- I didn't really care -- what they would do. I thought it was important enough to let people know about these problems. I couldn't sit on the fact that you could potentially be downloading malware from the App Store, when up until now everyone -- Apple, myself -- considered it as, "basically you don't get malware on iOS, right?"
So regardless of the consequences, I thought I needed to let people know about this, just so they can make informed decisions about the security of their devices. If they still download everything from the App Store, great. If they wanted to download my apps, that would've been fine. If they're worried about their security, then they know to sort of hang out and wait till the patch comes out, and then they can start downloading again. And even then, they'll know in the back of their head that there's going to be another bug, like the one Charlie Miller talked about, and they need to be a little careful. As far as getting my account removed, it's a bummer; I wish they wouldn't have done it, and I think it's shortsighted on their part. But from my perspective, I don't really care. For me, it's more important to get the message out than whether I have a developer account.
Do you believe you can get the account back?
I don't know how [laughter]. Maybe, but I don't have any idea how to make that happen.
Maybe some nice stationery and an apology letter?
Like I said, I don't think I have anything to apologize for. Given the choice I'd do it again, because I think letting people know about the risk is more important than the $99 account I had.
At this point, what are your overall feelings about Apple, its security and its overall product line?
They've made a ton of progress. If you look back at the iPhone when it first came out in 2007, it was a disaster, security-wise. The web browser ran as root, and that's about as bad as it gets. There was no sandboxing, no code signing, nothing. Compare that with the iPhone now, which I say quite openly is a very secure device. As far as Apple itself, I don't get much insight into what they do or the people who are there -- it's sort of a closed community. I know they've been hiring some pretty smart people, and that's been paying off. Unfortunately, this latest incident has sort of put a damper on my outlook for them, but hopefully it will work itself out.
What would you tell someone looking to follow in your footsteps as a security analyst?
I think the best thing to do is jump in, get your hands dirty and try and learn that way.
I get this question a lot, and unfortunately it's hard to follow my exact footsteps, because not everyone can get a job at the NSA and have them train you. Beyond that, the advice I give out to a lot of people is to read some books; there are ones that I recommend. And then, most importantly, just jump in: find a program that interests you and tear it apart, see how it works. See if you can find a flaw. Look at exploits.
Download the program that I wrote. Figure out how it works: what was the flaw, why was it there, how did it work? Computer security is a really hands-on activity, and that's why I'm sure it's not properly taught at a university. It's more of a craft; you have to have an apprenticeship, and I had that at the NSA in their three-year program, where I did nothing but learn, and then I was ready to go. I think the best thing to do is jump in, get your hands dirty and learn that way.