I can’t wait for laptops with Apple’s own chips

Apple making its own in-house chips for the Mac makes more sense than ever.


Apple might be ready to ditch Intel's x86 chips in the Mac in favor of a custom-designed piece of silicon. At least that's the story out of Bloomberg, which believes that a transition by Apple to its own CPUs could begin by 2020. It's just a single, as yet unsubstantiated story, but it's already made a dent in Intel's share price, even if Apple is hardly its biggest customer. And yet it's clear that between Intel's recent problems and Apple's successes, it's time that divorce proceedings begin.

The company has been down this road before. In 2005, it decided that PowerPC's failure to keep pace with Intel on performance was reason enough to switch. Despite previously bragging that its PowerPC chips were faster, Apple had to move its operating system and apps over to Intel's x86 architecture.

It's clear that Apple and any third-party chip supplier will have a relationship that would be politely described as problematic. After all, Apple prefers to control every part of its business, including every component that is in its hardware. You can see this in the iPhone, iPad and Apple Watch, which use custom internals and, because of that, rarely miss production deadlines. The Mac, meanwhile, has suffered from stop-start refresh cycles and component troubles, and much of the blame for that has been laid at Intel's feet.

Between the cost of Intel's hardware and the recent Spectre and Meltdown security flaws, there are plenty of reasons to take the business elsewhere. There's also the fact that Apple's in-house chip team is designing silicon that can, at least on paper, stand equal to the best Intel can offer. Geekbench tests of the iPhone X's A11 chip found that it was almost a match for the 2017 MacBook Pro. If Apple can beat an Intel chip in its smartphones without trying, imagine what it'll achieve when it bothers.

The A11 was also the first chip to ship with Apple's three-core custom GPU, designed after the company ended its relationship with Imagination Technologies. The British firm had supplied the PowerVR graphics in every iPhone since the first generation, but Apple decided to part ways with it in early 2017.

Apple released its first-generation graphics chip in the next iPhone a few months later, to the industry's surprise. The silicon squeezed out 30 percent more performance while using half as much power as the previous year's PowerVR part. Apple clearly has the skills, knowledge and expertise to build its own desktop chips, should it want to.

In fact, Apple's chip team is already producing Mac silicon, just not the CPU. The small T2 chip inside the iMac Pro controls the machine's boot process and FaceTime camera and protects your personal data. These functions were previously handled by the main CPU and its supporting chips, and you can bet that, even if Intel remains inside Apple's machines, its responsibilities will shrink.

I imagine that we won't see these chips pop up in the Mac Pro any time soon, or even the higher-end MacBook Pros. After all, those machines are designed to appeal to professional users who won't want to sacrifice their existing software setups. The MacBook, however, seems like an ideal candidate for Apple's first custom CPU.

The suffix-less machine is designed to be ultraportable, with a tiny footprint and a superlight chassis. If an existing A-series chip can already go toe to toe with a Core i5 CPU, then I would not be surprised to see a future iteration beat the Core m3 currently found in the base model. And since the emphasis here isn't on crunching heavy pro applications, users should see huge gains in battery life.

I've spent the better part of six months using an iPad Pro as one of my "travel" machines, often running Slack and Pages side by side. It's exactly the sort of work that Apple's lower-end machines are designed to handle without breaking a sweat. The one thing stopping me from adopting an iPad Pro as my main travel machine is that I like how messy macOS can be.

Because I stack browser windows and documents on top of each other like a deskful of papers, iOS's rigid multitasking can feel constricting. But if Apple can run macOS, or even a slimmed-down version of it, on a mobile device only slightly larger than an iPad, then I'm all in. Imagine if the MacBook packed the same 41.4-watt-hour battery as the company's tablet, but with the more energy-efficient chip, too.

If there's a downside, it's that nobody's particularly looking forward to yet another messy transition. Those of us who hold off on getting a new MacBook do so because we're not yet ready to buy all-new USB-C accessories, for instance. Or there are the folks who are still feeling sore that their expensive headphones won't work on the new iPhone without an adapter. With new in-house components, there could be a couple of years during which your particular application may not work as well as it could, or should.

It's likely that you'll also have to kiss Windows compatibility goodbye, at least in the very short term. It's worth remembering that Microsoft, too, is dipping a toe into running desktop software on mobile chips: the ASUS NovaGo runs Windows 10 on a Snapdragon 835. Admittedly, it does so with a boatload of caveats, including no support for 64-bit x86 apps, but it's still early days. And at least Apple has already pushed through its own 64-bit transition without too much grousing.

That may be the most exciting part, since Apple will have learned plenty of lessons from its PowerPC transition. At the time, Xcode could compile universal binaries that ran on both PowerPC and Intel chips, and pretty easily too. If the anecdotal evidence is anything to go by, the change may not be as painful as we might think. Not to mention that Apple has learned plenty since then and is doing its best to bring iOS and macOS closer together.

The only question is whether Apple can take what it has learned from the iPhone and iPad and hit the ground running in computers. But given how quickly it became a capable designer of mobile chips, I wouldn't bet against it.