In addition to the myriad technical challenges for Apple switching from x86 to ARM, there are a number of business reasons why this makes no sense.
Apple is fundamentally a systems company. From their inception, they have avoided designing chips and emphasized the overall product. Apple’s strength is identifying new markets and creating products with a compelling user experience and buzz factor that unlocks the consumer’s desire for new devices and services. Software and industrial engineering play a central role for Apple. They have certainly influenced the direction of their hardware partners (e.g. AltiVec, OpenCL), but chip design is not a core expertise. Early versions of the iPhone were designed by Apple in tandem with Samsung. With PA Semi and Intrinsity, Apple now has some chip expertise in-house. However, those teams have only been at Apple for around 3 years, so they are just beginning to tape out their first chips at Apple. There is relatively little evidence that these teams are prepared for a much larger and more complex project.
Trying to design an ARM microprocessor that matches the performance and efficiency of competing Intel or AMD products is very risky. Even a cutting-edge foundry like IBM or GlobalFoundries is at a significant disadvantage compared to Intel’s 22nm tri-gate process. Assuming that Apple or a partner has the expertise to design such a chip, it is far from clear that their execution would be good enough. The downside risk is substantial; a botched transition could tank sales of their notebooks for two years or more. If Apple’s new MacBooks were slower than the older x86 generation, it would be incredibly embarrassing and nearly impossible for Apple to justify a premium for their products.
Another business consideration is the rest of the ecosystem for notebooks, in particular the GPU. Apple relies on four companies for graphics – AMD, Imagination Technologies, Intel and Nvidia. Imagination is primarily focused on smartphones, tablets and embedded systems, where power efficiency and battery life matter more than performance. Aiming for notebook-level performance with an unproven CPU and unproven GPU at the same time is sheer madness, and Apple is unlikely to take such a risk. Intel and AMD would have no interest in supplying their graphics for an ARM-based SoC, nor do they have ARM drivers. That leaves only one option. Nvidia is assuredly developing drivers for ARM and Project Denver, and also has some in-house binary translation expertise. But moving to an Nvidia platform would effectively lock Apple into a single source – an arrangement they have tried to avoid for years. Even with x86, there are two options for microprocessors and three for graphics.
An added complication is that Apple would have to split their computer line between ARM notebooks and x86 desktops (and servers). That would increase their validation and support costs to deal with different hardware and software. While a great deal of software is available electronically, the inevitable confusion and compatibility headaches for consumers would be significant.
The strongest argument for Apple sticking with x86 is that it meets their needs quite well. Intel and AMD are both focused on the notebook and desktop market, and are actively pushing towards lower-power, more efficient products. Apple has good relationships with both companies, and can play them off against one another for their own benefit. Even today, they are mixing and matching Intel microprocessors with integrated graphics that can switch to AMD’s superior discrete GPUs for the best efficiency and performance. Both times that Apple migrated, their older platform was lagging in performance (68K and PowerPC) and/or unable to compete in key products (e.g. no PowerPC notebooks). In each case, their business would have suffered substantially if they had not switched. With AMD and Intel hardware to choose from, there is no reason to believe that Apple’s systems are under threat. They have access to world-class hardware for their entire product line, and history suggests that as long as that is true, they will stay with x86.