Why is the Apple M1 Chip so successful?
Joe Jia
Ever since the release of so-called "Apple silicon", many have asked the important question of why this specific SoC (system-on-chip) is such a success. Many factors contributed to the development of this chip, but in short, it comes down to Apple's decision-making in the 2010s and its continual use of in-house chips across its mobile lineup.
First off, I think a major reason this chip performs so well is the combination of the hardware architecture ARM offers and the powerhouse that is Apple's in-house design team. Starting with the hardware architecture: ARM. ARM as an architecture is more of a way of thinking. It stems from the RISC philosophy, the idea that instruction-set simplicity leads to performance gains and extra efficiency. ARM originally stood for "Acorn RISC Machine": Acorn is the company that developed ARM, and RISC is the hardware "philosophy" of sorts the architecture is mainly based on.
This underlying philosophy is what made ARM the company it is now. ARM mainly licenses base core designs and its instruction-set architecture to other companies, which then adapt the designs to what each device requires. They've gotten pretty good at this, and the major reason it matters is the remarkable power efficiency ARM's design philosophy and instruction set offer. For mobile computing (phones, tablets, and so on), ARM is a superior platform to the rather complex and clunky desktop platforms like x86 (what Intel and AMD use in their desktop and laptop chips).

Back in the 2000s, while Apple was transitioning its desktops and laptops from PowerPC to Intel, it approached Intel for a custom mobile chip for the upcoming iPhone. Intel said no, citing concerns that the iPhone wouldn't sell well and confusion about what such a device would even be used for. We all know how the iPhone did; that refusal would prove to be one of the worst decisions Intel has ever made. Apple had already been using ARM designs in other strong-selling products, a great example being the iPod. Having observed the architecture's power efficiency back in the Newton and iPod days, and having experience with both the architecture and the company, Apple chose ARM for the iPhone.

Apple continued its own in-house chip designs based on the ARM architecture and began making huge leaps in performance and efficiency on its phones. That chip-design experience, plus the benefits ARM could bring to desktop systems if, and only if, software supported it, is what convinced Apple to design and release the M1 chips. A crazy statistic: Apple's newest in-house ARM chips for its flagship phones outperform a desktop Core i9-10900K in some benchmarks. And these chips barely sip power, posting insane performance-per-watt numbers.
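Performance per watt is exactly what it sounds like: benchmark score divided by average power draw. A minimal sketch of the arithmetic, using made-up placeholder numbers (these are not real benchmark results for any chip):

```python
def perf_per_watt(score: float, watts: float) -> float:
    """Performance per watt: benchmark score divided by average power draw."""
    return score / watts

# Hypothetical numbers, purely to show why mobile SoCs win this metric:
desktop = perf_per_watt(score=11000, watts=125)  # power-hungry desktop chip
mobile = perf_per_watt(score=7500, watts=15)     # efficient mobile SoC

print(f"desktop: {desktop:.0f} points/W")  # 88 points/W
print(f"mobile:  {mobile:.0f} points/W")   # 500 points/W
```

Even a chip with a lower absolute score can be several times more efficient, which is the whole game for battery-powered devices.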
Seeing this, Apple asked whether the same approach could be adapted to the desktop and laptop market as well. It turns out there is a very good reason everyone else avoided ARM for these applications: since Intel created x86, Intel (and AMD) have cornered that market, and nearly all desktop apps require that instruction set. For the newly born smartphone market this was no big deal, since there was no existing app library tied to that architecture. But Apple's in-house software and OS team is no joke: it has quite the OS in macOS and could adapt it to ARM-based chips. The first M1 chips run native apps efficiently and extremely fast, and they cope with emulated apps as well, since Apple wrote "Rosetta 2" to translate x86 applications to the ARM architecture. This has proven to be a huge win for Apple. By providing both the software and the hardware (plus some serious cash), Apple has successfully proven ARM's ability to power great laptop and desktop chips. Apple's huge app market creates an incentive to develop apps for upcoming ARM-based Apple hardware, making the native app library even larger and increasing Apple's incentive to make even better chips. By throwing everything it has at the M1, Apple has found new and exciting frontiers and a great reason to keep designing and making its own chips.
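Rosetta 2's real machinery (ahead-of-time binary translation, caching, memory-model handling) is far more involved, but the core idea of mapping one instruction set onto another can be sketched with a deliberately tiny, invented instruction set. Nothing below resembles real x86 or ARM encodings; it only illustrates that one complex source instruction may expand into several simpler target instructions:

```python
# Toy sketch of ISA-to-ISA translation. These "instruction sets" are
# invented for illustration; real translators like Rosetta 2 operate on
# machine code and must also handle flags, memory ordering, and more.
TRANSLATION_TABLE = {
    # source mnemonic -> equivalent list of target instructions
    "ADDMEM r1, [addr]": ["LOAD tmp, [addr]", "ADD r1, r1, tmp"],
    "INC r1": ["ADD r1, r1, #1"],
    "MOV r1, r2": ["ORR r1, r2, #0"],
}

def translate(program: list[str]) -> list[str]:
    """Expand each source instruction into its target-ISA equivalent."""
    out: list[str] = []
    for insn in program:
        out.extend(TRANSLATION_TABLE[insn])
    return out

print(translate(["INC r1", "ADDMEM r1, [addr]"]))
# ['ADD r1, r1, #1', 'LOAD tmp, [addr]', 'ADD r1, r1, tmp']
```

Doing this expansion once, ahead of time, and caching the result is a big part of why translated apps on the M1 feel far faster than traditional instruction-by-instruction emulation.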
Another huge incentive for Apple to make the M1 was Intel's terrible track record with mobile chips; as mentioned above, the x86 architecture is a poor fit for mobile applications and wreaks havoc on performance-per-watt figures. Intel was on a wildly different track when Apple first set out to design its own chips for desktop usage: its newest and "greatest" chips were slow, clunky, and ran hot. Drawing 115 watts is acceptable in a desktop, but the puny heatsinks in Apple's impossibly thin laptops could not handle it. Turns out, people's laps hated it too (myself included)! Between the heat and Apple's decision to lean into that impossibly thin design as more of a "luxury" and "style" tech company, the move to ARM was to be expected. Apple is the only company with the cash and the experience to make the M1 possible, and it is on a rampage putting these chips into desktop products, with rumors predicting an M2 "Max" fielding nearly 40 cores inside the absolute top-of-the-line Mac Pros.
Speaking of Apple's focus on design, that is another factor that pushed the company toward ARM and eventually the M1 chips. Design has always been integral to Apple: even in the early days of the Apple II, Steve Jobs was obsessed with making everything beautiful, and later Apple products became obsessed with extreme thinness. These Macs were beautiful; starting with the iMac G4, the products have been simple yet elegant and contributed a lot to modern-day design. There was a problem, however: the obsession with design and thinness meant Apple had to make the heatsinks absurdly tiny, which limited compute power. It really isn't much of a computer if you can't compute on it. Instead of overhauling much of the Macs' design, Apple made a tradeoff and went with a completely new processor.

At the start it certainly was not an obviously sound financial decision to enter the desktop chip-design market, but Apple quickly realized that its ability to make its own operating systems and provide incentives for developers gave it a huge advantage. It also realized an upside of using ARM for both phones and laptops: the laptops could now run phone and tablet applications natively, giving the early M1 Macs even greater appeal to developers. The new M1 MacBook Air doesn't even have a fan, relying on passive cooling for its needs. This article is being written on an M1-based MacBook Air, and it really doesn't get warm at all. Even while running X-Plane 11 or playing fairly demanding games, it never got hot enough to burn me or throttle noticeably. It is crazy that Apple pulled this off. I really enjoy using Apple silicon: it has been a great deal faster than any laptop I have ever owned, and it has given me amazing battery life and computing efficiency.
Now, apart from the decision-making, there's also a lot of bleeding-edge tech inside this chip that makes it so efficient and fast. The process node the M1 is manufactured on is absolutely cutting edge: utilizing TSMC's 5-nanometer EUV node, Apple made an already incredibly efficient and powerful design into an even more powerful and efficient CPU. If the race toward smaller and smaller nanometers is unfamiliar, the node name roughly indicates how small a transistor can be, and in an ideal world, the smaller the transistor, the more efficient the chip. Or you could pull an Intel and stay on the 14nm node for a decade, just adding a few pluses to the end of the name... Recent advances in chip production culminated in ASML's EUV machines; EUV lithography has been in development since the 1980s and is a huge accomplishment for chipmakers trying to push the limits of Moore's law. 3nm is currently the most advanced node on these machines, albeit at a pretty penny. To put into perspective how advanced the M1 is manufacturing-wise: just a few years ago, this chip would not have been possible at 5nm. EUV machines are so new that 5nm chips were essentially impossible to make at volume until TSMC acquired enough of them to produce 5nm parts at a volume that made them cheap enough to justify using.
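Modern node names like "14nm" and "5nm" are largely marketing labels rather than literal feature sizes, but the idealized math behind shrinking is simple: if every feature shrinks by a factor k, roughly k² more transistors fit in the same area. A back-of-the-envelope sketch, taking the node names at face value purely for illustration:

```python
def ideal_density_gain(old_nm: float, new_nm: float) -> float:
    """Idealized transistor-density gain from a process shrink.

    Area per transistor scales with feature size squared, so density
    scales with the inverse square. Real nodes don't literally match
    their marketing names, so treat this as a toy calculation only.
    """
    return (old_nm / new_nm) ** 2

# Taking "14nm" -> "5nm" literally (which real processes don't justify):
print(f"{ideal_density_gain(14, 5):.1f}x")  # 7.8x the transistors per mm^2
```

More transistors per square millimeter means more cache, more cores, and wider execution units in the same die area, which is a big part of where the M1's performance-per-watt numbers come from.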
So after all this talk, what is the end result? From personal experience, the base model MacBook Air has been amazing. The base model has the same number of CPU cores as the upgraded configuration but one fewer graphics core. Gaming-wise it does fine; this is certainly not a gaming machine or a graphics-performance monster. The end game is that this is an ultrabook with some crazy power packed into an incredibly thin and efficient package. It's great for all the light tasks like watching videos and editing documents, but where it truly shines is productivity applications. Apple products have always been relatively good for productivity; they are usually aimed at "pro" creators. This, however, marks the first time an Apple laptop has been so powerful that, out of all the desktops and laptops aimed at creators, many still choose the M1. I know quite a few people who have swapped their wildly powerful and expensive desktops for the subtle and quiet M1 Mac mini. It's crazy how fast native applications run on the M1, and it's even crazier how well "Rosetta 2" translation from x86 to ARM works. Sure, it isn't native-level performance, but it is a huge improvement over the leg-burning MacBook Airs of yesteryear. The model I have is a base MacBook Air, and the fact that it has only passive cooling is crazy. I get way more than enough battery for an entire day of schoolwork, and gaming-wise it happily chugs through Civ 5. It is such a great machine, and knowing that Apple may release the next-generation M2 chips on March 8th, I am extremely excited for what they have.
Thanks for reading! There will be another article on ASML and the semiconductor industry coming soon; it might be a bit complex, but it sure is fun to comb through all this cutting-edge tech!