
Commentary: ARM has been growing for some time, but it may have just hit an inflection point.


My computing life has come full circle. In 2000, I went to work for an embedded Linux company, Lineo, and though my desktop (remember those?) ran x86, everything Lineo sold dealt with MIPS, RISC-based chips like Intel’s i960, and…ARM. For decades, many of us forgot about ARM thanks to the seemingly insurmountable rise of x86, though ARM remained highly relevant in mobile devices and elsewhere. But most recently, it’s x86 that is looking vulnerable.

Apple may have done the most to make ARM relevant in popular culture with its new ARM-based M1 processor, but relatively few people will ever own an ARM-based Mac. Virtually everyone, by contrast, will use an ARM-based mobile device or interact with web services powered by applications running on ARM-based compute instances on AWS or Microsoft Azure (ARM instances announced) or Google Cloud (Google has reportedly been working on ARM-based designs for years).

So is it an ARM world now? The obvious answer is “yes.”

It’s ARM all the way down

Whether you’re running apps on your phone or on the world’s fastest supercomputer, you’re most likely running ARM. Given recent events, that trend toward “more” just might kick into overdrive. ARM Limited, which for years has licensed its architecture for others to build chips, has always had plenty of friends. But with Nvidia’s $40 billion deal to acquire the company, ARM just got an aggressive, expansive buyer.

Nvidia has spent years expanding the market for its GPUs (graphics processing units) into general-purpose applications that have found ready buyers in ML/AI, high-performance computing (HPC), and more. Now it’s acquiring ARM Limited just as “the near future is all about vertically integrated [system-on-chip] ARM designs like the M1,” as PhoneGap cofounder Dave Johnson has highlighted.

It’s perfect timing, but according to Apache Software Foundation member Justin Erenkrantz, ARM’s rise has “been inevitable for close to a decade now.”

How so? Well, as the world becomes more mobile, it makes sense that chips designed from the start for stellar mobile performance would be winners. While x86 still wins on raw power, that’s not necessarily what buyers (particularly in phones, laptops, etc.) are looking for. ARM-based chips deliver better battery life, run cooler, and are starting to reach x86 speeds (or exceed them, as the AWS launch of Graviton2 EC2 instances suggests). They’re also cheaper to manufacture.
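How close ARM actually gets to x86 on speed is workload-specific, and the cheapest way to find out is to run the same code on both. Below is a minimal sketch in Go (the language and the benchmark are my illustration, not anything AWS or ARM prescribes): it hashes a fixed buffer and reports throughput, so the identical program can be timed on an x86 instance and again on a Graviton2 instance such as an m6g.

```go
package main

import (
	"crypto/sha256"
	"fmt"
	"runtime"
	"time"
)

func main() {
	// Hash the same 1 MiB buffer repeatedly and report throughput, so the
	// identical program can be compared across architectures.
	data := make([]byte, 1<<20)
	const rounds = 2000

	start := time.Now()
	for i := 0; i < rounds; i++ {
		sha256.Sum256(data)
	}
	elapsed := time.Since(start)

	fmt.Printf("arch=%s rounds=%d elapsed=%s throughput=%.1f MiB/s\n",
		runtime.GOARCH, rounds, elapsed, float64(rounds)/elapsed.Seconds())
}
```

Pair the throughput number with each instance type’s hourly price and you get the price/performance comparison that has driven much of the interest in Graviton2.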

All of which promises to make life unpleasant for the x86 incumbents. Except for…developers.

My PC, my cloud?

While there is clearly demand for ARM running in the cloud, Linux creator Linus Torvalds recently knocked back the idea that ARM would take over simply because it’s cheaper/faster/whatever. The key to ARM dominating in the cloud (and elsewhere) may come down to how prevalent it becomes on the machines developers use to build their apps.

As Torvalds told Steven J. Vaughan-Nichols in an email interview, “my argument wasn’t that ‘ARM cannot make it in the server space’ like some people seem to have read it. My argument was that ‘in order for ARM to make it in the server space, I think they need to have development machines.’”

This makes sense, and though comparatively few developers will be running Apple’s M1 processor anytime soon, most applications don’t run on laptops anymore; they run on mobile devices (smartphones, tablets), nearly all of which already run on ARM. Even applications optimized for laptops (and beyond) benefit from ARM’s focus on customizability more than from raw x86 horsepower. For example, Apple can tune its ARM-based silicon for ML-centric applications in a way that it simply can’t with Intel’s x86. That turns out to be a trump card.
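To make that customizability point concrete for developers, here is a hedged sketch of isolating an architecture-specific code path with Go build constraints (the package and function names are hypothetical; the same idea exists in most compiled languages):

```go
//go:build arm64

// Package mathops is a hypothetical example of an ARM64-only fast path.
// A sibling file guarded by //go:build !arm64 would hold the generic
// fallback, and the toolchain picks the right file at build time.
package mathops

// DotProduct is where an ARM64-specific routine (NEON-accelerated
// assembly, a vendor math library, etc.) could be slotted in; the plain
// loop keeps this sketch self-contained. It assumes a and b have the
// same length.
func DotProduct(a, b []float32) float32 {
	var sum float32
	for i := range a {
		sum += a[i] * b[i]
	}
	return sum
}
```

From an x86 laptop, GOOS=linux GOARCH=arm64 go build produces the ARM64 binary; actually running and profiling it is where native ARM hardware, whether an M1 Mac or a Graviton2 instance, comes back into the picture.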

Nothing changes overnight. Will we see x86 deployed for the foreseeable future? Of course we will. But this “little mobile chip architecture” will play an increasingly central role in computing over the next decade. Fast forward to 2030, and it’s very, very likely that the entire computing landscape will look completely different.
