[Image: Apple M1 chip on a black background. Courtesy of Apple]

How Apple’s Monster M1 Ultra Chip Keeps Moore’s Law Alive
WIRED, April 11, 2022
Business
By Will Knight

“By combining two processors into one, the company has squeezed a surprising amount of performance out of silicon.”

For practical purposes, the M1 Ultra acts like a single, impossibly large slice of silicon that does it all. Apple’s most powerful chip to date packs 114 billion transistors into more than a hundred processing cores dedicated to logic, graphics, and artificial intelligence, all of it connected to 128 gigabytes of shared memory. But the M1 Ultra is in fact a Frankenstein’s monster: two identical M1 Max chips bolted together by a silicon interface that serves as a bridge. This clever design makes the conjoined chips behave as a single, larger whole.

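Those headline figures line up with a simple doubling of the M1 Max’s published specifications: 57 billion transistors, 10 CPU cores, 32 GPU cores, a 16-core Neural Engine, and up to 64 gigabytes of unified memory. A back-of-the-envelope sketch in Swift makes the arithmetic explicit; the ChipSpecs type and fused function below are illustrative stand-ins, not anything from Apple's toolchain:

    struct ChipSpecs {
        var transistors: Int64
        var cpuCores, gpuCores, neuralCores, memoryGB: Int
    }

    // Model the silicon bridge as a plain doubling of a single die's resources.
    func fused(_ die: ChipSpecs) -> ChipSpecs {
        ChipSpecs(transistors: die.transistors * 2,
                  cpuCores: die.cpuCores * 2,
                  gpuCores: die.gpuCores * 2,
                  neuralCores: die.neuralCores * 2,
                  memoryGB: die.memoryGB * 2)
    }

    // Apple's published top-end M1 Max figures.
    let m1Max = ChipSpecs(transistors: 57_000_000_000,
                          cpuCores: 10, gpuCores: 32,
                          neuralCores: 16, memoryGB: 64)

    let m1Ultra = fused(m1Max)
    print(m1Ultra.transistors)                 // 114000000000
    print(m1Ultra.cpuCores, m1Ultra.gpuCores)  // 20 64
    print(m1Ultra.memoryGB)                    // 128

Every output matches the M1 Ultra’s advertised numbers, which is just what you would expect from two M1 Max dies presented to software as one chip.
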
As it becomes more difficult to shrink transistors in size, and impractical to make individual chips much bigger, chipmakers are beginning to stitch components together to boost processing power. The Lego-like approach is a key way the computer industry aims to progress. And Apple’s M1 Ultra shows that new techniques can produce big leaps in performance.

“This technology showed up at just the right time,” says Tim Millet, vice president of hardware technologies at Apple. “In a sense, it is about Moore’s law,” he adds, referring to the decades-old axiom, named after Intel cofounder Gordon Moore, that the number of transistors on a chip, and with it the chip’s performance, doubles roughly every two years.

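Written out as a formula (the textbook formulation, not wording from the article), the law says a process that starts with N0 transistors per chip should reach roughly

    N(t) = N0 × 2^(t / 2)

transistors t years later. Twelve years means six doublings, a 64-fold increase, which is how budgets like the M1 Ultra’s 114 billion transistors compound out of decades of incremental shrinks.
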
It is no secret that Moore’s law, which has driven progress in the computer industry and the wider economy for decades, no longer holds true. Extremely complex and costly engineering tricks promise to shrink the components etched into silicon chips a little further, but engineers are reaching the physical limits of how small these features, already measured in billionths of a meter, can practically be. Yet even as Moore’s law fades, computer chips are more important, and more ubiquitous, than ever. Cutting-edge silicon is crucial to technologies such as AI and 5G, and the supply chain disruptions triggered by the pandemic have highlighted how vital semiconductors have become to industries such as automaking.

As each new generation of silicon takes a smaller step forward, a growing number of companies have turned to designing their own chips in search of performance gains. Apple has used custom silicon in its iPhones and iPads since 2010; in 2020 it announced that it would design its own chips for Macs and MacBooks as well, moving away from Intel’s processors. Apple leveraged its work on smartphone chips to develop the desktop ones, which use the same underlying architecture, licensed from the British company Arm. By crafting its own silicon, and by integrating functions that would normally be handled by separate chips into a single system-on-a-chip, Apple controls the entirety of a product and can tune software and hardware together. That level of control is key.

Read the Full Article »

About the Author:

Will Knight is a senior writer for WIRED, covering artificial intelligence. He was previously a senior editor at MIT Technology Review, where he wrote about fundamental advances in AI and China’s AI boom. Before that, he was an editor and writer at New Scientist. He studied anthropology and journalism in the UK before turning his attention to machines.