“We’re not prepared for the end of Moore’s Law: It has fueled prosperity of the last 50 years. But the end is now in sight.”
MIT Technology Review, February 24, 2020
By David Rotman
“Finding successors to today’s silicon chips will take years of research. If you’re worried about what will replace Moore’s Law, it’s time to panic.”
Gordon Moore’s 1965 forecast that the number of components on an integrated circuit would double every year until it reached an astonishing 65,000 by 1975 is the greatest technological prediction of the last half-century. When it proved correct in 1975, he revised what has become known as Moore’s Law to a doubling of transistors on a chip every two years.
Since then, his prediction has defined the trajectory of technology and, in many ways, of progress itself.
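The arithmetic behind the original forecast is easy to check. Below is a minimal sketch; the 1965 starting figure of 64 components is an assumption chosen here for illustration (it makes the round numbers work out), not a number taken from the article.

```python
# Back-of-the-envelope check of Moore's 1965 forecast:
# one doubling per year from 1965 to 1975.
start_year, end_year = 1965, 1975
components = 64  # assumed 1965 starting point, chosen for illustration

for year in range(start_year, end_year):
    components *= 2  # one doubling per year, per the original forecast

print(components)  # ten doublings: 64 * 2**10 = 65,536, i.e. roughly 65,000
```

Ten annual doublings multiply the starting count by 2¹⁰ = 1,024, which is how a few dozen components in 1965 extrapolates to the "astonishing 65,000" Moore projected for 1975.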
Moore’s argument was an economic one. Integrated circuits, with multiple transistors and other electronic devices interconnected with aluminum metal lines on a tiny square of silicon wafer, had been invented a few years earlier by Robert Noyce at Fairchild Semiconductor. Moore, the company’s R&D director, realized, as he wrote in 1965, that with these new integrated circuits, “the cost per component is nearly inversely proportional to the number of components.” It was a beautiful bargain—in theory, the more transistors you added, the cheaper each one got. Moore also saw that there was plenty of room for engineering advances to increase the number of transistors you could affordably and reliably put on a chip.
Soon these cheaper, more powerful chips would become what economists like to call a general purpose technology—one so fundamental that it spawns all sorts of other innovations and advances in multiple industries. A few years ago, leading economists credited the information technology made possible by integrated circuits with a third of US productivity growth since 1974. Almost every technology we care about, from smartphones to cheap laptops to GPS, is a direct reflection of Moore’s prediction. It has also fueled today’s breakthroughs in artificial intelligence and genetic medicine, by giving machine-learning techniques the ability to chew through massive amounts of data to find answers.
But how did a simple prediction, based on extrapolating from a graph of the number of transistors by year—a graph that at the time had only a few data points—come to define a half-century of progress? In part, at least, because the semiconductor industry decided it would.
In 2018, Erica Fuchs and her Carnegie Mellon colleagues Hassan Khan and David Hounshell wrote a paper tracing the history of Moore’s Law and identifying the changes behind today’s absence of the industry and government collaboration that fostered so much progress in earlier decades. They argued that “the splintering of the technology trajectories and the short-term private profitability of many of these new splinters” means we need to greatly boost public investment in finding the next great computer technologies.
Cramming more components onto integrated circuits
With unit cost falling as the number of components per circuit rises, by 1975 economics may dictate squeezing as many as 65,000 components on a single silicon chip.
Published in Electronics, Volume 38, Number 8, April 19, 1965.
By Gordon E. Moore
Director, Research and Development Laboratories, Fairchild Semiconductor division of Fairchild Camera and Instrument Corp.
Read the Full Article (PDF from intel.com) »
April 19, 1965: How Do You Like It? Moore, Moore, Moore
WIRED, April 19, 2010
By Dylan Tweney
1965: Gordon Moore publishes a pithy four-page analysis of the integrated-circuit business, in which he correctly predicts that chip complexity will regularly double for the foreseeable future.
Moore was, at the time, the chief of research and development for Fairchild Semiconductor, a seminal Silicon Valley startup. He later went on to co-found Intel. His prediction turned out to be basically correct, and it became a rallying cry for the emerging computer industry.
By 1970 people were referring to it as “Moore’s Law.” It quickly became shorthand for the inexorable upward march of computing capabilities – a kind of electronic Manifest Destiny.