What’s After Moore’s Law? The Economist Takes a Look Ahead
May 2, 2016

“‘There’s a law about Moore’s law,’ jokes Peter Lee, a vice-president at Microsoft Research: ‘The number of people predicting the death of Moore’s law doubles every two years.’”

In a recent Technology Quarterly article titled “After Moore’s Law,” The Economist explored what might come next for semiconductor manufacturing once this long-standing industry expectation finally runs out of steam. For those unfamiliar with Moore’s Law: in 1965, Gordon Moore, who would later co-found Intel, published a paper observing that the number of electronic components on an integrated circuit was doubling approximately every year.

Although the pace of doubling eventually slowed to about once every two years, this observation – later known as Moore’s Law – has largely held: Intel’s first microprocessor, released in 1971, contained 2,300 transistors; today’s advanced chips contain billions. However, as Lee’s joke suggests, maintaining this incredible rate of innovation is no small feat. Some device features are now so tiny that they are approaching a fundamental size limit – the atom – which is why people are once again predicting the end of Moore’s Law.
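The arithmetic behind those figures is easy to check. Here is a quick back-of-the-envelope sketch (our own illustration, not from the article) assuming an idealized, clean doubling every two years from the 2,300-transistor chip of 1971:

```python
# Back-of-the-envelope projection of Moore's law.
# Assumes an idealized doubling every two years, 1971 to 2016.
start_transistors = 2_300        # Intel's first microprocessor, 1971
years = 2016 - 1971              # span up to this article
doublings = years / 2            # one doubling per two years
projected = start_transistors * 2 ** doublings
print(f"projected transistors: {projected:.2e}")  # on the order of 10^10
```

The idealized projection lands in the low tens of billions, consistent with the article’s “billions of transistors” on today’s most advanced chips.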

The article provides an overview of chip electronics and of the alternative designs and materials, like FinFET and graphene, currently being used to continue scaling. It also explores how quantum computing and other novel technologies might prolong device scaling and performance gains (see table below). One thing we know lies ahead for the semiconductor industry: the ongoing need for innovation!

Read more from The Economist through the following links:

After Moore’s Law:  Double, double, toil and trouble – After a glorious 50 years, Moore’s law—which states that computer power doubles every two years at the same cost—is running out of steam. Tim Cross asks what might replace it

More Moore:  The incredible shrinking transistor – New sorts of transistors can eke out a few more iterations of Moore’s law, but they will get increasingly expensive

New designs:  Taking it to another dimension – How to get more out of existing transistors

Brain scan:  Bruno Michel – IBM’s head of advanced micro-integration reckons biology holds the key to more energy-efficient chips

Quantum computing:  Harnessing weirdness  – Quantum computers could offer a giant leap in speed—but only for certain applications

What comes next:  Horses for courses – The end of Moore’s law will make the computer industry a much more complicated place

The infographic is titled “Wait for it.”

A pipeline of new technologies to prolong Moore's magic: In 2015, Samsung, Intel and Microsoft alone shelled out $37 billion for R&D, and many companies are working on projects to replace the magic of Moore's law. Here are a few promising ideas.

Optical communication: The use of light instead of electricity to communicate between computers, and even within chips. This should cut energy use and boost performance. Hewlett-Packard, Massachusetts Institute of Technology

Better memory technologies: Building new kinds of fast, dense, cheap memory to ease one bottleneck in computer performance. Intel, Micron

Quantum-well transistors: The use of quantum phenomena to alter the behavior of electrical-charge carriers in a transistor to boost its performance, enabling extra iterations of Moore's law, increased speed and lower power consumption. Intel

Developing new chips and new software to automate the writing of code for machines built from clusters of specialized chips. This has proved especially difficult. Soft Machines

Approximate computing: Making computers' internal representation of numbers less precise to reduce the number of bits per calculation, and allowing computers to make small random mistakes that cancel each other out over time – both of which save energy. University of Washington, Microsoft
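To see why small random errors can wash out in aggregate, here is a toy sketch (our own illustration, not a model of these groups' actual techniques): each value is quantized to just 8 mantissa bits before summing, yet the total stays close to the exact answer because the individual rounding errors largely cancel.

```python
import math
import random

def quantize(x, bits):
    """Keep only `bits` bits of mantissa (a toy stand-in for
    reduced-precision hardware, not a real machine model)."""
    if x == 0.0:
        return 0.0
    m, e = math.frexp(x)          # x = m * 2**e, with 0.5 <= |m| < 1
    scale = 2 ** bits
    return math.ldexp(round(m * scale) / scale, e)

random.seed(1)
data = [random.uniform(0, 1) for _ in range(100_000)]
exact = sum(data)
approx = sum(quantize(v, 8) for v in data)   # only 8 mantissa bits per value
rel_err = abs(approx - exact) / exact
# Each value carries a relative error of up to about 2**-8, but the
# errors are roughly symmetric, so the sum's error is far smaller.
print(f"relative error of 8-bit sum: {rel_err:.2e}")
```

With fewer bits per value, each addition moves and stores less data, which is where the energy saving comes from; the cancellation of zero-mean rounding errors is what keeps the aggregate answer usable.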

Neuromorphic computing: Developing devices loosely modeled on the tangled, densely linked bundles of neurons that process information in animal brains. This may cut energy use and prove useful for pattern recognition and other AI-related tasks. IBM, Qualcomm

Carbon nanotube transistors: These rolled-up sheets of graphene promise the same low power consumption and high speed as graphene itself but, unlike graphene, can also be switched off easily. They have, however, proved difficult to mass-produce. IBM, Stanford University