It's the most famous law that you've never heard of. Or maybe you have, and you just didn't know its name. Either way, it matters to you: it's the thing that has governed the advance of the technological world for the last half century, giving you your flat-screen TV, your smartphone, your laptop. It's called Moore's law.
Moore's Law can be stated as 'the number of transistors on a microprocessor chip will double every two years or so.' This doubling is what has fuelled the aggressive advance of computing: more transistors on a chip means faster processors and more computations that your computer can execute in parallel. Simply stated, as the number of transistors on a chip grows, so do the speed and computational power of that chip.
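The doubling rule above can be turned into a back-of-the-envelope calculation. The sketch below is only illustrative (the function name and the two-year period are assumptions, and real chips deviate from the trend), using the widely cited figure of roughly 2,300 transistors on the Intel 4004 of 1971 as a starting point:

```python
def moores_law_estimate(start_count, start_year, target_year, doubling_period=2):
    """Project a transistor count forward, assuming one doubling
    every `doubling_period` years (the classic Moore's Law rule)."""
    doublings = (target_year - start_year) / doubling_period
    return start_count * 2 ** doublings

# Illustrative starting point: the Intel 4004 (1971), ~2,300 transistors.
# Fifty years is 25 doublings, so the count multiplies by 2**25.
estimate_2021 = moores_law_estimate(2300, 1971, 2021)
print(f"{estimate_2021:,.0f}")  # on the order of tens of billions
```

The striking part is how close this naive exponential lands to reality: flagship processors of the early 2020s really do carry tens of billions of transistors.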
But Moore's Law isn't an inevitable result of natural forces and processes. It's a self-fulfilling prophecy, perhaps the greatest self-fulfilling prophecy in history. It works because the microprocessor industry makes sure that it does, through a roadmap released every two years or so. The transistor industry was once like any other, governed solely by market forces, but the complexity of chip making forced the various stakeholders in the manufacturing process to come together in a concerted effort unlike any that global industry had ever seen before. The first meeting, chaired by an Intel executive, was held in 1991 and featured American engineers only. The effort would later come to be known as the International Technology Roadmap for Semiconductors, a truly worldwide undertaking with participants from Europe, Japan, Taiwan and South Korea.
The beautiful, elegant machine is grinding to a halt, however. Moore's law has faced two significant hurdles in recent years. The first is the unavoidable generation of heat that comes from packing more and more circuit components into a smaller and smaller space. This overheating problem is what led the industry to limit clock speeds in 2004. Even if Moore's law survives that, it will not beat its other opponent: quantum mechanics. To put it in the simplest possible way, as circuit features shrink below roughly 10 nm, the classical laws that govern things on the macroscopic level no longer apply. Welcome to the quirky world of Heisenberg and co., a universe where electrons can jump across an insulator, rendering every off switch useless (the cause is a phenomenon known as quantum tunnelling). Transistors built at such a scale would be hopelessly unpredictable and inefficient.
So, what solution is there, if any? Firstly, there is research being carried out into new materials that generate less heat, which would allow the industry to take the speed governor off. Materials such as 2D graphene (the building block of graphite) are being strongly considered. The second solution is to do away with the current paradigm of digital computers and switch the focus to neuromorphic computing, a model designed to mimic the arrangement of neurons inside the human brain. The third, and most interesting, is quantum computing, which harnesses the very quantum mechanics that's behind the whole problem. For certain, very specific types of problems, it would yield an exponential increase in efficiency. So, three solutions then: adapt a little (new materials), adapt a lot (neuromorphic computing) or embrace the enemy (quantum computing). All three hold promise, but most remain unrealised outside the laboratory. It promises to be a brave new world for those willing and able to seize the moment!
© The Student Engineer UoN All Rights Reserved