Moore’s Law turns 50

Guruswamy Ganesh | Updated on: Jan 23, 2018


And it continues to drive the innovation we take for granted every day — from smartphones to the internet to big data

On April 19, Moore's Law, one of the best-known phenomena in the semiconductor industry, turned 50. It is not wrong to call it a phenomenon: the law is endlessly quoted, and misquoted, to explain and predict the way forward for the chip industry.

It all started with Gordon E Moore's 1965 observation that the number of transistors in an integrated circuit was doubling roughly every year.

The law has gone through several iterations as the timeline for doubling the transistor count kept changing. From 1965 to 1975, the transistor count on a chip doubled every year. In 1975, the pace was revised to a doubling every two years, and in the late 90s the storage industry accelerated it again.
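The difference a doubling cadence makes compounds quickly. The sketch below projects transistor counts under the two cadences mentioned above; the starting point of roughly 2,300 transistors is the Intel 4004 of 1971, used here purely as an illustrative baseline.

```python
# Project transistor counts under Moore's Law-style doubling.
# Baseline for illustration: the Intel 4004 (1971), ~2,300 transistors.

def projected_transistors(start_count, start_year, target_year, doubling_years):
    """Compound doubling: the count doubles every `doubling_years` years."""
    elapsed = target_year - start_year
    return start_count * 2 ** (elapsed / doubling_years)

# Doubling every year (the 1965-1975 cadence) vs every two years (post-1975)
for period in (1, 2):
    count = projected_transistors(2_300, 1971, 1981, period)
    print(f"doubling every {period} yr -> ~{count:,.0f} transistors by 1981")
```

Over just one decade, the yearly cadence yields roughly 32 times as many transistors as the two-year cadence, which is why the exact doubling period matters so much to roadmaps.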

Since 2008, however, the pace has slowed, prompting doubts about whether the law can remain relevant in the future.

Unusual factors

What makes Moore's Law work is an unusual combination of factors. Shrinking transistors improves performance by reducing the distances signals must travel. More small, high-performing chips can be cut from a wafer of fixed size than their larger, slower predecessors, which means the same fixed capital cost generates more revenue.

Moore’s Law drives the innovation we take for granted every day — from smartphones to the internet to big data. Even after 50 years, it is amazing how this law continues to ring true for CPU and memory performance, allowing them to continue to scale to meet the needs of today’s data-hungry environments. It has been used by the industry to govern roadmaps for form factors and performance for years. People at large enjoy the benefits and convenience of devices shrinking with better performance year after year, without realising the significance of Moore’s Law in action in the background.

Semiconductors are about as cool to the outside world as the vacuum tubes they replaced. If not for guidance such as this, an iPhone would be the size of a steamer trunk, cellular relay stations would rival the Qutab Minar in size, and a Google data centre would consume as much energy as Mumbai city. That is the beauty of the concept: the epitome of simplicity and complexity coming together.

Simply complex

The biggest and most underappreciated achievement of Moore's Law is that it institutionalised optimism in the industry. It brought predictability to the world of technology, showing that despite the high costs and numerous challenges, technology would invariably be a better, safer investment than virtually anything else.

The fact remains that investing billions in new fabs, or in energy-efficient data centres, entails risk, but far less than drilling for oil in the Arctic. The law also fostered healthy competition that moved the semiconductor industry forward: if any player slowed down, another would take its place.

Let's take a look at Moore's Law in the context of the storage industry. Over the years, an exponential gap has grown between the advancements of CPU and memory and what traditional storage can handle.

While CPU, memory and networking have all continued to follow Moore's Law, doubling performance about every two years, hard drive performance has lagged: Moore's Law has helped hard drive density but not performance. In fact, in some cases performance has actually declined as capacities grew. The first 15K RPM hard drive was introduced more than 10 years ago, yet we have still to see 30K or even 20K RPM drives.

The reason is the mechanical limitations that prevent hard drives from achieving faster rotational speeds. Information on a hard drive is read and written by a combination of mechanical parts whose movements take time and therefore introduce delays.
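A back-of-envelope model shows why faster spindles give diminishing returns. On average the platter must spin half a revolution before the data arrives under the head, and each random I/O also pays a seek. The 3.5ms average seek time below is an assumed, illustrative figure, not a measured spec.

```python
# Rough model of why hard drive random I/O is capped by mechanics.

def avg_rotational_latency_ms(rpm):
    """On average the platter spins half a revolution to reach the data."""
    ms_per_revolution = 60_000 / rpm
    return ms_per_revolution / 2

def max_random_iops(rpm, avg_seek_ms):
    """Each random I/O pays one average seek plus the rotational latency."""
    service_time_ms = avg_seek_ms + avg_rotational_latency_ms(rpm)
    return 1000 / service_time_ms

# Compare today's drives with the hypothetical 30K RPM drive that never shipped
for rpm in (7_200, 15_000, 30_000):
    print(f"{rpm:>6} RPM -> ~{max_random_iops(rpm, avg_seek_ms=3.5):.0f} random IOPS")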

All about storage

There are several strategies deployed to counter mechanical delays. Server and storage vendors invest heavily in controllers that use processors, DRAM and other techniques to work around the hard drive bottleneck, but those can only help so much. Over time, this difference in performance creates major inefficiencies, requiring re-architecting or rewriting applications for balanced system utilisation. But with flash, the fix is easier, more reliable and more cost-efficient!

Solid-state drives (SSDs) have brought storage back in line with Moore's Law. Just four short years ago, the average SSD achieved around 250MB/s throughput and capacity topped out at about 512GB. Today, enterprise-grade PCIe application accelerators such as those from SanDisk achieve 2.7GB/s data transfer and offer up to 6.4TB of capacity in a form factor that fits into the palm of your hand.

Though the initial cost of implementing SSDs is higher than that of HDDs, they offer cost savings in the long run for businesses through lower energy usage and greater productivity from higher input/output operations per second (IOPS). One SSD delivers the performance of 100 hard drives. And because SSDs need very little power to operate, the systems they sit in generate significantly less heat.
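The "one SSD equals 100 hard drives" claim is straightforward arithmetic once you compare IOPS. The figures below are assumptions chosen for illustration, not vendor specifications: roughly 180 random IOPS and 8W for a 15K RPM hard drive, and 18,000 IOPS and 6W for an enterprise SSD.

```python
# Illustrative back-of-envelope: how many hard drives match one SSD on IOPS?
# All figures are assumptions for illustration, not vendor specs.
hdd_iops, hdd_watts = 180, 8        # assumed 15K RPM enterprise hard drive
ssd_iops, ssd_watts = 18_000, 6     # assumed enterprise SSD

drives_needed = -(-ssd_iops // hdd_iops)   # ceiling division
hdd_array_watts = drives_needed * hdd_watts

print(f"HDDs needed to match one SSD on IOPS: {drives_needed}")
print(f"Power for that HDD array: {hdd_array_watts} W vs {ssd_watts} W for the SSD")
```

Under these assumed numbers the hard drive array draws over a hundred times the power of the single SSD, which is where the energy and heat savings in the paragraph above come from.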

So, what does the future look like? Researchers have said that we may hit a fundamental physical wall on semiconductor shrinkage by 2021. However, 3D memory will continue the virtuous cycle. Instead of squeezing transistors closer together, you stack them up and maintain, if not increase, the economic and technical benefits. SanDisk's 3D BiCS memory contains 48 layers, a high-water mark for the industry. For consumers and large businesses alike, smart devices and data centre equipment will become more intelligent because far more memory can be squeezed into a smaller footprint.

The writer is the vice-president of SanDisk India

Published on April 29, 2015