Big Blue bigwig: Tiny processor knobs can't shrink forever

You cannae break the laws of physics - and 7nm is the limit

HPC blog While at IBM’s Smarter Computing Summit last week, I had the great pleasure of hearing Big Blue's Bernie Meyerson talk about the limits of today’s tech and their implications.

Bernie is IBM’s VP of Innovation and one of those rare technologist-scientist types who can explain highly technical concepts clearly and directly, in a way that a reasonably intelligent grey squirrel can understand (and me too).

Even better, he’s highly entertaining and doesn’t hedge when it comes to stating what’s what in the world. Back in 2003 he predicted that Intel would never deliver on its promises of 4 to 5GHz CPUs and would, in fact, be forced to shift to multi-core processors.

Meyerson backed up his brash prediction (it was plenty brash back then) by sharing electron microscope images of individual atoms that showed they’re kind of lumpy. The problem with lumpy atoms is that when you use only a handful of them to build gates, they leak current like a sieve. When asked about this, Intel denied over and over that there was a problem – right up to the point when it announced it was scrapping its entire product strategy in favour of a multi-core approach.

So when Meyerson talks, I pay attention. And Meyerson is talking again.

In his presentation at the Pinehurst golf resort in North Carolina, he was again playing on the theme that we can’t shrink our way to higher performance any more. In fact, when it comes to chips, we have only a generation or two left before we reach the end of the line.

So where’s the end of the line? According to Bernie: 7 to 9 nanometers. When the features on a chip get down to this minute size, you start to see “very nasty” quantum mechanical effects that impair the performance of the processor's decision-making gates.

The problems at 7nm are profound to the point where there isn’t really any way around them – it’s just too damned small – and there isn’t a way to scale down an atom. It’s a fundamental limit, and it’s finally in sight. Chips in mass production these days have a 32nm or 22nm feature size, and 14nm is not far down the line.
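
To put that “generation or two” in perspective: each new process node has historically shrunk linear feature sizes by a factor of roughly 0.7, halving the area per transistor. A quick extrapolation in Python – my own back-of-the-envelope sketch, not Meyerson's figures – shows how fast the staircase from 14nm runs into his danger zone:

    # Each process node historically shrinks linear features by ~0.7x
    # (halving transistor area). Extrapolating from 14nm; rough figures.
    node = 14.0
    generation = 0
    while node > 7.0:
        generation += 1
        node *= 0.7
        print(f"generation +{generation}: ~{node:.0f}nm")
    # generation +1: ~10nm
    # generation +2: ~7nm  <- inside the 7-9nm danger zone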

Unfortunately, I can’t toss around the correct scientific terms to pretend I know what I’m talking about here. I have only my own deplorable notes for reference, and Meyerson’s time slot forced him to move pretty quickly through his material. But he was probably talking about quantum tunneling, a phenomenon in which particles (such as electrons) pass through barriers – like those in very thin semiconductor gates – that they shouldn’t be able to cross. The result is a relatively large amount of current leaking from these tiny switches, which ramps up the device's power consumption.
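
For the terminally curious, here's a back-of-the-envelope illustration – mine, not anything Meyerson presented – of why thinner barriers leak so much more. In the textbook rectangular-barrier approximation, the tunneling probability falls off as roughly exp(−2κd), where d is the barrier width, so thinning the barrier grows the leakage exponentially:

    import math

    # Textbook estimate of quantum tunneling through a rectangular barrier:
    # T ~ exp(-2 * kappa * d), kappa = sqrt(2 * m * (V - E)) / hbar.
    # The barrier height and electron energy below are illustrative guesses.
    HBAR = 1.054571817e-34   # reduced Planck constant, J*s
    M_E = 9.1093837015e-31   # electron mass, kg
    EV = 1.602176634e-19     # one electronvolt, in joules

    def tunnel_probability(barrier_ev, energy_ev, width_nm):
        kappa = math.sqrt(2 * M_E * (barrier_ev - energy_ev) * EV) / HBAR
        return math.exp(-2 * kappa * width_nm * 1e-9)

    # A 3eV barrier and a 1eV electron, at shrinking barrier widths:
    for width_nm in (3.0, 2.0, 1.0, 0.5):
        print(f"{width_nm:.1f}nm barrier -> T = "
              f"{tunnel_probability(3.0, 1.0, width_nm):.1e}")

The exact numbers don't matter much; the point is the exponent. Halving the barrier width doesn't double the leakage – it blows it up by orders of magnitude.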

Meyerson also talked about the limitations facing us on the storage side. Like most great stories (and many great ideas too), it starts in a bar. In this case, it was Bernie in a bar with a bunch of other smart guys, probably knocking back drinks that aren’t accessorised with little umbrellas. As barroom conversations tend to do, this one eventually turned to magnetic storage density. More specifically: how many atoms would you need to reliably store a single bit of data?

This prompted some non-barroom research and scientific activity. The resulting answer? Twelve. It takes twelve atoms to reliably store a bit of data. Any fewer and you lose stability, meaning that parts of the data might disappear, or morph into something you didn’t store. This is related to the same quantum effects discussed above, and is ultimately a consequence of the fact that we can’t scale atoms down to a handier size.

From what Meyerson said, it sounds like we have a bit more room before we start to run up against the limit on storage density. If my notes are correct, we won’t approach the twelve-atom limit until densities rise to around 100 times what we have today. Right now, 1TB per platter is the highest density available. Theoretically, we may be able to get to 100TB per platter and 300TB per drive at maximum density.

So how long do we have until we hit the limit? It depends on how fast density grows. Historically, we’ve seen density grow anywhere between 20 per cent and 100 per cent per year. Lately (the last decade or so), growth has ranged between 20 per cent and 40 per cent annually, meaning that we might hit the twelve-atom limit in as few as 13 years or as many as 25.
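
The arithmetic behind that window is straightforward compound growth: at a steady annual rate r, reaching 100 times today's density takes log(100)/log(1+r) years. A quick sanity check using the figures above:

    import math

    # Years to grow areal density 100x at a steady annual rate r:
    # (1 + r)^n = 100  =>  n = log(100) / log(1 + r)
    for rate in (0.20, 0.30, 0.40):
        years = math.log(100) / math.log(1 + rate)
        print(f"{rate:.0%} per year -> ~{years:.0f} years")
    # 20% -> ~25 years; 40% -> ~14 years: roughly the 13-to-25-year window.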

That’s an eternity in the tech business – maybe even long enough for someone to figure out how to shrink atoms down to a more convenient size. ®
