Get ready to buy chips by the kilo

So begins the era of 'Unbound Compute'

Industry Comment We’ve spent 20 years assuming that we add memory and disk in large numbers and CPUs in small numbers. What if all three scaled the same way? Now, that would be a game-changing innovation, one that would spawn a new age for business applications and raise the bar on IT productivity and business efficiency.

Remember way back when PCs had a grand sum of 64 kilobytes of memory? These days, we count the memory in small laptops in hundreds of megabytes and the memory in big servers in fractions of terabytes. The same thing happened to disk space: megabytes to petabytes. What’s next? Exa, zetta, and yotta.
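That march of prefixes is just a ladder of powers of 1000. As a rough illustration (the helper below is hypothetical, not anything from Azul), here it is in a few lines of Python:

```python
# Decimal SI prefixes as powers of 1000. (The binary variants --
# kibi, mebi, gibi, ... -- use powers of 1024 instead.)
PREFIXES = ["kilo", "mega", "giga", "tera", "peta", "exa", "zetta", "yotta"]

def bytes_for(prefix):
    """Bytes in one <prefix>byte, e.g. bytes_for('tera') == 10**12."""
    return 1000 ** (PREFIXES.index(prefix) + 1)

print(bytes_for("kilo"))                       # 1000
print(bytes_for("peta") // bytes_for("mega"))  # a petabyte is a billion megabytes
```

Each step up the ladder is a thousandfold jump, which is exactly the kind of jump the article argues CPU counts have never made.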

But when it comes to CPUs, we still mostly dabble in single digits. An 8-way server feels like a pretty large system. The 32-way, 64-way, and 200-way systems feel positively huge. Even when we scale out, anything beyond a couple of hundred CPUs begins to challenge our ability to manage and operate the systems. It’s no accident that such systems are called a “complex.”

A major shift is coming. Over the next few years, your ordinary applications will be able to tap into systems with, say, 7,000 CPUs, 50 terabytes of memory, and 20 petabytes of storage. In 2005, Azul Systems will ship compute pools with as many as 1,200 CPUs in a single standard rack (1.2 kilo cores! I like the sound of that!).

What would change about application design if you could do this? Think back to what applications were like when your PC had just 128K of memory and a 512KB hard drive. The gap between the capabilities and flexibility of applications then and now is the scale of improvement we are talking about.

Photo of Shahin Khan, CMO at Azul Systems

If you could count CPUs the same way that you count memory, some problems would simply become uninteresting and others would transform in a qualitative way. And completely new possibilities would emerge.

Deployment and administration of applications would also change dramatically. Do you ever worry about how much storage an individual user might need? Probably not. You just install a NAS device with a terabyte of storage and let everyone share it. This approach works because no single user is likely to fill it up quickly, and you can plan storage capacity across all your users rather than for each individual one. Do you ever worry about the utilization level of an individual byte of memory? I hope not. You have so many bytes that you measure utilization at the aggregate level.

If you had hundreds of CPUs in a miniaturized “big-iron” system available to your applications, you could adopt the same strategy for compute. No need to plan capacity for each individual application: let all of your users share a huge compute pool and plan capacity across many applications. In the process, you also fundamentally change the economics of computing. That’s exactly what Azul Systems is pioneering.
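The economic argument here can be sketched in a few lines: sizing each application for its own peak costs more than sizing one shared pool for the aggregate peak, because peaks rarely coincide. The demand figures below are invented purely for illustration, not Azul data:

```python
# Pooled vs per-application capacity planning, on made-up demand traces.

def per_app_provisioning(demand_traces):
    """CPUs needed if every app gets an allocation sized for its own peak."""
    return sum(max(trace) for trace in demand_traces)

def pooled_provisioning(demand_traces):
    """CPUs needed if all apps share one pool sized for the aggregate peak."""
    return max(sum(step) for step in zip(*demand_traces))

# Hourly CPU demand for three hypothetical apps, each peaking at a different time.
traces = [
    [10, 80, 20, 10],   # app A: morning spike
    [15, 20, 90, 25],   # app B: midday spike
    [70, 10, 15, 20],   # app C: overnight batch
]

print(per_app_provisioning(traces))  # 240 CPUs if each app is sized alone
print(pooled_provisioning(traces))   # 125 CPUs if they share one pool
```

The bigger and more varied the mix of applications, the wider that gap tends to get, which is the same statistical-sharing effect that already makes shared NAS storage work.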

This is a whole new way of looking at the CPU, and therefore at the function of “compute,” and it is gaining mainstream acceptance. The industry has reached two or four CPUs on a chip for large symmetric multiprocessing (SMP) systems, and tens of functional units per CPU for systems limited to one chip. Some companies have announced future chips with as many as eight CPUs on a single chip. With 24 CPUs on a chip that can be used in an SMP system, Azul has already set the bar much higher. And that’s just the beginning!

Get ready for an era when you can order CPUs by the thousands, and get ready for the new language of that era. Do we say 2.5 kilo CPUs? Do we call this kilo-core or mega-core processing? And since it goes way past current multi-core technology, do we call it poly-core technology?

Here is a possible headline in 2005:

Poly-core Technology to Enable Kilo Core Processing. Happy Apps Hail Freedom!!

Happy 2005! ®

Azul Systems has created one of the most radical processor designs to date. Its Vega processor sits at the heart of a Java-crunching server due out in the first half of this year. More information on the company's upcoming products can be found here.


