Intel: 'All your clouds are us inside'

Xeon and Atom in the eye of the storm

Intel is getting used to being the big chip on the data center campus, and it is not about to let upstart vendors peddling other chips (that means you, Advanced Micro Devices) or architectures (that means you, ARM Holdings and friends) move in on its server turf. Not without a serious fight, at least, and certainly not in the cloudy infrastructure portion of the server racket that is exploding.

Cloud computing may not be a completely new way of moving bits and bytes around to do work, but just the same, incumbents can get pushed down to the far end of the data center feed trough, as has happened in prior data processing and information technology transitions. That goes as much for those who supply systems, storage arrays, and operating systems as it does for those who make their sub-components, like processors and chipsets.

Intel has been vague about exactly what its cloud strategy is, and understandably so. You have to make a pretty big leap from Xeon and Atom processors and their chipsets to a cloud. There are layers of hardware and software and many partners who turn chips into clouds, and those partners are the name brands on the cloud fronts where the IT weather is happening and changing.

Still, Intel can't afford to just let server makers and cloudy software tool providers do their thing and hope for the best. It has to nudge, encourage, cajole, listen, and react to a wide variety of partners and competitors and do all that it can to make sure that no matter what happens, its chips are at the heart of the clouds. Or perhaps the calm in the eye of the storm. Pick your weather metaphor and amuse yourself.

That job falls largely to Jason Waxman, general manager of high density computing in Intel's Data Center Group. The cloudy parts of Intel are spread out across the company's several Hillsboro, Oregon, campuses, which is also where Intel does a substantial amount of its server design research, custom server manufacturing, chip research, and wafer baking at both research and production scales.

Last week, Intel hosted an event called "A Day in the Clouds" for members of the press, and El Reg sat in on briefings with Intel's top brass in the cloud organization and did a tour of the labs behind its Cloud Builders program, which puffs up reference architectures of hardware and software for a multitude of cloud computing scenarios. This was the first time that Intel put some numbers on the cloud phenomenon and articulated its role in helping IT customers transform their brittle machinery and software into something a little more manageable - and a lot more like the dream of virtualized, utility computing that has been in development since the shortcomings of cheap, distributed computing became apparent in the wake of the dot-com bust.

With x64-based machines accounting for more than 97.4 per cent of the 2.38 million server shipments in the most recent quarter (according to Gartner), and with Intel having around a 93 per cent share of those shipments - giving the company somewhere north of 90 per cent of overall server shipments - you might think that Intel would rest on its laurels. But the company's core philosophy, as espoused by founder and former chairman Andy Grove, is that only the paranoid survive. And despite what the company says publicly about how no one is all that interested in super-low-power servers or takes the possibility of ARM-based servers all that seriously, you can bet that there are plenty of people inside Intel who are paranoid about these and other possibilities, and that they are working feverishly to make sure the future that Intel wants for the cloud is the one we all move toward.
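
For the back-of-the-envelope crowd, here is how those two shares multiply out - a quick sketch that takes the Gartner shipment count and the rough 93 per cent figure quoted above at face value:

# Rough server share arithmetic, using the figures quoted above
total_shipments = 2_380_000              # servers shipped in the quarter, per Gartner
x64_share = 0.974                        # portion of those shipments that are x64
intel_share_of_x64 = 0.93                # Intel's approximate share of the x64 boxes

intel_boxes = total_shipments * x64_share * intel_share_of_x64
print(f"{intel_boxes:,.0f} Intel-based servers")                # about 2.16 million boxes
print(f"{intel_boxes / total_shipments:.1%} of all shipments")  # roughly 90.6 per cent - "north of 90"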

Intel started up its Cloud Builders program in the fall of 2009, and with the launch of the Open Data Center Alliance last October, the company ramped up the effort and put some resources behind it: helping various cloudy tool providers build and test clouds using a mix of their wares, and putting together those boring old reference architectures that don't make for hot news but can save IT shops a lot of grief when they go to build their own private clouds and start integrating them with public clouds.

Other Intel execs talked about the Cloud Builders program in detail, and El Reg will get into that separately. Waxman gave a higher-level view, first talking about the growth opportunity that Intel was very keen on not missing. This is now known as Intel's Cloud 2015 Vision, something it touched on a tiny bit back when the ODCA was launched last October.

Waxman said that Intel has been working with cloud software and service providers for the past four years to come up with this vision thing, and threw around some statistics, as general managers are apt to do. He said that the Intertubes would be adding 1 billion more "netizens" by 2015, and that four years from now, there would be more than 15 billion devices linked to the Internet - four times what we have today. Intel has also extrapolated some data from networking giant Cisco Systems and believes that in 2015, there will be over 1 zettabyte of data moving over the Internet.

That's 1 million petabytes or 1 billion terabytes, depending on how you want to think of it. The data growth is being driven by ever richer and more human data formats. In 2010, said Waxman, more data moved over the Intertubes than had moved cumulatively over the network from its inception through the end of 2009.
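
For anyone who wants to check the units, the conversion is straightforward - a quick sketch assuming the decimal prefixes that storage marketeers prefer:

# 1 zettabyte expressed in smaller decimal units
zettabyte = 10**21         # bytes
petabyte = 10**15          # bytes
terabyte = 10**12          # bytes

print(zettabyte // petabyte)   # 1,000,000 petabytes
print(zettabyte // terabyte)   # 1,000,000,000 terabytes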

"This is a tremendous amount of growth," Waxman said, and added that cloud service revenues are expected to grow at more than a 20 per cent compound annual growth rate between 2009 and 2014. (Those are Gartner figures.) No one said anything about anyone making money on all this traffic - look at how hard it is for Rackspace Hosting and Terremark to make a buck - but all that network traffic will probably make Intel some dough.

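To put that rate in context, here is what a 20 per cent compound annual growth rate works out to over the period Gartner is talking about - a sketch that assumes five full years of compounding from a 2009 base:

# What a 20 per cent CAGR from 2009 to 2014 implies
cagr = 0.20
years = 2014 - 2009                # five years of compounding from the 2009 base
multiple = (1 + cagr) ** years
print(round(multiple, 2))          # ~2.49, so revenues roughly two and a half times the 2009 level
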
Intel in the Clouds

All your clouds are Intel Inside

It looks like system administrators will also be keeping their jobs unless cloudy infrastructure management tools make a quantum leap. Waxman pulled out a statistic from Bain & Company (the consulting firm, not the similarly named private equity outfit Bain Capital), which estimates that between now and 2015, IT organizations worldwide will spend $2 trillion on server, storage, and network deployment unless virtualization of these components improves and the tools to manage them scale and get easier to use. Just some modest improvements, says Waxman, can result in about $25 billion in reduced annual IT spending by 2015.
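
As a rough sanity check on the scale of those savings - assuming, and this is this hack's assumption rather than Bain's, that the $2 trillion is spread more or less evenly over the four years from now to 2015:

# Rough scale of the projected savings against the projected spend
total_spend = 2_000_000_000_000       # $2 trillion between now and 2015, per Bain
years = 4                             # assumed: roughly 2011 through 2015
annual_spend = total_spend / years    # about $500bn a year
annual_savings = 25_000_000_000       # about $25bn a year by 2015, per Waxman

print(f"${annual_spend / 1e9:.0f}bn spent per year")        # ~$500bn
print(f"{annual_savings / annual_spend:.0%} shaved off")    # ~5 per cent of the annual tab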

Intel's own prognosticators have sat down and looked at how the cloudy infrastructure market (as distinct from general purpose computing) will play out. Waxman said that Intel estimates that, all things remaining the same, storage capacity attached to cloudy infrastructure will grow by a factor of 16 between now and 2015, while networking capacity will have to grow by a factor of eight to keep up with fatter data and more users banging away on clouds to get at that data. On the present course and speed, raw computing capacity will see the largest growth - a factor of 20, according to the Intel marketing wizards.
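
Annualize those multipliers and you can see why Intel is not fretting about demand - a quick sketch that assumes "now" means 2011, so four years of compounding to 2015:

# Implied compound annual growth behind Intel's 2015 capacity projections
factors = {"storage": 16, "network": 8, "compute": 20}
years = 2015 - 2011                           # assumes "now" is 2011
for name, factor in factors.items():
    annual = factor ** (1 / years) - 1
    print(f"{name}: {annual:.0%} per year")   # storage ~100%, network ~68%, compute ~111%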

It's enough to make an Intel shareholder giddy, and maybe even make an AMD shareholder hold out for some hope.
