IBM turns back on server history

To give and to hybrid

As odd as this may seem, IBM is not thinking about servers any more. Well, not in the way you might think.

According to the top minds at Big Blue's Systems and Technology Group - which designs and sells its servers, processors, and storage - the future is not about making particular server architectures do jobs and fight for market share against other servers. That's so 1980s and 1990s. The future, it seems, will be about creating hybrid systems composed of different architectures, mixing multiple compute, storage, and networking elements together and tuning them for very specific jobs. Sometimes, I would guess, on the same processor complex, sometimes in loosely coupled systems.

IT vendors always have a theme, because they have to tell budget-time stories to CEOs and presidents who don't necessarily know a lot about computers other than how expensive and cranky they can be. (Actually, that's the system administrators, but that is a different story...)

For a long time, IBM banged on the On Demand drum, with a pretty tight focus on making IT more flexible, more like a utility in that you turn it on and off as you need it and only pay for what you use. These days, IBM is aiming at a much broader market than On Demand with something called Dynamic Infrastructure. Having pretty much worn out the On Demand song and dance, and wanting to get its fingers into a whole lot more pies, IBM pitches Dynamic Infrastructure as instrumenting and automating all kinds of infrastructure, not just computing.

As a consequence, Tom Bradicich, vice president of technology for the x64 server business within Systems and Technology Group at IBM, has to lead in with talk about how 2 billion people will be connected to the Internet by 2011, with trillions of objects - cars, trucks, tractors, roads, bridges, pipelines, and electric grids as well as people and their myriad devices - all linked in too. "The world might be getting smaller and flatter, but we believe that it has to get smarter, too," says Bradicich.

And it ain't exactly stupid right now for IBM to position itself to get a bigger piece of the government action, as it has most certainly been doing in the lead-up to and the passage of the Obama administration's stimulus plan. But Dynamic Infrastructure is about more than IBM getting into more government budgets to help meter and automate various kinds of physical infrastructure with computing and networking technology. It is about wringing efficiencies out of all of these different infrastructure systems to cut down on waste, because it is increasingly clear that we can't afford - either environmentally or economically - to waste anything any longer.

So what does this have to do with x64 servers? On the surface, not much. Not until you look at the scale of computing that IBM thinks is necessary to create this smart infrastructure world. Bradicich, who spearheaded the design of IBM's EXA family of chipsets for its high-end x86 and x64 servers for many years, says that commodity x86 and x64 systems have increased their performance by a factor of two every two years or so and that over the course of the next ten years, the normal way of doing things - shrinking chips, cranking clocks, adding cores, and adding features that used to be out on motherboards - will deliver a 30 times improvement in commodity system performance. That sounds like a lot, but apparently it isn't.
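If you want to check the math, it's simple compounding: a doubling every two years works out to five doublings in a decade, or about a 32X gain, which is where that roughly 30X figure comes from. Here's a quick back-of-the-envelope sketch in Python, using only the rough figures quoted above:

# Back-of-the-envelope compounding of the figures quoted above:
# performance doubles every two years, stacked over a decade.
years = 10
doubling_period = 2  # years per doubling, the rough cadence Bradicich cites

improvement = 2 ** (years / doubling_period)
print(f"Compounded improvement over {years} years: about {improvement:.0f}X")
# Prints: Compounded improvement over 10 years: about 32X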

"We believe we will need a 100 to 1,000 times improvement in performance to solve problems, such as doing a full body CT scan in real-time, or fast rendering of movies, or modeling traffic patterns on a city scale, just to name a few," says Bradicich.

"And that means the server of the future is not a machine with just faster memory or better packaging. Integrating switches and other features of the network will not get us beyond the 2X performance improvement per year. I mean, it is possible to play Handel's Messiah with 100 accordions or 100 trumpets, but to really get the full effect, you need an orchestra. In our experience, it has never been wise to say that one size fits all."
