IBM Workload Optimized: Here comes the hype wave

Can enterprise analytics really be interesting?

IBM pushed out some more of its “Workload Optimized” offerings last week, with the introduction of analytic packages based on its mainframe and x86 systems.

These bundles join the previously announced Power system bundle that it rolled out late last spring. What it's doing here is combining IBM hardware with a full slate of Cognos and InfoSphere Warehouse software into a pre-integrated bundle that will, presumably, allow customers to cut deployment time and get cracking on some serious data crunching.

Looking at the bigger picture, I think a lot of action is going to be centered on the enterprise analytics space in the coming years. It’s not technology pushing it; it’s macroeconomics. It’s just getting harder and harder for businesses to profitably compete. Globalization and the instant communications afforded by the web have made the world a much smaller place, although I’d still hate to have to paint it.

While organizations have been collecting and archiving information for decades now, there aren’t that many who fully analyze it and then use the insights to shape their strategy and tactics. Fewer still combine their data with data from outside the organization to make longer-range predictions. I think this is going to change in the short term.

Competitive pressure plus the ability to gather and analyze data at a reasonable cost will drive companies to see if they can find some golden nuggets of insight at the bottom of a raging data river. (Now that was a truly terrible metaphor; I’ll take it out later.)

We’re not the only folks to have spotted this trend, although I do everything I can to convey that impression. We’re seeing various levels of engagement in the big systems and services vendors, which makes sense; they have the resources and capabilities to put together solutions. But it’s going to take more than just throwing hardware and software in a box to be successful.

Vendors are going to have to start at the business side of the organization and work their way back to the data center. They need to demonstrate in-depth knowledge of their clients’ industries and be able to convince them that predictive analytics will more than pay its own way.

On the implementation side, experience in HPC will definitely help, since the techniques used in business analytics are very close to those used in research and science from a computational perspective. By the way, any data-crunching, statistics-loving researchers reading this might want to renovate their resumes in preparation for making a jump to the corporate world – there will be good opportunities for mathletes who can pull actionable insights out of piles of seemingly unrelated data.

On the vendor side, IBM has been the most public about this, and seems to be ahead of the pack. It has been quietly building a predictive analytics story for years now, investing billions to buy Cognos, SPSS, and a host of smaller companies – plus putting together a 4,000-person Global Services unit dedicated to implementing predictive analytics for its clients. The other major vendors see the same opportunity and are pursuing it in their own ways, but have yet to plant as big a stake in the ground. There are conflicting messages, of course, which happens in every large organization.

For example, at HP’s recent analyst conference, we found that a lot of its executive thinking is roughly aligned with our own; but when we received a quickie briefing from the folks assigned to push this stuff, we discovered that they were mostly talking typical BI. We think the addition of EDS has broadened HP’s horizons quite a bit, and perhaps that news has yet to reach the people who briefed us. Oracle’s purchase of Sun has also given it a unique opportunity to engage.

Oracle, of course, knows databases and data manipulation. It also has a lot of deep vertical industry knowledge, and it already occupies a key role in most large datacenters. Combining this with hardware platforms and HPC expertise from Sun would seem to give Oracle a solid position on the starting grid. While it could be argued that Exadata2 is a big throwdown from the big O, we think Oracle needs to talk more about how it can help companies use data to make real-time decisions and predict the future.

There are other players that will be factors in the market as well. This seems like a tailor-made opportunity for Teradata*, which keeps plugging along – still not breaking through the $1bn barrier, but comfortably profitable. SAS will also benefit from the trend, as its products can take customers from basic BI all the way to advanced ‘information-led enterprise’-hood. We’re even seeing some activity from HPC stalwarts like Cray, which recently introduced a small, four-socket server aimed at the enterprise market.

Add into the mix plenty of small software startups developing new approaches to traditional data crunching, and we’ll be seeing the humble beginnings of a genuine hype wave. ®

* Correction: Officials from Teradata graciously informed me that the company passed the $1bn revenue hurdle years ago. I don’t know what the hell I was looking at, but I’m obviously wrong about their revenue. Teradata’s revenue has grown at a modest 4% rate over the past four years, moving to $1.7 billion in ’09 vs. $1.46 billion in 2005. Profits have grown at a slightly faster 4.45% rate during the same period.
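
For the number-minded: that “modest 4%” works out as a compound annual growth rate. Here’s a quick back-of-the-envelope sketch in Python, using only the revenue figures quoted above (the underlying profit numbers aren’t given here, so only revenue is checked):

    # Sanity check on the revenue growth quoted in the correction above.
    # Compound annual growth rate (CAGR) from $1.46bn (2005) to $1.70bn (2009).
    start, end, years = 1.46, 1.70, 4

    cagr = (end / start) ** (1 / years) - 1
    print(f"Revenue CAGR 2005-2009: {cagr:.2%}")  # roughly 3.88%, i.e. the "modest 4%" rate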
