HPC 2.0: The Monster Mash-up

Big Data. Oh yes

Blog, Pt 1 IBM recently invited a handful of really smart HPC-centric industry analysts (and me too, for no apparent reason) to spend the day talking about where the market is going and how IBM intends to address it.

It was truly a conversation, rather than the typical vendor PowerPoint-palooza where they simply run through every product slide deck they can get their hands on.

One of the major threads running through the various presentations and conversations is the convergence of “traditional” HPC/supercomputing, analytics, and mainstream computing. If you’re reading the industry press, you’ll see this trend referred to as “Big Data”, “Business Intelligence”, or “Predictive Analytics”.

These terms are bandied about as if they’re interchangeable: they aren’t, and they don’t all mean the same thing. I’m not innocent of sowing name confusion myself; I’ve been using the term “HPC 2.0” to describe the increasing use of HPC-like methods and infrastructure in non-HPC organizations.

Whatever you call it, it’s happening, and it will impose ever increasing demands on the traditional business data center. The amount of data that organizations will gather and try to analyze is mind-boggling.

In addition to a greater flow of data generated organically by the organization, the best companies are casting their nets wide in an attempt to bring in even more data from outside sources (social network mentions are one example). Sensor technology, meanwhile, is now very cheap and will increasingly be deployed to monitor or track, well, pretty much anything.

When Big Data gets big, data centers should get nervous

This all adds up to an increasingly large volume of data that needs to be sorted, stored, and, yes, analyzed. At a high level, there is somewhere close to a zettabyte (that’s 1,000 exabytes, or a billion terabytes) of digital data floating around today. More than 15 petabytes of new data is created daily – data that will in some way be analyzed by someone to figure out whether it presents an opportunity or a threat.
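To put those figures in perspective, here’s a quick back-of-the-envelope sketch in Python. Decimal (SI) units are assumed throughout; only the 15 PB/day rate comes from the article – the rest is plain unit arithmetic:

```python
# Back-of-the-envelope check on the figures above, using decimal (SI) units.
TB = 10**12   # terabyte, in bytes
PB = 10**15   # petabyte
EB = 10**18   # exabyte
ZB = 10**21   # zettabyte

assert ZB // EB == 1_000           # a zettabyte is 1,000 exabytes...
assert ZB // TB == 1_000_000_000   # ...or a billion terabytes

# How long would it take to accumulate another zettabyte at 15 PB/day?
days = ZB / (15 * PB)
print(round(days / 365))  # → 183, i.e. nearly two centuries at that rate
```

In other words, even at 15 petabytes a day it takes a long while to fill another zettabyte – the point being that the existing pile is already enormous, not that the inflow is small.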

So data storage will be a challenge, sure – but storage has never been less expensive, and we can always get more. The challenges get more challenge-y when you realize that this isn’t just archival data that can be stored away and forgotten.

This data, no matter how obscure or routine, could very well be used as grist for the enterprise analytics mill. The data center will have people on the business side of the organization asking, nay demanding, instant access to data that heretofore was filed away and forgotten.

They’ll also ask for, nay demand, systems that can quickly crunch through reams of data and deliver answers to complex questions in real or near-real time.

In my next installment, I will discuss what these analytic workloads look like, how they act, and how to best architect an infrastructure to handle them. ®
