
HPC 2.0: The Monster Mash-up

Big Data. Oh yes


Part 1: IBM recently invited a handful of really smart HPC-centric industry analysts (and me too, for no apparent reason) to spend the day talking about where the market is going and how IBM intends to address it.

It was truly a conversation, rather than the typical vendor PowerPoint-palooza where they simply run through every product slide deck they can get their hands on.

One of the major threads running through the various presentations and conversations is the convergence of "traditional" HPC/supercomputing, analytics, and mainstream computing. If you're reading the industry press, you'll see this trend referred to as "Big Data", "Business Intelligence", or "Predictive Analytics".

These terms are bandied about as if they were interchangeable: they aren't, and they don't all mean the same thing. I'm not innocent of sowing name confusion myself; I've been using the term 'HPC 2.0' to describe the increasing use of HPC-like methods and infrastructure in non-HPC organizations.

Whatever you call it, it’s happening, and it will impose ever increasing demands on the traditional business data center. The amount of data that organizations will gather and try to analyze is mind-boggling.

In addition to a greater flow of data generated organically by the organization, the best companies are casting their nets wide in an attempt to bring in even more data from outside sources (social network mentions are one example). Sensor technology, too, is now very cheap and will be increasingly deployed to monitor or track, well, pretty much anything.

When Big Data gets big, data centers should get nervous

This all adds up to an increasingly large volume of data that needs to be sorted, stored, and, yes, analyzed. At a high level, there is somewhere close to a zettabyte (which is 1,000 exabytes, or roughly 1.07 billion terabytes) of digital data floating around today. More than 15 petabytes of new data are created daily, data that will in some way be analyzed by someone to figure out whether it presents an opportunity or a threat.
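If you want to sanity-check those unit conversions yourself, here is a quick back-of-the-envelope sketch. It assumes binary (power-of-two) units, which is where the "1.07 billion terabytes" figure comes from; the specific volumes are just the ones quoted above, not measurements of mine:

```python
# Back-of-the-envelope check of the data volumes quoted above,
# using binary units: 1 TiB = 2**40 bytes, 1 PiB = 2**50,
# 1 EiB = 2**60, 1 ZiB = 2**70.

TIB = 2**40
PIB = 2**50
EIB = 2**60
ZIB = 2**70

# A zettabyte is about a thousand exabytes...
print(ZIB // EIB)                 # 1024 exbibytes per zebibyte

# ...or roughly 1.07 billion terabytes.
print(ZIB / TIB / 1e9)            # ~1.073741824 billion tebibytes

# 15 PB of new data per day, expressed as terabytes per year:
daily_new_bytes = 15 * PIB
print(daily_new_bytes * 365 // TIB)  # ~5.6 million TiB created annually
```

In decimal (power-of-ten) units the ratios come out as an even 1,000 per step, which is why the article can round the same quantity both ways.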

So data storage will be a challenge, sure, but storage has never been less expensive, and we can always get more. But the challenges get more challenge-y when you realize that this isn’t just archival data that can be stored away and forgotten.

This data, no matter how obscure or routine, could very well be used as grist for the enterprise analytics mill. The data center will have people on the business side of the organization asking, nay demanding, instant access to data that heretofore was filed away and forgotten.

They’ll also ask for, nay demand, systems that can quickly crunch through reams of data and deliver answers to complex questions in real or near-real time.

In my next installment, I will discuss what these analytic workloads look like, how they act, and how to best architect an infrastructure to handle them. ®

