Hyperscaling gives you power when you need it

How to grow and shrink like Alice

Distance no object

Glazemakers' reference to SDN is crucial, because it marks the main difference between grid computing and hyperscaling.

Even when grid was in its infancy, vendors did cunning things such as taking dissimilar hardware and making it look similar by whacking a Java Runtime Environment (JRE) on top of it. The software doing the work neither knew nor cared about the hardware because it just ran on the JRE.

Admittedly, you had to do a lot of tricky manual work to make the devices communicate with each other, particularly when they were on different sites (for example, at a bunch of collaborating universities sharing compute power).

But we have now reached the point where SDN and its peers are allowing us to bridge that gap.

Two organisations at opposite ends of the internet can implement virtual machines on their sites which think they are on the same subnet despite being multiple Layer 3 hops apart. The SDN layer is doing some funky work to make the network behave like a virtual Layer 2 switch instead of a routed Layer 3 WAN.
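The mechanics behind that illusion are encapsulation: the overlay wraps each Layer 2 frame in an ordinary UDP datagram that the routed WAN carries like any other traffic. As a sketch of the idea, here is the 8-byte VXLAN header (one common overlay encapsulation, per RFC 7348) built by hand; the inner frame below is a placeholder, not real traffic:

```python
import struct

def vxlan_header(vni: int) -> bytes:
    """Build the 8-byte VXLAN header (RFC 7348): the 'valid VNI'
    flag set in the first word, the 24-bit VNI in the second."""
    flags = 0x08000000            # bit 27: I flag, "VNI is valid"
    return struct.pack("!II", flags, vni << 8)

# An SDN overlay prepends this header to each Layer 2 frame, then ships
# the result across the routed WAN inside a plain UDP datagram:
inner_ethernet_frame = b"\x00" * 60        # placeholder L2 frame
encapsulated = vxlan_header(42) + inner_ethernet_frame
```

The endpoints unwrap the header on arrival, so the virtual machines on either side only ever see the inner Ethernet frame and behave as if a local switch connected them.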

So where each grid processing task in the old days tended to be specific to one project or chunk of work, the platform can now be more multi-purpose.

“With the software and virtualisation approach it becomes easier and quicker to scale up or down. Grid computing from the old days was typically designed to scale up or down for specific tasks,” says Glazemakers.

“With the current technology, there are hardly any limits on the potential use cases. It is the biggest difference.”

It's just an illusion

David Noguer Bau, head of service provider marketing EMEA at Juniper, seems to think along the same lines.

“Cloud and grid computing developed models to scale largely a number of processes (cloud) or split a process in parallel computing model (grid). But both lacked interaction with the network,” he says.

“SDN provides to cloud (via integration with the orchestrator) a way to evolve the network configuration as fast as virtual machines do.”

When I quoted Webopedia's definition of hyperscaling, there were a few words missing: an almost throwaway clause in the last sentence says that hyperscale "is commonly associated with platforms like Apache Hadoop".

Now Hadoop, a software framework for building distributed applications, has been around since 2005. Distributed computing is, after all, a long-established concept. Is hyperscaling anything new, then? In a word, no.

The point is, though, that SDN provides us with new and far simpler ways of achieving it. To implement hyperscale using Hadoop you need to architect your software using its framework. With SDN you may not even need to do anything special at all with your software.
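To see what "architect your software using its framework" means, here is the shape a Hadoop-style job forces on you: the work must be expressed as a map step and a reduce step, with the framework handling distribution. This sketch runs both steps locally purely to show the structure; the sample lines are illustrative:

```python
from collections import Counter
from itertools import chain

def map_words(line: str):
    # Map step: emit a (key, count) pair for every word in the line.
    for word in line.split():
        yield (word.lower(), 1)

def reduce_counts(pairs):
    # Reduce step: sum the counts emitted for each key.
    counts = Counter()
    for word, n in pairs:
        counts[word] += n
    return dict(counts)

lines = ["SDN makes networks programmable", "grid computing predates SDN"]
result = reduce_counts(chain.from_iterable(map_words(l) for l in lines))
```

The contrast with the SDN approach is that here the decomposition lives in your code; with an overlay network, an application written for a single local cluster can be spread out with no such restructuring.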

It gives us a world where we can take an application that is designed to run on multiple servers near each other and run it on a set of machines in different locations because SDN makes it think that it is on a local network.

So long as you are not constrained by the laws of physics (even the smartest SDN implementation can't give you a sub-millisecond round-trip time between London and Glasgow, though with protocol spoofing it can have a go some of the time), SDN will make distributed computing easier and easier.
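A back-of-envelope calculation shows why that London-Glasgow figure is a hard floor. Light in optical fibre travels at roughly two-thirds of c, so even a perfectly straight run bounds the round-trip time; the distance and speed below are approximate, illustrative figures:

```python
# Physics floor on round-trip time over fibre.
fibre_speed_km_s = 200_000     # ~0.67 c in glass, approximate
london_glasgow_km = 556        # rough great-circle distance

rtt_ms = 2 * london_glasgow_km / fibre_speed_km_s * 1000
# Comes out around 5.6 ms, well above a millisecond -- and real fibre
# routes are longer and add switching delay, so sub-millisecond RTT
# between the two cities is physically impossible.
```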

Anyone can do it

As distributed computing gets easier, so it becomes possible to scale your compute resources on demand outside your infrastructure and in someone else's – your cloud provider or a higher-tier service provider whose kit you dip into when you run out of power in your network.

Although hyperscaling concepts have been around for a while, SDN is taking us a big step forward in being able to do hyperscaling more flexibly, faster and with considerably less expertise.

With many network hardware vendors supporting SDN concepts in their routers and switches, and the virtualisation vendors supporting the same standards in the layers of the enterprise stack they provide, hyperscaling is becoming open to all of us.

Whether it will be widely adopted or remain a niche concept remains to be seen, of course.

Williams sums it up. “I think SDN use will grow significantly within the data centre and service provider networks over the next one to three years, particularly in the area of orchestration through frameworks such as OpenStack with open APIs, and programmatic control through OpenFlow,” he says.

“Hyperscaling will increase – but to what level is hard to define. What is clear is that SDN will be a key enabler in managing large-scale combinations of compute resources.” ®
