Virtualization payback, now and in the future

Do you really want to join this pool party?

Reader Workshop Most people arguably get the point of virtualisation in terms of server consolidation, and the potential reduction in costs and overheads associated with that.

Even though there are some important practicalities to be considered, as highlighted by readers in the first discussion, the game is reasonably well understood, and many seem to be getting on with it.

But does virtualisation have a purpose beyond server consolidation?

In theory, the answer is yes, not least because the fundamental principle of decoupling hardware from software removes many traditional constraints and therefore has the potential to boost both flexibility and responsiveness.

Now you might say we have that already. After all, if we need to provide some horsepower to run a new application for a workgroup or department, it’s no longer necessarily a requirement to go through all the trouble of specifying, procuring and provisioning new hardware. If we have capacity available on an existing server, then we can create and configure a new virtual machine pretty quickly and away we go.
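
For illustration, here is a minimal sketch of what "create and configure a new virtual machine" can look like when scripted, assuming the libvirt Python bindings on a KVM host and a pre-built disk image; the machine name, sizing and image path below are purely illustrative:

    # Define and boot a guest on spare capacity using libvirt (sketch only).
    import libvirt

    DOMAIN_XML = """
    <domain type='kvm'>
      <name>workgroup-app</name>
      <memory unit='MiB'>2048</memory>
      <vcpu>2</vcpu>
      <os><type arch='x86_64'>hvm</type></os>
      <devices>
        <disk type='file' device='disk'>
          <driver name='qemu' type='qcow2'/>
          <source file='/var/lib/libvirt/images/workgroup-app.qcow2'/>
          <target dev='vda' bus='virtio'/>
        </disk>
        <interface type='network'>
          <source network='default'/>
        </interface>
      </devices>
    </domain>
    """

    conn = libvirt.open('qemu:///system')  # connect to the local hypervisor
    dom = conn.defineXML(DOMAIN_XML)       # register the machine definition
    dom.create()                           # and boot it
    conn.close()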

But it should be possible to take things further than this. If we look at the way most virtualisation technologies are deployed today, the allocation of hardware to software is still relatively static, i.e. a specific machine is typically designated to run a specific workload in a given partition. Furthermore, the creation and configuration of virtual machines and the deployment of virtual images is still a manually intensive process.

Of course, none of this matters if the nature, level and spread of work across your IT systems doesn't change that much on an ongoing basis, but the emergence of more dynamic workloads in recent years means this luxury is becoming increasingly rare.

More organisations now have public-facing Web applications, for example, whether for marketing, sales, support or some other self-service requirement. The load these generate on servers can fluctuate enormously across a given month, week or even day. Meanwhile, quite a few internally facing applications of a more dynamic nature are growing in popularity, from broadly deployed business intelligence and analytics, through various forms of collaboration, to full-blown unified communications.

Then there are the so-called 'situational applications', created on the fly to serve some transient demand, typically by a workgroup or single user, then discarded once their purpose has been served. Such requirements are clearly not new, in that users have been creating 'throwaway' and 'casual' applications with desktop office tools for years, but with the rise of portals, mashups, social media and the like, they increasingly expect such demands to be met online, in a shareable manner.

As a result of such trends, the notion of pooling hardware resources and making horsepower available more flexibly on demand, then reclaiming it when the demand disappears or diminishes, has caught many people's imagination. And when we think of the enabling technology, we are simply talking about taking virtualisation to the next level. In specific terms, it's about being able to spin up or shut down virtual machines and images very quickly, even automatically, as new workloads appear and disappear and the processing load in general fluctuates.
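
As a rough sketch of what that automation could look like at its most basic, here is a toy control loop that watches host CPU load and starts or stops guests drawn from a pool of pre-defined domains; it assumes the libvirt Python bindings plus psutil, and the domain names and thresholds are illustrative rather than a recipe:

    # Toy autoscaler: start or stop pre-defined guests as host load moves.
    import libvirt
    import psutil

    POOL = ['web-worker-1', 'web-worker-2', 'web-worker-3']  # illustrative names
    HIGH, LOW = 80.0, 20.0  # host CPU % thresholds (illustrative)

    conn = libvirt.open('qemu:///system')

    while True:
        load = psutil.cpu_percent(interval=60)  # average load over a minute
        doms = [conn.lookupByName(n) for n in POOL]
        running = [d for d in doms if d.isActive()]
        stopped = [d for d in doms if not d.isActive()]

        if load > HIGH and stopped:
            stopped[0].create()           # spin another guest up
        elif load < LOW and len(running) > 1:
            running[-1].shutdown()        # gracefully reclaim one

A real resource pool would obviously need placement decisions, image management and safeguards around shutdown, but the principle is the same: capacity follows the workload rather than the other way round.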

Some people use the term 'cloud computing' to refer to such pooling and dynamic provisioning, but without getting into the jargon and marketing speak, we’d be interested in how you see your own requirements in this space developing in the future.

Would this natural evolution of today's virtualisation solutions be of benefit? If so, where within your business? And how would such a capability sit alongside the traditional clustering and load-balancing offerings that have come out of the Web performance optimisation arena?

Tell us what you think, and throw in any other thoughts you might have on the future of server virtualisation, in the comment area below. ®
