Original URL: http://www.theregister.co.uk/2006/07/20/hp_labs_developer/

Virtual – the new reality for developers

New assumptions needed by code cutters

By Martin Banks

Posted in Developer, 20th July 2006 10:04 GMT

“The life of the developer has just become a lot harder,” said Sharad Singhal, Distinguished Technologist at HP Labs, Palo Alto, “and the reason is that the assumptions they make about their environment are not necessarily true any more.”

He was talking about the way that the rush towards virtualised systems infrastructures is changing the ground rules to which developers have historically worked. They assume, for example, that the operating system they are writing to is stable, that the amount of memory they have available is static, and that the CPU capacity available to them is static. “They are hard-coded into the machine, or at least developers assume that a configuration file will tell them at the start of a job.”
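To make the contrast concrete, here is a minimal Java sketch (Java being the J2EE world Singhal mentions later) of the two habits: sampling capacity once at startup and treating it as constant, versus re-sampling it because a hypervisor may resize the guest while the program runs. The one-minute polling interval is an illustrative assumption.

    import java.util.concurrent.TimeUnit;

    public class CapacityProbe {
        public static void main(String[] args) throws InterruptedException {
            Runtime rt = Runtime.getRuntime();

            // Legacy habit: read the configuration once and assume it holds.
            int cpusAtStart = rt.availableProcessors();
            long heapAtStart = rt.maxMemory() / (1024 * 1024);
            System.out.printf("Startup view: %d CPUs, %d MB heap%n", cpusAtStart, heapAtStart);

            // Virtualisation-aware habit: re-sample, because the hypervisor
            // may have grown or shrunk the guest since startup.
            while (true) {
                TimeUnit.MINUTES.sleep(1);
                int cpusNow = rt.availableProcessors();
                if (cpusNow != cpusAtStart) {
                    System.out.printf("CPU count changed: %d -> %d%n", cpusAtStart, cpusNow);
                    cpusAtStart = cpusNow;
                }
            }
        }
    }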

But in a world where virtualisation is the norm all of this is changing, and developers will have to learn how to optimise their applications for a constantly shifting environment. Code and performance optimisation, for example, become far harder when the underlying resources are intangible and in flux. At the same time, an application that runs out of capacity will no longer be limited to simply shedding workload: it will be able to reach out to the environment, request more capacity, and get it on demand.
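No standard API for this existed at the time, so the following Java sketch is entirely hypothetical: CapacityManager, requestCpus() and shedLoad() are invented names, used only to illustrate the choice between asking the environment for capacity and falling back to shedding work.

    public class OverloadHandler {

        // Hypothetical contract to the virtualised environment: ask for
        // extra CPUs and learn how many were actually granted.
        interface CapacityManager {
            int requestCpus(int wanted);
        }

        private final CapacityManager capacity;

        OverloadHandler(CapacityManager capacity) {
            this.capacity = capacity;
        }

        // Called when the request queue backs up beyond what we can serve.
        void onBacklog(int queuedRequests) {
            int granted = capacity.requestCpus(2);
            if (granted > 0) {
                System.out.println("Granted " + granted + " extra CPUs; keep serving");
            } else {
                shedLoad(queuedRequests);  // the old, last-resort answer
            }
        }

        void shedLoad(int queuedRequests) {
            System.out.println("No capacity available; rejecting " + queuedRequests + " requests");
        }
    }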

According to Singhal, developers face a transition into an environment built on a new set of capabilities, which he likened to working with a new operating system. In that context, virtualisation is easy to dismiss as just another technology hot-topic that will fade away in time, leaving little trace. But a growing number of enterprises are seriously heading in that direction, and many developers are still writing code that is neither efficient nor matched to the environments it will have to run in. This poses problems for enterprises that are already moving to virtualised environments.

Being able to exploit the flexibility of virtualisation in terms of workload and capacity management is an obvious case in point. “For example,” Singhal said, “such an environment can detect that an application requires more capacity, but the application itself has not been written in a way that can make use of it. On the other hand, capacity may be taken temporarily from an application because a higher priority task requires it, but then that deprived application promptly crashes rather than being able to continue functioning in a degraded manner. What this means is that the development tools we give them are going to have to change over time.”
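The degraded-mode behaviour Singhal asks for can be approximated with standard Java alone: watch availableProcessors(), which can change under virtualisation, and resize the worker pool instead of failing. This is a minimal sketch; the 30-second polling interval and the sizing policy are assumptions, not anything HP prescribes.

    import java.util.concurrent.*;

    public class AdaptiveWorkerPool {
        public static void main(String[] args) {
            int cpus = Runtime.getRuntime().availableProcessors();
            ThreadPoolExecutor pool = (ThreadPoolExecutor) Executors.newFixedThreadPool(cpus);

            // Periodically compare the pool size with what the (possibly
            // resized) virtual machine now offers, and adapt rather than crash.
            ScheduledExecutorService monitor = Executors.newSingleThreadScheduledExecutor();
            monitor.scheduleAtFixedRate(() -> {
                int now = Runtime.getRuntime().availableProcessors();
                if (now < pool.getCorePoolSize()) {
                    pool.setCorePoolSize(now);      // shrink: core first,
                    pool.setMaximumPoolSize(now);   // then the ceiling
                    System.out.println("Capacity withdrawn; degraded to " + now + " threads");
                } else if (now > pool.getMaximumPoolSize()) {
                    pool.setMaximumPoolSize(now);   // grow: ceiling first,
                    pool.setCorePoolSize(now);      // then the core
                    System.out.println("Capacity granted; grew to " + now + " threads");
                }
            }, 30, 30, TimeUnit.SECONDS);
        }
    }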

HP Labs are paying considerable attention to developing the technologies and processes that will be required in the infrastructure management area, but on the applications side it is the company’s partners that are – or should be – working on it. “I am assuming, for example, that when Microsoft starts offering virtual machines, a lot of the Visual Studio type environments will start recognising that virtual machines exist,” he said. “Developers working in J2EE environments are starting to recognise that virtual environments exist. So these types of capabilities will become available to developers and they will be available inside C libraries and J2EE libraries.”

There is something of a paradox here for developers, in that they will need to get used to the idea of optimising their code for what is, in practical terms, an intangible environment – in effect, thin air. Singhal agreed, and acknowledged that it is an area where service providers like HP need to be more intelligent. But virtualisation also offers developers tremendous opportunities and flexibility once they get used to exploiting it.

“One of the benefits of virtualisation is that developers can carve out for an application the environment it thinks it needs, based on the assumptions the application will make about the environment it is to operate in,” Singhal said. “But that requires developers to understand what the application needs. Until developers catch up and recognise that they can take advantage of virtualisation capabilities, the onus is on those doing the virtualisation to present to the applications things that look like legacy environments.”

He suggested that this is where treating the datacentre as a system starts to help, because it contains different applications with different levels of requirement. That brings the opportunity to look at the total resource pool and workload and, with the right mixing and matching, leave the fewest gaps in the environment. “And the bigger the datacentre, and the bigger the number of applications, the greater the opportunity to tile them in that way,” he said. “It is a bit like working as a tailor’s cutter.”
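The “tailor’s cutter” idea is essentially bin packing. As a toy illustration (not HP’s placement algorithm), the following sketch uses first-fit decreasing to tile per-application CPU demands onto hosts so as to leave the fewest gaps; the capacities and demands are made-up numbers.

    import java.util.ArrayList;
    import java.util.Arrays;
    import java.util.Collections;
    import java.util.List;

    public class WorkloadTiler {
        public static void main(String[] args) {
            int hostCapacity = 16;                               // CPUs per host
            Integer[] demands = {9, 7, 6, 5, 4, 3, 2, 2};        // per-application CPUs
            Arrays.sort(demands, Collections.reverseOrder());    // largest first

            List<Integer> freePerHost = new ArrayList<>();
            for (int demand : demands) {
                boolean placed = false;
                // First fit: put the demand on the first host with room.
                for (int h = 0; h < freePerHost.size(); h++) {
                    if (freePerHost.get(h) >= demand) {
                        freePerHost.set(h, freePerHost.get(h) - demand);
                        placed = true;
                        break;
                    }
                }
                if (!placed) {
                    freePerHost.add(hostCapacity - demand);      // open a new host
                }
            }
            System.out.println("Hosts used: " + freePerHost.size()
                    + ", leftover gaps: " + freePerHost);
        }
    }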

He believes that there will eventually be more regularised frameworks in which applications can be built to fit more easily into a virtualised resource pool, though they will take some time to appear. But some service-oriented architectures (SOAs) are trying to get to that environment now, which would allow modularity to be maintained and increase the level of loose coupling between applications.

“But usually when I see that type of environment it operates at the expense of efficiency,” he said. “Tightly-coupled systems are more efficient, but also less flexible, so there is a balancing act to perform. But given the increase in performance of computers these days, the flexibility possible with loosely-coupled applications is a good trade against any degradation in performance. And we are starting to pay attention to other metrics besides performance.”®
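As a toy Java illustration of that trade-off (nothing HP-specific), the tightly-coupled path below binds the caller to one concrete class, while the loosely-coupled path goes through an interface that could just as well be backed by a remote service, paying a little indirection per call in exchange for the freedom to relocate the implementation.

    public class CouplingDemo {

        // Tightly coupled: the caller names this exact class; fast, but the
        // implementation cannot be relocated without changing the caller.
        static class LocalPricer {
            double price(String sku) { return 9.99; }
        }

        // Loosely coupled: the caller knows only the contract; the object
        // behind it could live in another process or virtual machine.
        interface Pricer {
            double price(String sku);
        }

        public static void main(String[] args) {
            double direct = new LocalPricer().price("sku-1");

            Pricer service = sku -> 9.99;   // stand-in for a service-backed proxy
            double indirect = service.price("sku-1");

            System.out.println(direct + " / " + indirect);
        }
    }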