Standards and interoperability: Are you backing the right horse?

Good and bad, costs and risks

IT can sometimes seem like one long, drawn-out process of making things work with each other. Whether it’s getting back-end systems to exchange information or trying to open a file that has arrived in an unexpected format, most people who work with technology will be familiar with the challenge. But surely standards are supposed to help here, right?

Nobody could doubt their importance: certain facets of computing have settled on particular protocols and formats, for example, without which we would not have the Internet or, by extension, the Web. For every standard that succeeds, however, many fall by the wayside, and there seems to be little connection between the effort put into developing a standard and its likelihood of success.

Backing the wrong standard is fraught with risk for technology vendors and end users alike. We might as well get the VHS vs Betamax analogy out of the way; IT has its own equivalents in areas such as local area networking (Ethernet vs Token Ring) and messaging (SMTP vs X.400). It would be nice simply to use what has already been standardised, but the trouble with standards is that they never seem to keep up with what the industry or end-user organisations actually want to do.

Some standards simply have to exist before adoption can happen – 10 Gigabit Ethernet, for example, which brings storage networking and server networking together and which has to be agreed across manufacturers to make any sense at all. But consider virtualisation right now: wouldn’t it be preferable to have a single standard for an x86 virtual machine? Perhaps, but we don’t, and no organisation is going to wait around for ISO, ECMA or whichever august body is nominated to decide which virtual machine format should win.

What should be at the centre of any such decision is not always whether a standard is completely ‘open’. Alongside the internationally agreed standards, organisations are quite content to accept a proprietary standard (now there’s an apparent oxymoron) if it suits their needs. Java became a de facto standard without ever being ratified by ISO, for example, and the Portable Document Format (PDF) was ubiquitous long before it became an ISO standard. We can argue the rights and wrongs of vendor involvement in standards negotiations until the cows come home, but ultimately, de facto standards are decided by the market.

Against this background, what matters most is the level of interoperability an organisation needs between the systems it ends up using. ‘Interoperability’ is a nebulous term, so we won’t try to define it here. Instead, let’s consider the difference between good and poor interoperability in terms of benefits, costs and risks.

Good interoperability translates into flexibility, for existing and new systems alike. Requirements change and IT needs to follow suit – but all too often we find ourselves locked into existing systems, applications or interfaces because they don’t let us do what we want to do. With flexibility comes choice: more options as we look to take our IT environments forward. Both have a financial impact, whether by reducing the cost of IT products or by cutting the time spent writing and maintaining interfaces and building workarounds.

Poor interoperability, by contrast, breeds inefficiency: the cost of keeping things up and running, or of adapting systems to meet new requirements, can escalate quickly if the necessary interfaces or standards are absent. It is also worth pointing out that interoperability can easily be broken by a few bad decisions at the design stage.

For example, I remember working with a company that ran into trouble with a certain open source application server after a contractor took it upon himself to extend the code to add a few custom enhancements. You can imagine the wasted effort that went into retrofitting those ‘enhancements’ into each subsequent release, particularly once he no longer worked for the organisation.
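The less painful approach, where an application offers one, is to hang custom behaviour off a published extension point and keep it out of the upstream source altogether. What follows is a minimal sketch of that idea in Python; AppServer, register_filter and add_audit_header are hypothetical names standing in for whatever plugin or filter API a real product exposes, not any particular server’s interface.

    # Minimal sketch: keep custom behaviour in a separate module that plugs
    # into the application's published extension point, instead of editing
    # its source. All names here are illustrative, not a real product's API.

    from dataclasses import dataclass, field
    from typing import Callable, List


    @dataclass
    class Request:
        path: str
        headers: dict = field(default_factory=dict)


    class AppServer:
        """Stands in for the application server shipped by the project."""

        def __init__(self) -> None:
            self._filters: List[Callable[[Request], Request]] = []

        def register_filter(self, fn: Callable[[Request], Request]) -> None:
            # The published extension point: upgrades leave this alone.
            self._filters.append(fn)

        def handle(self, request: Request) -> str:
            for fn in self._filters:
                request = fn(request)
            return f"served {request.path}"


    # The 'custom enhancement' lives entirely in our own code.
    def add_audit_header(request: Request) -> Request:
        request.headers["X-Audit"] = "custom-enhancement"
        return request


    server = AppServer()
    server.register_filter(add_audit_header)
    print(server.handle(Request(path="/orders")))  # -> served /orders

When the next release of the server arrives, the custom filter either still plugs into the same interface or fails loudly at that boundary – nothing has to be retrofitted into somebody else’s code.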

Interoperability is desirable but it is not an absolute, just as it is not possible to design a single device that meets every need. There will always be a place for proprietary solutions, particularly when they work extremely well and the value derived from them is significantly greater than that of any ‘interoperable’ equivalent. Equally, commercial vendors answerable to their shareholders (that’s just about all of them) will always look after number one – for them, interoperability and standards are a means to a largely financial end.

On this note, end-user organisations should be discerning, not to mention a little suspicious, when it comes to matters of interoperability. Vendors across the board will claim their own products work ‘better together’ (with the implication that a multi-vendor, best-of-breed environment will never be as good), but keep an eye on both the short- and long-term costs of any lock-in this implies. This is as true of traditional application stacks (“our app works better on our database”) as of newer online models, many of which still pay scant regard to data accessibility or portability.
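One practical test of such claims is whether you can get your data back out in a neutral, documented format before you commit. The sketch below, again in Python and purely illustrative, assumes a hypothetical ServiceClient whose fetch_records method stands in for whatever export API a real provider offers; the only point being made is that the records end up in JSON and CSV files you control.

    # Minimal sketch: pull data out of a hosted service into neutral formats
    # (JSON and CSV) so it can be re-imported elsewhere. ServiceClient and
    # fetch_records are hypothetical stand-ins, not a real provider's API.

    import csv
    import json
    from typing import Dict, List


    class ServiceClient:
        """Placeholder for a vendor SDK; returns plain dicts for portability."""

        def fetch_records(self) -> List[Dict[str, str]]:
            return [
                {"id": "1", "name": "Widget", "status": "active"},
                {"id": "2", "name": "Gadget", "status": "retired"},
            ]


    def export_portable(records: List[Dict[str, str]]) -> None:
        # JSON copy: structured and self-describing, easy to re-import.
        with open("records.json", "w", encoding="utf-8") as fh:
            json.dump(records, fh, indent=2)

        # CSV copy: lowest common denominator for spreadsheets and loaders.
        with open("records.csv", "w", newline="", encoding="utf-8") as fh:
            writer = csv.DictWriter(fh, fieldnames=sorted(records[0].keys()))
            writer.writeheader()
            writer.writerows(records)


    export_portable(ServiceClient().fetch_records())

If a provider offers no sensible way to do something like this, that tells you a lot about the long-term cost of the lock-in it implies.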

Due diligence is key when buying and deploying IT systems and services, in terms of both what you need a system to do now and what you might need it to do in the future. A few questions asked early on about interoperability can go a long way; otherwise, by the time you find you have backed the wrong horse, it may be too late to do much about it. ®
