Standards and interoperability: Are you backing the right horse?

Good and bad, costs and risks

Lab IT can sometimes seem like a long, drawn-out process of making things work with each other. Whether it’s getting back-end systems to exchange information, or trying to open a file that has been sent in an unexpected format, most who work with technology will be familiar with the challenge. But surely standards are supposed to help here, right?

Nobody could doubt their importance: certain facets of computing have settled on particular protocols and formats, for example, without which we would not have the Internet or, by extrapolation, the Web. For every standard that succeeds, however, many fall by the wayside, and there doesn’t seem to be any link between the amount of effort put into standards development and the likelihood of success.

Backing the wrong standard is fraught with hazard for both technology vendors and end-users. We might as well get the VHS vs Betamax analogy out of the way; in IT we have equivalents in such areas as local area networking (Ethernet vs token ring) and messaging systems (SMTP vs X.400). It would be good if we could simply use what was already standardised, but the trouble with standards is that they never seem to keep up with what either the industry or end-user organisations want to do.

Some standards simply have to exist before adoption can happen – 10 Gigabit Ethernet, for example, which brings together storage networking and server networking protocols, and which has to be agreed across manufacturers to make any sense at all. But think of virtualisation right now: wouldn’t it be preferable to have a single standard for an x86 virtual machine? Perhaps, but we don’t, and no organisation is going to wait around for ISO, ECMA or whichever august body might be nominated to decide which virtual machine format should win.

Whether a standard is completely ‘open’ should not always be at the centre of any such decision making. As well as all the examples of internationally agreed standards, organisations are quite capable of accepting a proprietary standard (now there’s an apparent oxymoron), and content to do so, if it suits their needs. Java became a de facto standard without ever being ratified by ISO, for example, while the Portable Document Format (PDF) was in widespread use long before it gained ISO endorsement. We can argue the rights and wrongs of whether vendors should be involved in standards negotiations until the cows come home, but ultimately, de facto standards are decided by the market.

Against this background, what becomes most important is the desired level of interoperability between the systems an organisation ends up using. ‘Interoperability’ is a nebulous term, so we won’t try to define it here. Rather, let’s consider the differences between good and poor interoperability, in terms of benefits, costs and risks.

Good interoperability translates into flexibility, in terms of existing and new systems. Requirements change, and IT needs to follow suit – but all too often we find ourselves locked into existing systems, applications or interfaces because they don’t let us do what we want to do. With flexibility comes choice – that is, we have more options as we look to take our IT environments forward. Both can have a financial impact, either by reducing the cost of IT products, or by cutting the time we spend writing and maintaining interfaces and building work-arounds.
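To make that concrete, consider what coding to a widely adopted interface looks like in practice. The sketch below is illustrative only: it assumes a Java application talking to a relational database through the standard JDBC API, with a hypothetical connection URL, credentials and table, so that swapping the underlying database is largely a matter of changing the driver and URL rather than rewriting application code.

import java.sql.Connection;
import java.sql.DriverManager;
import java.sql.PreparedStatement;
import java.sql.ResultSet;
import java.sql.SQLException;

public class CustomerLookup {

    // The JDBC URL is the only vendor-specific piece; it could just as easily
    // point at MySQL, SQL Server or another database with a compliant driver.
    private static final String JDBC_URL =
            "jdbc:postgresql://localhost:5432/crm";   // hypothetical

    public static String findCustomerName(int customerId) throws SQLException {
        // Standard JDBC calls only: no vendor-specific client library in sight.
        try (Connection conn = DriverManager.getConnection(JDBC_URL, "app", "secret");
             PreparedStatement stmt = conn.prepareStatement(
                     "SELECT name FROM customers WHERE id = ?")) {
            stmt.setInt(1, customerId);
            try (ResultSet rs = stmt.executeQuery()) {
                return rs.next() ? rs.getString("name") : null;
            }
        }
    }
}

Compare that with building directly against a vendor-specific client library, where every call site becomes something to rework if the product ever changes.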

Poor interoperability, by contrast, breeds inefficiency: the costs of keeping things up and running, or indeed of adapting systems to meet new requirements, can quickly escalate should the necessary interfaces or standards be absent. It’s worth pointing out that interoperability can easily be broken with a few bad decisions at the design stage.

For example, I remember working with a company that had problems with a certain open source application server, due to a contractor taking it upon himself to extend the code in order to add a few custom enhancements. You can imagine the wasted effort that went into retrofitting the ‘enhancements’ into subsequent releases, particularly once he no longer worked for the organisation.
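For the record, the sketch below is not what that contractor wrote. It simply illustrates, assuming a standard Java servlet container and a made-up enhancement (adding a response header), how the same sort of change can be delivered through a documented extension point such as javax.servlet.Filter, leaving the server’s own code untouched.

import java.io.IOException;
import javax.servlet.Filter;
import javax.servlet.FilterChain;
import javax.servlet.FilterConfig;
import javax.servlet.ServletException;
import javax.servlet.ServletRequest;
import javax.servlet.ServletResponse;
import javax.servlet.http.HttpServletResponse;

// A custom 'enhancement' packaged as a standard servlet filter rather than a
// patch to the application server's source.
public class CustomHeaderFilter implements Filter {

    @Override
    public void init(FilterConfig filterConfig) throws ServletException {
        // No configuration needed for this illustration.
    }

    @Override
    public void doFilter(ServletRequest request, ServletResponse response,
                         FilterChain chain) throws IOException, ServletException {
        // The header name and value are hypothetical stand-ins for whatever
        // the enhancement was meant to do.
        if (response instanceof HttpServletResponse) {
            ((HttpServletResponse) response).setHeader("X-Custom-Enhancement", "enabled");
        }
        chain.doFilter(request, response);
    }

    @Override
    public void destroy() {
        // Nothing to clean up.
    }
}

Because the filter relies only on the servlet specification, it survives server upgrades – and the departure of whoever wrote it.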

Interoperability is desirable but it is not an absolute, just as it is not possible to design a single device that can meet every need. There will always be a place for proprietary solutions, particularly if they work extremely well, and the value derived from them is significantly greater than any ‘interoperable’ equivalent. Equally, commercial vendors answerable to their shareholders (that’s just about all of them) will always be looking after number one – interoperability and standards are a means to a largely financial end.

On this note, it is important for end-user organisations to be discerning, not to mention a little suspicious, when it comes to matters of interoperability. Vendors across the board will claim their own products work ‘better together’ (with the implication that running a multi-vendor, best-of-breed environment will never be as good), but you should keep an eye on both the short- and long-term costs of any lock-in this implies. This is as true of traditional application stacks (“our app works better on our database”) as of newer online models, many of which still pay scant regard to matters of data accessibility or portability.
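What does ‘portability’ mean in practice? At minimum, an exit route: the ability to get your data out in a vendor-neutral format. The sketch below is a deliberately simple, hypothetical example – a plain CSV export written in Java – rather than anything a particular product or online service actually offers.

import java.io.IOException;
import java.nio.file.Files;
import java.nio.file.Path;
import java.util.List;

public class PortableExport {

    // A hypothetical record type standing in for whatever the system holds.
    public record Customer(int id, String name, String email) { }

    // Writes the data out as plain CSV so it can be loaded elsewhere.
    // Simplified for illustration: no quoting or escaping of embedded commas.
    public static void exportToCsv(List<Customer> customers, Path target) throws IOException {
        StringBuilder csv = new StringBuilder("id,name,email\n");
        for (Customer c : customers) {
            csv.append(c.id()).append(',')
               .append(c.name()).append(',')
               .append(c.email()).append('\n');
        }
        Files.writeString(target, csv.toString());
    }
}

If a supplier cannot show you something equivalent for your own data, treat that as part of the cost of the lock-in.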

Due diligence is key when buying and deploying IT systems and services, in terms of both what you need a system to do now, and what you might need it to do in the future. A few questions asked early on around interoperability can go a long way; otherwise, by the time you find you have backed the wrong horse, it may be too late to do much about it. ®
