Standards and interoperability: Are you backing the right horse?

Good and bad, costs and risks

IT can sometimes seem like a long, drawn-out process of making things work with each other. Whether it’s getting back-end systems to exchange information, or trying to open a file that has been sent in an unexpected format, most people who work with technology will be familiar with the challenge. But surely standards are supposed to help here, right?

Nobody could doubt their importance: certain facets of computing have settled on particular protocols and formats, for example, without which we would not have the Internet or, by extrapolation, the Web. For every standard that succeeds, however, many fall by the wayside, and there seems to be no link between the amount of effort put into developing a standard and its likelihood of success.

Backing the wrong standard is fraught with hazard for technology vendors and end-users alike. We might as well get the VHS vs Betamax analogy out of the way; in IT we have equivalents in such areas as local area networking (Ethernet vs Token Ring) and messaging systems (SMTP vs X.400). It would be good if we could simply use what has already been standardised, but the trouble with standards is that they never seem to keep up with what either the industry or end-user organisations want to do.

Some standards just have to happen before adoption is possible – 10 Gigabit Ethernet, for example, which brings together storage networking and server networking protocols and has to be agreed across manufacturers to make any sense at all. But consider virtualisation right now: wouldn’t it be preferable to have a single standard for an x86 virtual machine? Perhaps, but we don’t, and no organisation is going to wait around for ISO, ECMA or whichever august body might be nominated to decide which virtual machine format should prevail.

Whether a standard is completely ‘open’ should not always be at the centre of such decision-making. Alongside all the examples of internationally agreed standards, organisations are quite capable of accepting, and content to accept, a proprietary standard (now there’s an apparent oxymoron) if it suits their needs. Java became a de facto standard without ever being ratified by ISO, for example, and the Portable Document Format (PDF) was ubiquitous long before it was. We can argue the rights and wrongs of vendor involvement in standards negotiations until the cows come home, but ultimately, de facto standards are decided by the market.

Against this background, what becomes most important is the desired level of interoperability between the resulting systems an organisation uses. ‘Interoperability’ is a nebulous term so we won’t try to define it here. Rather, let’s consider the differences between good and poor interoperability, in terms of benefits, costs and risks.

Good interoperability translates into flexibility, across both existing and new systems. Requirements change, and IT needs to follow suit – but all too often we find ourselves locked into existing systems, applications or interfaces because they don’t let us do what we want to do. With flexibility comes choice – that is, we have more options as we look to take our IT environments forward. Both can have a financial impact, either by reducing the cost of IT products or by cutting the time spent writing and maintaining interfaces and building work-arounds.

Poor interoperability, by contrast, breeds inefficiency: the cost of keeping things up and running, or of adapting systems to meet new requirements, can quickly escalate if the necessary interfaces or standards are absent. It is also worth pointing out that interoperability can easily be broken by a few bad decisions at the design stage.

For example, I remember working with a company that had problems with a certain open source application server, due to a contractor taking it upon himself to extend the code in order to add a few custom enhancements. You can imagine the wasted effort that went into retrofitting the ‘enhancements’ into subsequent releases, particularly once he no longer worked for the organisation.
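The article doesn’t say what those enhancements actually were, but the general lesson is to hang custom behaviour off a standard extension point rather than patching the server’s own source. As a minimal sketch – assuming, purely for illustration, that the custom behaviour could have been delivered as a servlet filter (the class name and the audit header below are hypothetical) – something along these lines survives a server upgrade intact:

```java
import java.io.IOException;
import javax.servlet.Filter;
import javax.servlet.FilterChain;
import javax.servlet.FilterConfig;
import javax.servlet.ServletException;
import javax.servlet.ServletRequest;
import javax.servlet.ServletResponse;
import javax.servlet.http.HttpServletResponse;

// Hypothetical custom behaviour (adding an audit header), packaged as a
// standard servlet Filter deployed with the application, rather than
// patched into the application server's own source tree.
public class AuditHeaderFilter implements Filter {

    @Override
    public void init(FilterConfig config) throws ServletException {
        // No initialisation needed for this sketch.
    }

    @Override
    public void doFilter(ServletRequest request, ServletResponse response,
                         FilterChain chain) throws IOException, ServletException {
        if (response instanceof HttpServletResponse) {
            // The "enhancement" lives in code the team owns and deploys...
            ((HttpServletResponse) response).setHeader("X-Audit", "enabled");
        }
        // ...while the server itself stays stock, so new releases don't
        // require retrofitting local patches.
        chain.doFilter(request, response);
    }

    @Override
    public void destroy() {
        // Nothing to clean up.
    }
}
```

Whether the hook is a filter, a plugin API or a configuration option, the point is the same: keep local customisations in code you control, written against interfaces the upstream project has promised to maintain.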

Interoperability is desirable but it is not an absolute, just as it is not possible to design a single device that can meet every need. There will always be a place for proprietary solutions, particularly if they work extremely well, and the value derived from them is significantly greater than any ‘interoperable’ equivalent. Equally, commercial vendors answerable to their shareholders (that’s just about all of them) will always be looking after number one – interoperability and standards are a means to a largely financial end.

On this note, it is important for end-user organisations to be discerning, not to mention a little suspicious, when it comes to matters of interoperability. Vendors across the board will claim their own products work ‘better together’ (with the implication that running a multi-vendor, best-of-breed environment will never be as good), but you should keep an eye on both the short- and long-term costs of any lock-in this implies. This is as true of traditional application stacks (“our app works better on our database”) as it is of newer online models, many of which still pay scant regard to data accessibility or portability.

Due diligence is key when buying and deploying IT systems and services, in terms of both what you need a system to do now, and what you might need it to do in the future. A few questions asked early on around interoperability can go a long way; otherwise, by the time you find you have backed the wrong horse, it may be too late to do much about it. ®
