Interoperability: standards or proprietary?
Between the devil and the deep blue sea
‘Choice’ is one of those words that gets bandied around in all kinds of ways, without anyone really thinking about whether it is appropriate. We all want choice, so we are told, whether it’s to do with the schools where we send our kids, or the ever-increasing range of products on supermarket shelves. In IT we talk about ‘best of breed solutions’ as a demonstration of how clever we are at selecting the right tools for the job.
In the real world, however, decision-making is more complicated than idealistic visions of absolute choice suggest. New technologies are chosen through a series of compromises, balancing current requirements against the actual capabilities of the tools.
The old adage is that people only ever use 20 per cent of the capabilities of Microsoft Office – the trouble is, nobody uses the same 20 per cent.
Desktop environments have the potential to be quite simple, particularly given that many business users require access to only a limited set of applications and services. But desktop application environments are a house of cards, and changes in one element can cause unexpected difficulties in others.
It’s easy to berate software vendors for releasing frequent updates, but this links directly to the complex dependencies caused by people expressing freedom of choice in the past.
The tyranny of choice
Given the complex dependencies, there are few options. First, many organisations appear quite happy to forgo choice, defining themselves by the vendors or technologies they depend upon – we hear of ‘Microsoft shops’ or ‘Oracle shops’, for example.
A few years ago I got into trouble at an event, when I declared that no organisation wanted to be a Microsoft shop. In a scene reminiscent of ‘We are all individuals’ in Monty Python’s Life of Brian, a voice from the back declared: “Actually, we do!” The reason, simply given, was that there was more chance of everything working together.
We can see a similar phenomenon in the Apple world today, where many individuals seem quite happy to hand over the keys of freedom to a third party in exchange for peace of mind. As an Apple-phile friend said a few days ago, after struggling to get some graphics card drivers working on his Windows machine: “Macs may crash but at least I don’t get this hassle!”
Like domestic cats, we are all prepared to give up a little freedom for the promise of an easier life – so long as it is on our own terms. Vendors can be tempted to take this too far, as we are currently seeing in the healthy scepticism around proprietary stacks.
In the corporate environment, it is possible to standardise certain elements but it isn’t ever going to be possible to lock everything down. From a user perspective, incompatibilities between applications can result in ‘productivity issues’ – also known as time-wasting, as people try to make things work, or get information from one application to another. If taking the proprietary route isn’t the answer, could we maybe look to industry standards?
It would be lovely to say yes, and indeed, some of the greatest strides made by this industry – the internet, for example – have been on the back of opening up standardised interfaces and protocols. The downside is that such standards often lag the innovation curve, and we have seen many attempts at standardisation fall by the wayside.
It could be argued that the rapid advance of said internet has taken us to unexpected places without appropriate standards keeping up (for example around security), giving us some of the challenges internet users face today. As another example, the X.500 directory standard would be all but defunct had it not been redefined in terms of LDAP, which subsequently became the de facto protocol for directory integration.
Lowering the standard
As we have seen on frequent occasions, neither vendors nor customers are willing to wait for relevant standards to emerge and be ratified, or existing standards to be updated, before selling or adopting new and innovative solutions.
HTML5 is the latest example of a standard-in-progress that has been perceived as ‘good enough’. A couple of years ago we saw the ratification of several standards around office document formats such as ODF and OOXML: these addressed certain challenges around procurement and archiving in the public sector, but they have had little impact elsewhere.
Right now we can see some new choices opening up around cloud-based services, and with them, needs for standards on how services interact and how data can be moved from one place to another.
If an online email service is being used, for example, are emails being backed up – and can such email services integrate with (say) online CRM systems? There is no right or wrong answer, but this illustrates some of the questions that need to be asked.
‘Choice’ is very much a trade-off between the complexities that openness can bring on the one hand, and the restrictiveness of being locked into any one vendor’s offerings on the other.
The worst thing any organisation can do is to fall into the trap of believing there is only one way of doing things, and following such a path without first evaluating the needs of the different groups making up its own user base.
One thing we certainly have no choice about is that we can’t just offload responsibility and assume any solution will simply work, without considering its suitability for, and impact on, our own organisation.