
Is more bandwidth really the answer?

Money can't fix everything

A former university lecturer used to say, "any problem can be beaten to death with pound notes". Many network managers looking at the impact of voice over IP on their networks might be thinking along similar lines, but is more bandwidth the answer to the performance and quality issues they are currently, or soon to be, suffering?

It really depends on what you're trying to measure. We often fall into the trap of making important what we've measured, rather than measuring what's important, and network management tools measure the things that computers, routers, and other network components see - packets, loss, and throughput. That's great for many "bit-shunting" applications, but not so good for those with a critical interactive impact on the end user, especially those based around the human senses of sound and sight.
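Those packet-level metrics are straightforward to compute. As a sketch, here is the interarrival jitter estimator defined in RFC 3550 for RTP streams (the transport most VoIP traffic rides on) - a smoothed running average of transit-time differences. The function name and input format are illustrative, not taken from any particular monitoring tool.

```python
def rtp_jitter(transit_times_ms):
    """Interarrival jitter as estimated in RFC 3550 (RTP, section 6.4.1):
    a running average of the difference in packet transit times,
    smoothed with a gain of 1/16."""
    jitter = 0.0
    prev = None
    for transit in transit_times_ms:
        if prev is not None:
            d = abs(transit - prev)
            jitter += (d - jitter) / 16.0
        prev = transit
    return jitter
```

A perfectly steady stream yields zero jitter; any variation in transit time nudges the estimate up. The point of the article stands, though: this number tells you about the network, not about what the listener hears.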

The quality of voice and video is dependent on many more things than the network connection. In particular, the final arbiter is the quality when the information hits the eye and ear. We know from recent Quocirca research into collaboration that the quality of visual and audio information is a major factor in the effective use of conferencing systems.

There are many areas that impact quality, and the old IT expression GIGO (Garbage In, Garbage Out) applies when choosing microphones and cameras. Likewise, at the receiving end, the quality of screens, speakers, and headsets plays a critical role in making communication clearer, crisper, and easier for the receiver. However, these are fixed issues that depend on balancing spend against fidelity - choose poor quality and you get what you pay for; choose good quality and it will consistently deliver.

But only if the network does its bit. And this is where the problem lies, and it will become increasingly apparent as more and more applications propel their bits through a myriad of networks and connections. Especially those applications that depend on how they look, sound and feel to the user.

The problem is that those responsible for the network are often measuring the wrong things. It seems to be performing fine, within tolerances and meeting SLAs, but the users aren't happy. The network is being measured, but not the applications that run over it.

With some applications, it's relatively easy to measure response times - round trips for key presses, time to update a screen, time to retrieve search results - and understand their impact on users. Indeed, it's also possible to fool user perceptions of speed, for example web browsers rendering text while waiting for slower images to download. Or, as Apple cleverly did with its earlier windowing system, using classic animation techniques like motion lines and blurring to make it appear that windows were popping up faster than they actually were.
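Measuring such response times needn't be elaborate. As a rough sketch - the helper name is invented, and a real monitor would time a complete transaction rather than a bare handshake - you can simply time what the user's client actually waits on:

```python
import socket
import time

def tcp_connect_time(host, port=80, timeout=2.0):
    """Crude user-facing latency probe: time how long a TCP handshake
    to a service takes, in milliseconds. (Hypothetical helper - a real
    tool would measure the full request/response round trip.)"""
    start = time.perf_counter()
    with socket.create_connection((host, port), timeout=timeout):
        pass  # connection established; we only wanted the handshake time
    return (time.perf_counter() - start) * 1000.0
```

Run periodically from where the users sit, a probe like this reports delay as the user experiences it, rather than as the network core sees it.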

The problem with interactive voice and video is that while some gaps and glitches can be masked, they can quickly become irritating and a drain on concentration, individual efficiency and ultimately overall value of the communication. With more voice and video heading down the converged IP pipe, simply throwing more bandwidth at the quality issue, or trying to dodge it by saying, "well the network's working fine", will not be acceptable.

That would be bad enough, but many organisations are looking for digital media investments to deliver better quality than their original analogue systems: higher fidelity phone calls from the Pretty Awesome New Stuff (PANS) than delivered by the Plain Old Telephone Service (POTS), and higher definition video instead of the decades-old analogue TV standards. That's the real progress the digital revolution surely promised?

More and higher fidelity network traffic will need to be measured at the application level. Ask those who use the applications, find a way to understand their expectations, and measure performance and quality in terms they appreciate, rather than in the obscure technicalities of the network. It might not be on the IT network management dashboard today, but the commercial considerations of quality of communication - in a business world dependent on global reach, yet increasingly environmentally penalised for global travel - will soon put it there.
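One established way to report voice quality in user-facing terms is the ITU-T G.107 E-model, which folds network impairments into an R-factor and maps it to a Mean Opinion Score - the 1-to-5 scale that reflects what listeners actually perceive. The sketch below uses a heavily simplified impairment calculation (the delay and loss terms assume a G.711-like codec, and the function names are illustrative):

```python
def r_to_mos(r):
    """Map an E-model R-factor to a Mean Opinion Score (ITU-T G.107)."""
    if r < 0:
        return 1.0
    if r > 100:
        return 4.5
    return 1.0 + 0.035 * r + r * (r - 60) * (100 - r) * 7e-6

def estimate_r(one_way_delay_ms, packet_loss_pct):
    """Simplified E-model: start from the default R of 93.2 and subtract
    delay and loss impairments. (A hypothetical simplification - the full
    G.107 model has many more terms.)"""
    r = 93.2
    # Delay impairment: linear cost, with an extra penalty past ~177 ms,
    # where conversational turn-taking starts to break down
    r -= 0.024 * one_way_delay_ms
    if one_way_delay_ms > 177.3:
        r -= 0.11 * (one_way_delay_ms - 177.3)
    # Packet-loss impairment for a G.711-like codec (Bpl = 4.3, Ie = 0)
    r -= 95.0 * packet_loss_pct / (packet_loss_pct + 4.3)
    return r
```

A clean call (no delay, no loss) scores roughly MOS 4.4 - the practical ceiling for narrowband telephony - while modest loss and delay drag it towards the "annoying" end of the scale. Numbers like these, not raw throughput, belong on the dashboard the article is arguing for.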

Copyright © 2007, Quocirca
