
Behind the curve: How not to be a technology laggard

Be fast but not first, says Dave Cartwright

I came across a wonderful term the other day: the “technical laggard”. I hadn't actually realised that this was an accepted term that's in (reasonably) wide usage – turns out it's the name given to people at the opposite end of the technology take-up spectrum from the early adopters.

Although I've worked for, and with, largely innovative companies in recent years, I started my career in a traditional engineering company. Hence I've come across my share of people whose favourite phrases are “Oh, that's not how we do it” and “It's too risky to adopt bleeding-edge technology”.

But is being a laggard a bad thing? Surely there's some merit in being cautious and not running headlong into a hideously dysfunctional beta-test train wreck? Well, yes, there is. But as with all things it's relative.

I once worked for an organisation where I was in the fast-moving technology department and we were able to push the boundaries. So while the central service division persevered with its RS-232 terminal connections into the minicomputers we used, my lot were starting to use Telnet on an Ethernet network.

We were using MacTCP (remember that?) and the Clarkson Packet Driver, and it worked great. We had a big, flat, organisation-wide IP network that was the broadcast domain from Hell, but one of my colleagues found an Open Source bridge package that we could run on an old PC with (if memory serves) a pair of 3C509 LAN cards.

So the rest of the organisation's IP kit wasn't able to bombard our world with its broadcast junk. Yes, the central service had to provide a stable service to the whole organisation … but actually the “newfangled” IP setup in our bit of the site was pretty solid and a shedload faster.

Everywhere in life you come across people you could call laggards. Let's face it, no matter what new concept comes around there are – by nature – those who take it up first, those who take it up last, and those who are somewhere in the middle.

But in technology, adopting something first often means you get the benefit of a funky new technology early on, and hence you get more value from it than the people who eventually succumb and reluctantly take it up.

Decisions, decisions

Here's a thought, though: sometimes you can look like a laggard when actually you're not. I used to be the Mac support guy at a university in the 1990s, and the biggest dilemma among the researchers and faculty was whether to buy a new Mac or wait.

They'd be sitting there with their PowerBook Duo 210s, and would have let the launch of the Duo 230 amble by unnoticed … but then the Duo 250 was released. “Ooooh, do I go for the 250 or do I wait for the new one they're promising that's meant to have a colour screen?” And then the Duo 270c came out, but then rumour control promised the new 280c and the mythical PowerPC-based 2300c.

The stress was palpable – they couldn't afford to keep buying new Mac after new Mac, so there was constant mental torture over whether to go with the new release or wait for the next one, which promised to be significantly better.

The point is, of course, that they weren't actually being laggards in the true sense – they were in fact determined to be early adopters but ended up using prehistoric technology simply because they couldn't quite decide which generation of equipment to be an early adopter of.

But what if it is an also-ran?

There's another defence that laggards can use: will a new technology really take off, or will I end up with a Betamax video, an Apple Newton and a Sun JavaStation – all of which I can read about on my Google Glass?

History is littered with stuff that never quite made it, and there's no doubt that people lost their jobs for adopting technology that sounded like a good idea at the time.

For instance, back in the mid-90s I was at a trade show in Washington DC where IBM launched its new 25Mbit/s ATM switch ($353 per port, if I remember correctly), which was going to change the world by providing bandwidth-guaranteed multimedia to the desktop. Hmmm, I think not.

Oh, and there's the other obvious defence about not going for something when it first appears: Microsoft Anything Service Pack 0. Service packs exist for a reason – namely that regardless of the extent of the alpha and beta programmes, the first release of a big software package (and not just a Microsoft one) is generally not a great thing to be adopting.

Start getting the benefits as soon as release 2

But let's go to the other extreme: those people who are sticking with Windows Vista despite Windows 7 having been out for over five years. The ones who daren't upgrade from SQL Server 2005 because they're terrified their home-grown stored procedures won't map to SQL Server 2014 very well.

The ones who won't go to flash storage because of concerns about wear and uncertainty over which of the seemingly endless list of SSD variations will have the most longevity. These are the people whose IT infrastructures are gradually rotting, and will continue to do so until they decay beyond repair and start to fall apart.

But the point is this: being a laggard is generally a bad thing, but so is being the first to adopt something. Where you actually want to be is somewhere between the two. I would advocate being just after the early adopters: let some other idiot find the bugs in release 1, and start getting the benefits as soon as release 2 (or Service Pack 1 for release 1) comes out.

After all, as the saying goes: the early bird gets the worm, but the second mouse gets the cheese. ®
