IT salaries: Why you are a clapped-out Ferrari

Skills shortages? The thing in short supply is cash

As a tech careers writer I regularly receive noise about the UK IT “skills shortage”, which makes as much sense as saying there’s a shortage of Ferraris.

I know this because, according to Blighty's Office for National Statistics, the average weekly pre-tax pay in “computer programming, consultancy and related activities” in 2012 was £718.70, which is about the same as a decent Ferrari's cost of ownership.

Ferrari, like any IT professional, will charge what it can get. This will be pushed up by demand and pushed down by competition until a rough balance is reached. Anything in short supply should really command at least a 25 per cent premium; compare the above figure to the national earnings average of £519 a week.

Yet the most persistent PR spin on the “skills shortage” comes from the games sector, which is notorious for low pay and long hours - so much so that in the US executives have had serious legal issues with minimum wage laws. That also helps explain why such a low percentage of games work is offshored to India.

Employers naturally want to pay less for labour, and since IT accounts for at least 10 per cent of all wages in the UK, bosses want to reduce that expenditure. Lawyers average about the same as IT bods, but do we hear of a shortage of them? No, we certainly do not.

Although I’ve had 25 years in the IT trenches, I’m now a headhunter, and I will state as a fact that for a price you can have pretty much any damned skill set you want. There are astronauts who can do C. I know half a dozen helicopter pilots and a lawyer who can do OS/2, as well as a lay preacher who is the best quant developer I know.

You want Java developers to do image synthesis for photo-quality smut? I have only two who’ve worked on Harry Potter films, but give me time and budget and I can get more.

That makes your requirements for someone who knows about online retail and a bit of MySQL appear petty. And I did say “for a price”. If you want good people, you’re going to have to pay.

IT bods: The Ferrari F430s of Blighty (Photo © Rudolf Stricker)

So how did we get here? The following isn’t just a history lesson, it’s economics, and the value to you, dear reader, is that you will see patterns that will help you make better decisions on where to place your own career bets.

Back in the early 1980s a single graph persuaded me to take up programming: it showed that the cost of writing a line of code, divided by the cost to execute it, had been rising over time. For the last 30 years that ratio has kept climbing, to the point where it is usually cheaper and easier to upgrade a machine than to pay someone to make the code run faster. Obviously, you want to be the person writing the code.

I recall articles from the 1980s in which “experts” explained that SQL was so inefficient that few machines could be made to execute it at an acceptable rate, and that hand-coded Indexed Sequential Access Method (ISAM) code would therefore remain the centre of our data world.

I wrote some Btrieve because management wanted “efficiency”, which we achieved, but at the price of my first grey hairs. The cost of computer time was high enough that large outfits had specialists who’d take popular batch files, or the equivalent of shell scripts, and rewrite them in C or Fortran.

When I was unleashed as a newbie computer science graduate onto the market in 1984, the money was very good. I was getting twice what the lawyers in my year earned; Queen Mary, University of London, has one of the country's top law departments, so that was pretty damned cool for all of us programmers.

For reasons that seemed clear to me at the time (£8,000 per year was great in 1984, when a pint cost about 70 pence) I went to Nortel to work on Xenix, Microsoft’s flavour of Unix. Although that did not end well, the market was buoyant enough for me not to really care - that's another thing that's changed. Ironically, my lawyer wife has joined the Nortel bankruptcy fight over my pitiful pension “because it’s technically interesting”.

Not a penny more, not a penny less

Pay isn’t just driven by supply and demand but by a general feeling of what a job is worth. In the 1980s we were such rare beasts, and what we did was so close to magic, that we were usually treated with some respect - and not just in monetary terms. This was a time when IT pros often got company cars and some benefits normally reserved for finance and sales executives.

By 1998 our numbers had grown to the point where there were at least 200,000 analyst programmers in the UK, with a similar number engaged in operations and hardware.

One fear we had then was that somehow IT might be nearly finished, as in complete. Pretty much everyone who could make use of a PC had one, computers were connected by Visual Basic to a database, and workers could drive spreadsheets and process words with four fonts, centre adjustment, bold, italic and three styles of bullet point. As IT pros we were paid to change things, but the world appeared to be happy with what it had, and since few of us saw the web becoming that big, we feared a future of maintenance - boring and badly paid.

But the money was still good: a VB and SQL contractor pulled in £60,000 per year, almost exactly twice that of a £29,800 permie - and this was seen as the natural order of things. (The price of a pint by now was roughly £1.90. The national average wage was £15,000.)

Then the Millennium Bug caused the IT pay average to increase and the spread to widen. Cobol and Fortran programmers had seen the writing on the wall, or rather the attractive little bitmap on the screen, and many had either retired or jumped onto the Visual Basic bandwagon. The experts had left their older, less glamorous skills off their CVs, which resulted in a perceived shortage of exactly the skills needed to deal with the imminent collapse of our civilisation at midnight on 31 December 1999. Recall that people bought millennium survival kits that included guns.

The Y2K Bug was used by many CIOs to justify upgrades and system rationalisations that were long overdue, generating operations and development work. It also produced a healthy dose of fear: if you had spent years building a corporate system and quit to become a freelancer, the work wasn't guaranteed - the bosses could hire someone with just as much Embedded SQL knowledge as you. But a replacement couldn't get up to speed with the Cobol or REXX in time, so a good number of us were paid retention bonuses.

This also marked the last good time to work in testing. When I worked in software testing we were paid at least as much as ordinary developers, because we were expected to outwit them, find devious bugs and win arguments over what was actually meant by “between 1 and 9”. (If you can’t think up 10 different readings, I am a better software tester than you. Hint: some C and Java programmers, raised on zero-based indexing, will read it as 0..8 - see the sketch below.)
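A minimal sketch, in Java, of the sort of ambiguity a tester has to pin down when a spec says a value must be “between 1 and 9”. The class and method names are hypothetical, purely for illustration:

    // Four plausible readings of "between 1 and 9" (hypothetical helper names):
    public class BetweenOneAndNine {
        static boolean inclusive(int x) { return x >= 1 && x <= 9; } // 1..9 - probably what was meant
        static boolean exclusive(int x) { return x > 1  && x < 9;  } // 2..8 - "between" taken literally
        static boolean halfOpen(int x)  { return x >= 1 && x < 9;  } // 1..8 - the usual C/Java loop idiom
        static boolean zeroBased(int x) { return x >= 0 && x <= 8; } // 0..8 - the off-by-one the hint warns about

        public static void main(String[] args) {
            // The boundary values are where the readings disagree - exactly where a tester probes.
            for (int x : new int[] {0, 1, 2, 8, 9, 10}) {
                System.out.printf("%2d  inclusive=%-5b exclusive=%-5b halfOpen=%-5b zeroBased=%b%n",
                        x, inclusive(x), exclusive(x), halfOpen(x), zeroBased(x));
            }
        }
    }

Those four alone already disagree at 0, 1 and 9; the other half dozen readings (exclusive of one end only, string input, floating point and so on) are left as the interview exercise.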

Testing is now fighting it out with games development as a badly paid job with crap career progression. This has been partly solved by some agile methodologies that make everyone a developer. Yes, we all know how well that works, but the alternative used by outsourcers, especially those who do government IT work, is to allocate failed programmers and people with serious bugs in their interpersonal skills to testing on the premise that no one likes them anyway.

Thus, if you’re a graduate who has been offered a job in testing, know that either you failed the interview or the manager is trying to con you into a dead end.
