Original URL: https://www.theregister.com/2013/04/12/it_salary_trends/

IT salaries: Why you are a clapped-out Ferrari

Skills shortages? The thing in short supply is cash

By Dominic Connor

Posted in On-Prem, 12th April 2013 10:04 GMT

As a tech careers writer I regularly receive noise about the UK IT “skills shortage”, which makes as much sense as saying there’s a shortage of Ferraris.

I know this because, according to Blighty's Office for National Statistics, the average weekly pre-tax pay in “computer programming, consultancy and related activities” in 2012 was £718.70, which is about the same as a decent Ferrari's cost of ownership.

Ferrari, like any IT professional, will charge what it can get. This will be pushed up by demand and pushed down by competition until a rough balance is reached. Anything genuinely in short supply should command at least a 25 per cent premium; compare the figure above to the national earnings average of £519 a week.

Yet the most persistent PR spin on the "skills shortage" comes from the games sector, which is notorious for low pay and long hours; so much so that in the US executives have had serious legal issues with minimum wage laws. That also helps explain why such a low percentage of the work is offshored to India: when pay is already near the floor, there is little to save.

Employers naturally want to pay less for labour and since IT makes up at least 10 per cent of all wages in the UK, bosses want to reduce that expenditure. Lawyers average about the same as IT bods, but do we hear of a shortage of them? No, we certainly do not.

Although I’ve had 25 years in the IT trenches, I’m now a headhunter, and I will state as a fact that, for a price, you can have pretty much any damned skill set you want. There exist astronauts who can do C. I know half a dozen helicopter pilots and a lawyer who can do OS/2, as well as a lay preacher who is the best quant developer I know.

You want Java to do image synthesis for photo-quality smut? I only have two of them who’ve worked on Harry Potter films, but give me time and budget and I can get more.

That makes your requirements for someone who knows about online retail and a bit of MySQL appear petty. And I did say “for a price”. If you want good people, you’re going to have to pay.

IT bods: The Ferrari F430s of Blighty (Photo © Rudolf Stricker)

So how did we get here? The following isn’t just a history lesson, it’s economics, and the value to you, dear reader, is that you will see patterns that will help you make better decisions on where to place your own career bets.

Back in the early 1980s a single graph persuaded me to take up programming: it showed that the cost of writing a line of code divided by the cost to execute it had increased over time. For the last 30 years this has continued upwards to the point where it is usually cheaper and easier to upgrade a machine than to pay someone to make the code run faster. Obviously, you want to be the person writing the code.

I recall articles from the 1980s where “experts” explained that SQL was so inefficient that few machines could be made to execute it at an acceptable rate, and that hard-coded Indexed Sequential Access Method (ISAM) code would remain the centre of our data world.

I wrote some Btrieve because management wanted “efficiency”, which we achieved but at the price of my first grey hairs. The cost of computer time was high enough that large outfits had specialists who’d take popular batch files or the equivalent of shell scripts and rewrite them in C or Fortran.

When I was unleashed as a newbie computer science graduate onto the market in 1984, the money was very good. I was getting twice what the lawyers in my year earned; Queen Mary, University of London has one of the country's top law departments, so that was pretty damned cool for all of us programmers.

For reasons that seemed clear to me at the time (£8,000 per year was great in 1984 when a pint cost about 70 pence) I went to Nortel to work on Microsoft Unix. Although that did not end well, the market was buoyant enough for me to not really care - that's another thing that's changed. Ironically my lawyer wife has joined the Nortel bankruptcy fight over my pitiful pension “because it’s technically interesting”.

Not a penny more, not a penny less

Pay isn’t just driven by supply and demand but by a general feeling of what a job is worth. In the 1980s we were such rare beasts, and what we did was so close to magic, that we were usually treated with some respect - and not just in monetary terms. This was a time when IT pros often got company cars and some benefits normally reserved for finance and sales executives.

By 1998, our numbers had grown to the point where there were at least 200,000 analyst programmers in the UK with a similar number engaged in operations and hardware.

One fear we had then was that somehow IT might be nearly finished, as in complete. Pretty much everyone who could make use of a PC had one; computers were connected by Visual Basic to a database, and workers could drive spreadsheets and process words with four fonts, centred text, bold, italic and three styles of bullet point. As IT pros we were paid to change things, but the world appeared to be happy with what it had, and since few of us saw the web being that big, we feared a future of maintenance - boring and badly paid.

But the money was still good: a VB and SQL contractor pulled in £60,000 per year, almost exactly twice that of a £29,800 permie - and this was seen as the natural order of things. (The price of a pint by now was roughly £1.90. The national average wage was £15,000.)

Then the Millennium Bug caused the IT pay average to increase and the spread to widen. Cobol and Fortran programmers had seen the writing on the wall, or rather the attractive little bitmap on the screen, and many had either retired or jumped onto the Visual Basic bandwagon. Experts forgot to mention their older, less glamorous abilities on their CVs; this resulted in a perceived shortage of those skills, which were needed to deal with the imminent collapse of our civilisation at midnight on 31 December, 1999. Recall how people bought millennium survival kits that included guns.

The Y2K Bug was used by many CIOs to justify upgrades and system rationalisations that were long overdue, generating operations and development work. It also produced a healthy dose of fear: if you had spent years building a corporate system and quit to go freelance, the work wasn’t guaranteed, since the bosses could hire someone with just as much Embedded SQL knowledge as you. What a replacement couldn’t do was get up to speed with your Cobol or REXX in time, so a good number of us were paid retention bonuses.

This also marked the last good time to work in testing. When I worked in software testing we were paid at least as much as ordinary developers because we were expected to outwit them, find devious bugs and win arguments over what was actually meant by “between 1 and 9”. (If you can’t think up 10 different readings of that, I am a better software QA than you. Hint: some C and Java programmers think it means 0..8.)
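To make the ambiguity concrete, here is a minimal sketch in Java - mine, not from any real spec or test suite, with invented names - showing a handful of the defensible readings a tester would have to pin down before arguing with a developer:

    // Hypothetical illustration: several defensible readings of the
    // requirement "the value must be between 1 and 9".
    public class BetweenOneAndNine {
        static boolean inclusive(int x) { return x >= 1 && x <= 9; }  // 1..9
        static boolean exclusive(int x) { return x > 1 && x < 9; }    // 2..8
        static boolean halfOpen(int x)  { return x >= 1 && x < 9; }   // 1..8
        static boolean zeroBased(int x) { return x >= 0 && x <= 8; }  // the 0..8 reading
        // And is the input a number at all, or the single character '1' to '9'?
        static boolean digitChar(String s) {
            return s.length() == 1 && s.charAt(0) >= '1' && s.charAt(0) <= '9';
        }

        public static void main(String[] args) {
            int[] edgeCases = {0, 1, 2, 8, 9, 10};
            for (int x : edgeCases) {
                System.out.printf("%2d  inclusive=%-5b exclusive=%-5b halfOpen=%-5b zeroBased=%b%n",
                        x, inclusive(x), exclusive(x), halfOpen(x), zeroBased(x));
            }
            System.out.println("\"5\" as a digit character: " + digitChar("5"));
        }
    }

Add floating point, negative numbers, nulls and text input and you get to 10 readings without breaking a sweat.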

Testing is now fighting it out with games development as a badly paid job with crap career progression. This has been partly solved by some agile methodologies that make everyone a developer. Yes, we all know how well that works, but the alternative used by outsourcers, especially those who do government IT work, is to allocate failed programmers and people with serious bugs in their interpersonal skills to testing on the premise that no one likes them anyway.

Thus if you’re a graduate who has been offered a job in testing, know that either you failed the interview or the manager is trying to con you into a dead end.

Why excellence doesn’t pay: You're not far off the worst coder ever, wage-wise

Measuring the productivity and quality of IT pros is never going to give entirely trustworthy results. But having read too many reports on the subject and worked alongside people who vary from excellent to the Worst Programmer in the World, I’ll state it as established fact: in a sample of (say) 15 programmers, output varies by at least a factor of ten, often by 50, no matter what metric you use, be it lines of useful code per day or frequency of bugs.

Yet figures from the Office for National Statistics imply that, within that sample, pay will vary by about 15 per cent. I’ve worked in teams (including the one with the Worst Programmer) where we were all paid exactly the same, and others where productivity was in no way related to pay.

In 2012, the top-paid job group according to the ONS was oil and gas production workers, work that is both skilled and dangerous. Although few PowerShell accidents have led to the death of the user, the pay of an IT pro has to be looked at in the context of risk; in this case, the risk of not being an IT pro any more.

That risk is on the increase and that’s not just from offshoring, but from the diversity of the technologies we now use and the pickiness of employers. Again taking the mid-1980s as my starting point, the top five languages (Cobol, Fortran, BASIC, RPG and PL/1) accounted for about 90 per cent of programming jobs, with C coming up fast.

Today the top five distinct languages account for about 50 per cent and the tails go a lot further out. Note I say distinct languages. Try for an Oracle SQL job only knowing MS SQL Server and see how far you get. Java and C# are so similar that you can cut and paste blocks between them, but you’d struggle to move from knocking out code for one to a job writing the other on a whim. Do you really believe C# or Java will be career-enhancing skills until you retire? Really? If you do then I have a lot of money I’d like you to help me move from Nigeria.

If anything, it is worse in operations. The number of fully distinct operating systems has fallen and stabilised nicely. But your career is hostage to your employer’s choice of backup and system admin software, virtual machine hypervisor and security setup, since those are the skills you are accumulating in the hope of landing a new job, should you need or want one.

IDC backup market share (IDC's Backup Tracker to Q4 2012): Who's not using EMC? Own up

Take, for example, those of you who’ve sweated over Symantec’s backup toolset and bet your career on the vendor. The graph above should worry you for the longer term. Is your boss worried that, if you leave, he or she won't be able to find someone with the expertise to replace you - or is there nowhere for you to go anyway?

Women are the future?

According to government statistics, the pay gap in IT between the sexes is far smaller than in most other lines of work - which doesn’t prove IT is not sexist, simply that the gap is smaller than average. In IT, the median pay of women is 15 per cent less than the £739 a week a typical bloke gets. Compare that to 25 per cent less in the rest of the economy and a 23 per cent gap in the media, which makes all the noise about pay imbalance in other sectors.

Since IT people earn more than the average Brit, a woman in tech can easily be 30 to 40 per cent better off than her sisters in other jobs. I’ve never heard this fact being used to get girls to choose less dippy A levels and degrees.

Although sci-tech quango Nesta is squandering some of its vast budget on a tokenist rule that there must be at least one white middle-class woman with a resemblance to Harriet Harman on all panels at its events, there are some very sharp women in tech - such as Dame Steve Shirley, who in the bar after a Real Time Club Dinner 20 years ago predicted to me that IT as a purely distinct occupation would largely disappear as computing became part of all work.

Today that seems very wise. The government claims there are a million IT pros in the UK alone. But a huge percentage of office workers are Microsoft Office "developers", knocking up spreadsheets and badly formed databases. When I wandered into the City in 1986, the single most common qualification for traders was a successful spell at Sandhurst - nowadays the best programmers in many banks are in trading rather than the IT department.

That means, of course, that since we aren’t so different from mundane workers, our pay is nearer to the average than it once was. So in 30 years we’ve gone from 200 per cent more than average pay in 1984, to 110 per cent more in 1998, to about 35 per cent more in 2012. Anyone want to extrapolate that curve for me? ®