The IT kit revolution's OVER, say beancounters - but how do they know?

Come on, readers - fill us in


Worstall on Wednesday One of the great problems within economics is trying to work out what's a structural change, what's a cyclical change, and what's being buggered up just because you're not measuring it properly.

For example, we can look at how much companies (or more accurately, non-household entities) are spending on the computing infrastructure they use – investments in software and hardware. We'll stick with the US numbers, where it went from two per cent of GDP (ie, one-fiftieth of the entire economy that year) to nearly five per cent in 2000. It has now fallen back to around 3.5 per cent.

This has some people, like the economist David Autor, worrying that the white hot technology revolution that is computing is running out of steam.

The concern here is that we've done all the heavy investing in information processing that we're ever going to do. We've had a structural change: modern companies must be IT-based. Now that we've managed that, from here on in IT spending will become more of a maintenance item than a breacher of new frontiers. But this development is bad for the future. Why? We're pinning a lot of our hopes of being ever richer in the future on the idea that computing is going to create a goodly part of that wealth.

Less money to go around

We can also read the rise and fall of the IT budget in a second way, as a cyclical story. There was a vast amount of money wasted in dealing with Y2K, then we splurged way too much on dotcoms and it's hardly surprising that we're investing less in the wake of one of the largest recessions of modern times. In this model, it will all swing back upwards in the near future. Here we're less worried about that GDP growth of the future, instead blaming the current dearth on the vagaries of recent GDP movements.

The first scenario doesn't really sound all that likely: what can be done with computers (widely defined) is still roaring ahead and we can do things today that no one was even dreaming of 15 years ago. We really don't think that IT is standing still.

But the second model also seems a little unlikely because investment rates for companies, at least in infrastructure like IT, aren't really thought to be determined by GDP or the general economic situation but rather by interest rates and cash availability within a company. Companies are, in general, flush with cash, and money is, compared to recent times, damn near free for corporations to borrow.

It should be said that you can definitely get to either of those conclusions, as you wish, with a judicious reading of the statistics we have available. But neither of them quite tastes right – which is another way of saying they don't really meet our instinctive understanding of the underlying theory.

You're all doing it wrong

At which point the Harvard Business Review gives us another possibility. We're simply measuring prices wrongly.

We're all aware that the prices of different things change differently: IT equipment prices for a given performance level fall through the floor over time, while the cost of a haircut (a fixed performance level) increases over time. What's perhaps less well known is that government statisticians desperately try to compensate for the improvement in tech capacity at the same price point. The year 2000's $1,000 laptop is obviously a very different beast to the year 2014's $1,000 laptop.

These adjustments are known as "hedonic" in the jargon, but there are quite a few people around who think the statisticians haven't got them quite right. When we correct for that quality-at-a-price-point issue, we can (as the HBR has) state that IT investment is rising over time; that fall is purely a measurement fault.

That's an argument that I have a certain sympathy for as I'm far from certain that those hedonic adjustments are being done right. I do think they are falling short of the performance increases.
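To make the quality-at-a-price-point issue concrete, here's a toy calculation – simplified far beyond anything the statisticians actually do, with invented numbers:

```python
# Toy illustration (NOT the official statistical method): a hedonic-style
# quality adjustment. A $1,000 laptop in year 0 and a $1,000 laptop in
# year 1 look like zero price change -- until you adjust for the newer
# machine doing, say, twice as much work for the same money.

def quality_adjusted_price(sticker_price, performance_ratio):
    """Price per unit of performance, relative to the base year."""
    return sticker_price / performance_ratio

base = quality_adjusted_price(1000, 1.0)   # year 0: 1000.0 per unit of work
later = quality_adjusted_price(1000, 2.0)  # year 1: 500.0 per unit of work

# The "real" price of computing halved even though the sticker price was
# flat -- so the same nominal spend buys twice the real investment.
print(later / base)  # 0.5
```

If the official adjustment understates that performance ratio, measured real IT investment falls even while actual computing capacity bought keeps rising – which is exactly the HBR's complaint.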

Get involved - and remember your economics

However, I've a more than sneaking suspicion that we've actually got a different measurement problem here. And it's one that you guys can inform the economics types upon, if you should so wish. Remember your Hayek: all knowledge is local, so this is your chance to get what you know back to the centre.

When we measure things in the economy, we've obviously got a number of classes that we stuff spending into. Wages are different from profits, for example, and we've had all sorts of people insisting that the profit share of the UK economy has increased in recent decades.

It has, but only from around 1975 to 1980, the reason being that the 1975 profit share simply wasn't sustainable – not even enough to pay for machinery wearing out, or depreciation. It has been around and about the long term level ever since.

Yet it's also true that the wage share has been falling. Some look at the falling wage share and conclude that the profit share must be rising, but there are four parts to national income when we measure it this way: wages, profits, self-employed income, and taxes on consumption less subsidies. The number of self-employed workers has risen, as has their aggregate income, and since the early 1970s we've imposed – and then raised a number of times – VAT. You know, another one of those taxes on consumption.

So, yes, the wage share has fallen but that doesn't mean that the inverse (the profit share) has risen as it's not actually the inverse. It's self-employed income plus the percentage of the economy that goes in VAT that has risen. (Nice charts of this can be seen here).
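The arithmetic is easy to see with invented numbers (these are illustrative only, not real ONS figures):

```python
# Illustrative, invented numbers: national income splits four ways, so a
# falling wage share does NOT imply a rising profit share.

def shares(wages, profits, self_employed, taxes_less_subsidies):
    """Each component as a fraction of national income."""
    total = wages + profits + self_employed + taxes_less_subsidies
    return {
        "wages": wages / total,
        "profits": profits / total,
        "self_employed": self_employed / total,
        "taxes_less_subsidies": taxes_less_subsidies / total,
    }

early = shares(wages=65, profits=25, self_employed=6, taxes_less_subsidies=4)
later = shares(wages=55, profits=25, self_employed=10, taxes_less_subsidies=10)

# The wage share fell by ten points while the profit share stayed flat:
# the whole move went to self-employment income and VAT.
```

Same total income, flat profit share, falling wage share: the gap is absorbed entirely by the other two components.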

Measuring poverty's easy, right? Um, not quite

Take another example of being careful about what you measure. The US poverty rate was around 15 per cent or so in the 1970s and it's still around 15 per cent or so today. Thus all those trillions that have been spent upon poverty alleviation haven't alleviated any poverty, have they? But this is to miss something about the way that poverty is measured in the US. Their definition of poverty (and I'll pre-emptively caveat this by saying yes, it's more than this but this is enough for my point) measures poverty by market income plus the cash that government gives to poor people.

In the early 1970s that's what the US government did for poor people: gave them cash (Aid to Families with Dependent Children, the predecessor of today's TANF, or sometimes just called "welfare" over there). So that early 1970s measure is really the number of people still below the poverty line after the government had helped them.

However, these days the US welfare system doesn't actually hand out much cash: rather, it gives people stuff and rebates through the tax system. These are programs like SNAP (aka “food stamps”), the EITC (working tax credits), Section 8 vouchers (housing benefit) and Medicaid (an NHS for the poor – and yes, it's about as good as you would expect an NHS which only the poor need access to). Hundreds of billions of dollars a year are spent on these things and US beancounters include absolutely none of it when working out what the poverty rate is. The current measure of US poverty is, therefore, largely a calculation of how many are poor before the government helps them.
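A toy version of that counting rule, with an invented poverty line and invented household figures, shows how in-kind help simply vanishes from the statistic:

```python
# Toy example (invented threshold and household figures): the official US
# measure counts market income plus cash transfers, so in-kind benefits
# like SNAP or Section 8 never move anyone over the line.

POVERTY_LINE = 15_000  # invented round-number threshold for illustration

def officially_poor(market_income, cash_transfers, in_kind_benefits):
    # in_kind_benefits is deliberately ignored, mirroring the official rule
    return (market_income + cash_transfers) < POVERTY_LINE

# A household with $10k of earnings, $1k of cash aid and $8k worth of food
# stamps, housing vouchers and Medicaid is still counted as poor...
assert officially_poor(10_000, 1_000, 8_000) is True
# ...even though its total resources of $19k exceed the line.
```

Shift the same dollars from the cash column to the in-kind column and the measured poverty rate rises, with nobody a penny worse off.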

Do also note that this means we cannot compare poverty rates across countries. Imagine how high the UK poverty rate would be if we didn't include tax credits, housing benefit and the like when we counted how many were still poor. To be clear on this, we do count them, as does every country except the US.

The point here is that exactly what you measure is important. And that brings us back to investment in IT. I have a feeling, and I hope you'll correct me or concur as appropriate, that no one is actually spending less on IT, quite possibly the opposite. Instead you're all doing that spending in a different manner.

Surely it can't be done like this … can it?

As Sharon T. Pokeworthy recently noted, "investment" is what the beancounters regard as "capital investment", which then gets depreciated over time. Once upon a time, that was how you "invested" in IT. The £30,000 Ada compiler, the new computers in the basement, that eyewatering SAP installation. Big money upfront that would then be written off over the years: that's the number that everyone is worrying about at the top. Private fixed investment in information processing equipment and software as a percentage of GDP.

I'm just not convinced that that's the way it's done now. I'm sure (and again, please do concur or correct in the comments as appropriate) that a great deal of what is still spent on IT – what is, in a colloquial sense, investment in IT – actually sits in the normal operating budgets of IT departments instead of in their capital budgets. You're all wrangling with much cheaper kit, yes, but you're adding more man hours to it. It all leads to a similar total spend; it's just that those man hours are not recorded as a capital expense.

It's so long since I was involved in this stuff that to try to give examples might be embarrassing. But I can imagine that in cranking up the AWS account, or altering MySQL to get it to do exactly what you want it to do, you spend the same amount of money as you might have done on Ingres and your own servers in the past. You get much better results today, of course, but for my argument the point is this: your time and that AWS bill come out of current operating budgets, whereas Ingres and the basement server would have come out of capital budgets.
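The reclassification argument can be put in toy numbers – all invented, purely to show how the statistic moves while total spend doesn't:

```python
# Invented figures: the same total IT spend, classified two ways. Only the
# capital column feeds the "private fixed investment" statistic that the
# worriers at the top of the article are looking at.

def measured_investment(capital_spend, operating_spend):
    # National accounts count capex as investment; opex disappears into
    # ordinary business costs and never shows up in the investment series.
    return capital_spend

old_style = {"capital": 900, "operating": 100}  # servers and SAP in the basement
new_style = {"capital": 200, "operating": 800}  # AWS bills and staff hours

total_old = sum(old_style.values())  # 1000
total_new = sum(new_style.values())  # 1000

# Identical total spend, but measured "investment" falls from 900 to 200.
fall = measured_investment(old_style["capital"], old_style["operating"]) \
     - measured_investment(new_style["capital"], new_style["operating"])
print(fall)  # 700
```

Nothing real has changed except the accounting column the money lands in – yet the headline investment series collapses.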

And if that's true then we can go on to say two further things. The first is that we've no bloody clue at all whether "investment" in computing is actually falling or not. All we know is that what used to be called "investment" is now over in the operating costs column – and we don't know the aggregate figure at all. The second is that knowing exactly what you're measuring, and how you're measuring it, is hugely important if you're going to start drawing conclusions from any of those metrics.

A Worstall Warning

Is it a structural change? Cyclical? Is it just moving our numbers from one definition to another without really changing the underlying spend?

I dunno. I hope you will tell me in this instance. But whatever you do, don't forget that this problem bedevils all economic statistics and so always, please, take them with the necessary shovelful of salt. ®

