Hey, Michael Lewis: Stop DEMONISING Wall Street’s SUPERHUMAN high-speed trading
HFT's NOT the free-market crusher new book says it is
Yesterday's energetic debate on CNBC between BATS Global Markets president William O'Brien, Flash Boys author Michael Lewis and IEX's Brad Katsuyama put the cat among the pigeons over high-frequency trading.
It was all provoked by Moneyball writer Lewis' new book, Flash Boys, which, among other things, makes the claim that the American stock markets are "rigged" as a result of this practice.
This could be true, of course: every participant in every market is always trying to rig it to their own advantage. But the important question is not whether people are trying to do so, it's whether they are succeeding – and even if they are, could the solution be worse than the problem?
High-frequency trading (HFT) is basically just what it says on the tin: buying and selling stuff really quickly – even just threatening to do so is used as a technique. At its heart it's a very old idea: time arbitrage, which has been around for ages.
There was a time, many moons ago, when profit could be made by observing the prices on the London and NY stock exchanges for shares that were listed on both. If the prices diverged (or the exchange rate did) then small margins could be made by buying on one market and selling on the other at the same time.
And it is exactly that buying and selling that moves prices back into equilibrium.
The people who could profit from the London/NY trading differential were those who could get information a little faster than everyone else, via a better comms system: perhaps privileged access to the telegraph, or later, dedicated phone lines where they didn't have to pay the usual extortionate per-minute fees for a transatlantic call.
All that HFT does is take this same basic idea and upgrade the technology involved. Stick the process on computers and let them get on with it. Write algorithms that look at past correlations between movements in prices and certain pieces of news, then trade on that linkage when similar movements happen again.
Say, for example, that the price of sugar changes: we've seen what happens to the price of Coca Cola stock when this happens. So, get the algo to look at sugar prices and when they twitch, buy or sell Coke appropriately.
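The sugar-to-Coke rule above can be sketched in a few lines. This is a toy illustration, not a real trading system: the threshold, the direction of the assumed correlation and the ticker name are all illustrative assumptions.

```python
# Toy sketch of the "watch sugar, trade Coke" rule described above.
# Threshold and correlation direction are assumptions for illustration.

def coke_signal(sugar_move_pct: float, threshold: float = 0.5) -> str:
    """Map a percentage move in sugar to a trade signal for Coke stock.

    Sugar is a cost input for Coca-Cola, so the assumed historical
    correlation is negative: sugar up -> margins squeezed -> sell.
    """
    if sugar_move_pct > threshold:
        return "SELL KO"   # input costs rising
    if sugar_move_pct < -threshold:
        return "BUY KO"    # input costs falling
    return "HOLD"          # move too small to trade on

print(coke_signal(1.2))   # sugar up 1.2% -> SELL KO
print(coke_signal(-0.8))  # sugar down 0.8% -> BUY KO
```

A real algo would of course size the position, check liquidity and decay the signal over time; the point is only that the core logic is a learned correlation plus a trigger.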
Yes, of course, it gets more complicated than this. The algos are working rather more quickly than human traders are. The market is working fast enough that simple latency in communication, light speed itself, becomes a competitive edge here. Traders pay exchanges vast sums to have their servers co-located with the exchanges' own in order to be those milliseconds (and for some, nanoseconds) ahead of the others.
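Why co-location buys those milliseconds is straightforward physics. A back-of-envelope calculation (the distances here are illustrative assumptions; the speed of light in fibre is not):

```python
# Why co-location matters: signals in optical fibre travel at roughly
# c divided by the fibre's refractive index (~1.47).

C_VACUUM_KM_S = 299_792                  # speed of light in vacuum, km/s
C_FIBRE_KM_S = C_VACUUM_KM_S / 1.47     # ~204,000 km/s in fibre

def one_way_us(distance_km: float) -> float:
    """One-way fibre latency in microseconds, ignoring switching delays."""
    return distance_km / C_FIBRE_KM_S * 1e6

print(f"Chicago to New York (~1,150 km): {one_way_us(1150):,.0f} us")
print(f"Across town (~10 km):            {one_way_us(10):,.0f} us")
print(f"Co-located (~0.1 km):            {one_way_us(0.1):,.1f} us")
```

At roughly 5 microseconds of delay per kilometre of fibre, a server across town is thousands of microseconds behind one sitting in the exchange's own data centre – an eternity at HFT timescales.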
Prices? Do flock off
All of this has some interesting effects. The first of which is just what we would expect from large amounts of time arbitrage going on: prices start to “flock like birds”, in the words of one physicist.
One of the advantages of computers and algos is that they can calculate across those pricing correlations rather faster than we meatsacks can. So not only can we run with the idea that a change in the price of sugar is going to change the price of Coke, but also that of Pepsi, and of the firm that makes Polos, and so on – and the relative price changes of the Polo maker and Pepsi, and on through the entire marketplace. This isn't quite there yet of course, but there's a very fierce evolutionary race underway. The effective lifespan of an algo is now estimated at a few weeks, perhaps a couple of months, before it is out-evolved.
Being able to cross-calculate these price changes also takes us a little closer to being able to plan the economy. This is the “Socialist Calculation Problem” writ small in fact: if you're going to try to plan an economy then you've got to have a method of being able to calculate the interactions in that economy.
You need to be able to make connections between a change in demand, supply or price of one item and the ensuing ripples through the economy. Hayek pointed out that only the market itself can do that; everything's just too complex to handle any other way. But building these algos that are doing just that is the first baby step along the way to being able to calculate more directly.
There are still a couple of centuries of Moore's Law necessary before we can actually do it in full.
We've perhaps one billion items for sale in London at any one time. There are 63 million-odd people in the country. We don't actually know the utility function of each person, so what are we trying to optimise? And we've got to add in geography: a balti in the East End isn't the same good as one in Glasgow. We're not going to have the computing power to solve those equations for some time yet, but this is the start of the process.
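The scale of the problem is easy to put numbers on, using the article's own back-of-envelope figures:

```python
# Rough scale of the calculation problem, using the article's figures.
# These are back-of-envelope numbers, not measurements.

goods = 1_000_000_000        # ~1 billion distinct items for sale
people = 63_000_000          # UK population, roughly

# Just the pairwise good-good price interactions:
pairwise = goods * (goods - 1) // 2
print(f"pairwise good-good interactions: {pairwise:.2e}")   # ~5e17

# Plus a preference weight for every (person, good) pair:
preferences = people * goods
print(f"person-good preference entries:  {preferences:.2e}") # ~6.3e16
```

Half a quintillion interaction terms before geography or individual preferences even enter the picture – hence the couple of centuries of Moore's Law.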