Original URL: https://www.theregister.co.uk/2013/06/21/hft_financial_trading_rise_of_the_machines/

Rise of the Machines: How computers took over the stock market

Inside the super-secret world of algorithmic trading

By Jack Clark in San Francisco

Posted in Servers, 21st June 2013 10:19 GMT

Feature Trading used to be limited by how fast one human could shout at another and agree upon a price. Now it's limited by the speed of an electron through copper wire. This has caused, to put it mildly, some changes.

In April 2013 bombs went off at the White House and Barack Obama was injured, the Associated Press reported. The news sent markets lurching downward, with the Global Dow index losing 150 points (wiping out $136bn in market value) before the AP announced that its Twitter account had been hacked – no bombs had gone off, Obama was fine. None of that stopped robotic trading algorithms from selling off their holdings within seconds of the bogus news.

This was not the first time automated electronic trading platforms – "high-frequency traders" – had led downward lurches in the market. In 2010 the Dow Jones Industrial Average plummeted by over 1,000 points, with the stocks of some major companies falling by over 90 per cent, a market disturbance so potent that it came to be known as "the flash crash". Various shares' values were wiped out. Billions were lost. And it all happened in less than an hour.

These are not isolated incidents: Nanex, a firm that monitors high-frequency trading within the stock markets, regularly tracks specific shares whose prices are being influenced by trading algorithms battling with each other. Days after the AP tweet fiasco, shares of anti-virus specialist Symantec fell 10 per cent in a matter of minutes after a trader put in a very large sell order with no sell price limit – a market signal that caused watching robo-traders to initiate their own sell orders, and drive the price down.

So just how much of the market is made up of these computer trading engines? Around 80 per cent of US stocks are traded this way, according to a 2012 analysis by Morgan Stanley beancounters, versus at least a third of UK stocks, according to a substantial 2012 report by the UK Department for Business, Innovation and Skills' Foresight committee. This compares with an overall figure of 40 per cent for European markets, according to a 2010 study by the Bank of England.

"These fractions have risen from single figures as recently as a few years ago. And they look set to continue to rise," the bank wrote (PDF).

The beginnings of HFT

High-frequency trading (HFT) has been around since the late '90s, when changes to trading regulations meant electronic communications networks (ECNs) could let traders reach into markets and start making bids. The first US ECN was INSTINET, which was closely followed by the ISLAND ECN in the late '90s. Others proliferated.

"All these things started to talk to one another. Networks got faster, communications got better, data centres got better, and it went from there," says Scott Ignall, chief technology officer of financial trading firm Lightspeed.

Alongside the proliferation of ECNs, regulatory changes in the US altered how trading worked. What kick-started HFT in earnest was the SEC's introduction of Regulation ATS (Alternative Trading System) in 1998, which allowed parties other than stock exchanges to conduct electronic trading, and Regulation NMS (National Market System) in 2005, which made it possible for multiple exchanges to distribute quotes for stocks. ATS made HFT possible, and NMS led to a technological arms race that reduced the number of firms in the market by dramatically increasing the investment needed to compete.

In response to the initial ATS regulation, ISLAND began offering a service in the late '90s whereby traders could stick their trading gear right next to the matching engine used by the NASDAQ market. "What that meant was they can read the data and place the trade in a sub-millisecond fashion," Ignall explains. "The ISLAND tech was fast enough to handle all the throughput, [HFT] snowballed from there."

These days all major exchanges offer such services, and have seen great demand for it: the London Stock Exchange recently expanded the slots available for robo-traders keen to get a direct feed to its market data, via its Exchange Hosting service – "the ultimate in low latency connectivity".

'Off to the races'

Other exchanges began offering similar services as well. The time it took to execute trades fell as firms sought to gain an advantage over each other. "We were off to the races – data flying everywhere, trade flying everywhere," Ignall says.

Things kept on growing until about 2006 or 2007, when NMS came in. "If you want to point into any specific point in the regulatory landscape that caused the huge explosion in HFT trading, it was Reg NMS," says HFT critic Joe Saluzzi, co-founder of the firm Themis Trading.

Reg NMS's introduction would make the business of making money harder for HFT firms by increasing the technical requirements of trading engines. But it also gave them an advantage: they could start trying to exploit minute differences between prices at different exchanges in the market.

"Some exchanges will process [data] faster than others," Eric Hunsader, the chief of market analysis firm Nanex, says. "When they do that, orders may appear on a network somewhere before it has even been processed somewhere in other exchanges. When that occurs HFT firms can adjust orders in other exchanges before trading takes place."
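Hunsader's stale-quote scenario can be sketched in a few lines. This is a caricature for illustration only – the quote format, prices, and timestamps are all invented: the fast trader compares publication timestamps across venues and trades against whichever quote hasn't caught up yet.

```python
# Toy latency-arbitrage check – a caricature, not a real trading system.
# A quote is a dict with a price and the microsecond timestamp at which
# the venue published it (format invented for this sketch).

def stale_quote_edge(fast_quote, slow_quote):
    """Per-share edge available from trading against a stale quote.

    If the slow venue's quote predates the fast venue's, the difference
    between the two prices is (in this caricature) free money; otherwise
    there is nothing to pick off.
    """
    if slow_quote['ts_us'] < fast_quote['ts_us']:
        return round(abs(fast_quote['price'] - slow_quote['price']), 4)
    return 0.0

fast = {'price': 10.02, 'ts_us': 1_000_050}  # already reflects new info
slow = {'price': 10.00, 'ts_us': 1_000_000}  # 50 microseconds behind
print(stale_quote_edge(fast, slow))  # 0.02 – two cents per share
```

Two cents a share sounds like nothing, which is precisely why, as described later in this piece, the trick only pays when repeated at enormous scale.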

Since NMS's introduction, the industry has been dogged by problems: the flash crash, regulation, falls in the stock market, frequent blips in share and market prices due to HFTs capitalising on information faster than any trading engine before them, and so on.

"We still have a lot of trading compared to what we had ten years ago, but it's about half, at least in the US equity markets, compared to what it was 3/4/5 years ago," Ignall says.

Though growth in trade volumes has tapered off, the number of quotes – requests by HFTs to exchanges to get the price of a stock – has rocketed. This is because, as ever more capable trading systems have come to dominate the market, they have been fighting tooth and nail to disrupt the flow of stock information to rival algorithms, Hunsader says.

This means that the trading engines can exploit slight pricing discrepancies for a lucrative payoff. "It's gauging the reaction of the marketplace, it's the ability to gain information," Hunsader says.

Quoting more than ordering can sometimes give trading engines an advantage, and also provides them with the information they need to accurately make trades.

"We find that whenever there is higher message traffic, smart order routers will get behind," Hunsader says. "They'll route to places they won't get otherwise. It gives them [HFT traders] an advantage for discovering a smart order that's being split up on exchanges."
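The quote-to-trade imbalance Hunsader describes is simple to measure in principle. A toy sketch – the message format below is made up, and real market-data feeds are far richer:

```python
from collections import Counter

def quote_to_trade_ratio(messages):
    """Quotes published per trade actually executed.

    messages: iterable of (kind, symbol) tuples where kind is 'quote'
    or 'trade' – a stand-in for a real market-data feed.
    """
    counts = Counter(kind for kind, _ in messages)
    return counts['quote'] / (counts['trade'] or 1)  # avoid divide-by-zero

# 95 quote updates for every 5 executed trades:
feed = [('quote', 'SYMC')] * 95 + [('trade', 'SYMC')] * 5
print(quote_to_trade_ratio(feed))  # 19.0 – 19 quotes per trade
```

The higher that ratio climbs, the more message traffic everyone else's systems have to wade through – which is the congestion Hunsader says smart order routers get caught behind.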

But to trade in this tangled web of markets requires a punishing level of technical sophistication and infrastructure that weeds out other firms. Just as in other businesses where technology is a profit centre (cloud computing infrastructure, semiconductor manufacturing), HFT companies are caught in a dilemma where they need to invest ever more to keep up with their competitors, and the cost for falling behind technically only magnifies over time.

Technology ingredients

"Other Wall Street businesses are about low-frequency analysis or customer service or having a particular network of people that you know," says Charles Jones, a professor at Columbia Business School. "HFT is really about the IT."

In the late 1990s, firms would use the best hardware money could buy, but eventually they started to outrun the capabilities of these platforms. Today, HFT firms may pay less for base hardware, but will spend a ton on the expertise required to write their software (and the pricey colocation fees with the exchanges).

"In '93 or '92 we were one of the first firms in Europe having an electronic trading machine on the Deutsche Börse – back then we spent half a million euros for a computer. For half a million euros in hardware you can buy a mighty big machine these days. Even FPGAs – they cost you 10, 15, 20,000 euros max for the card – the rest is intellectual property. The hardware buy itself has come down hugely in price," says Peter van Kleef, manager at Lakeview Capital Management Service.

As things got faster, traders moved from Windows to Linux, then started applying kernel hacks within Linux to cut the footprint of market-data processing. That worked for a while – and then even that wasn't fast enough.

"Soon some traders said 'Well heck, why am I even putting this on the OS, why not just do it all on the card?'," Lightspeed's CTO Ignall says. This sentiment brought about the use of application-specific integrated circuits (ASICs) and then field programmable gate arrays (FPGAs).

"Every time you do this you become more specialised [and] your product becomes more inflexible," Ignall says.

But specialisation can help traders shave microseconds off trade times and maximise their advantage. This is hugely important when your trading engine is competing with others in a stock exchange. The LSE, for example, boasts about the 125-microsecond latency on round-trip trades through its platform. Any time spent not hammering the exchange with quotes or trades in 125-microsecond increments represents potential lost revenue.

"To use very loose terms, being able to process [market data] quickly requires that that data is stored in the appropriate parts of the machine that can be quickly accessed," high-frequency trading firm Tradeworx's chief strategy officer Mani Mahjouri says. "The problem is your standard production-level server isn't really designed for this type of application.

"The design choices made in chips today and architecture of how these systems are laid out aren't optimised for this type of behaviour," he adds. "Those limitations can be overcome using customised products and customised products make sense in that scope."

This is where FPGAs shine, as they let you create a custom bit of logic to run your trading algo. "The primary benefit of FPGAs and things like that is throughput into the system - being able to read and digest the market data that comes with these exchanges," Mahjouri says.

This lets you remove much more latency, but is also expensive, as it requires you to hire people with major hardware expertise along with people versed in the esoteric world of programming computer logic.

"The best algorithm is dead simple, otherwise you can't really run it on an FPGA," says Lakeview Capital's Kleef.

We also understand that some firms use other non-standard chips along with FPGAs, like RISC-based chips from Tilera. Like FPGAs, Tilera's chips are being stuck into network cards and used to process and act on data.

"We can get about 1.7 microseconds of latency from wire to wire," Tilera's director of marketing, Bob Doud, says. "We believe they're using our card for more than just the NIC. They're probably using some intelligent methods of delivering PCI bus data to the host – you can do things like direct data placement, bypassing the kernel, writing directly to the cores."

As with cloud computing, the trading engines and algorithms used by HFT firms are considered key competitive technology, so getting these companies to open up is difficult. However, just like cloud computing, they may be more similar than dissimilar, with vast quantities of money lying in the minor details from algo to algo.

"I think most of the algorithms are very similar - probably a couple of dozen strategies that are well established over the years," Lakeview's Kleef says.

HFT job adverts typically ask for C++ and/or Java skills, along with a familiarity with Unix systems. Adverts we've seen list skills as varied as Java (J2SE), multithreading, garbage collection, GC tuning, concurrency, and low-latency, along with the use of technologies like Spring, Hibernate, and Maven.

There is also great demand for "quants" (quantitative analysts) – the people who write the algorithms that create the data-mining models used in these fantastically fine-tuned trading engines.

"We personally target PhD-level, post-doc-level candidates in the hard sciences, like physics," Tradeworx's Mahjouri says.

What all this technology allows firms to do is create trading platforms that execute buy and sell orders in as little time as possible.

Blink and you'll miss it

"If you think about time in the securities industry, time is risk," explains Frank Hatheway, chief economist for NASDAQ OMX. "The goal of HFT is to get between the major institutional buyers and sellers to 'capture the penny'," he says.

"The point of this is there's very small amounts of money in an arbitrage project and you just do it again and again and again," Hatheway says. "Once you have software that's sophisticated enough to do this, the ability to scale it is so much greater in an electronic environment than it was with a clever human who could do one or two or three stocks, now you can deploy a clever algorithm across 100 stocks, 1,000 stocks, 2,000 stocks, and potentially deploy it globally."
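Hatheway's arithmetic is worth spelling out. Working in whole cents to keep the sums exact – all of the figures below are invented for illustration, not Tradeworx's or anyone else's:

```python
def daily_pnl_cents(edge_cents, shares_per_trade, trades_per_stock, n_stocks):
    """Hypothetical take from 'capturing the penny' again and again."""
    return edge_cents * shares_per_trade * trades_per_stock * n_stocks

# One cent per share, 100-share clips, 500 round trips per stock per day,
# deployed across 2,000 stocks:
pnl = daily_pnl_cents(1, 100, 500, 2000)
print(pnl / 100)  # 1000000.0 dollars a day, before costs
```

The per-trade edge is microscopic; the scaling across thousands of stocks – which a human shouting at another human could never achieve – is where the money is.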

In many ways HFT is not unlike the Wild West – if you've got the fastest reaction time, then you'll easily beat others. The firms that can react the most quickly to trades set the pace for the market and therefore can take the largest profits and minimise their losses.

"If you know you're the fastest one then you can have considerably less code," Nanex's Hunsader says. "You can get away with a lot if you're the fastest one there."

For example, if you're the first to get trade information, and your trading engine can analyse and execute quickly, then when the market turns against you it's possible to unload your shares at the highest possible price. There's a chance the slower robo-traders behind you will keep buying from you at an inflated price for a few seconds, until their computers recognise that the price has moved.

"A lot of algos will follow other algos," Hunsader says. "As long as you've got somebody following you you can dump off your position."

By dominating trading in single markets, high-frequency trading firms can minimise their risk and maximise their profits.

The huge technology demands, combined with the large amounts of trading capital needed to bankroll the vast pace of trading, mean that barriers to entry for upstart firms are very, very high.

"It's really an arms race in the sense you're going to want the best, the fastest," Ignall says. "No way around the fact you'll need to outspend the next guy. If you're trying to be the fastest you have to get the best hardware."

Lightspeed's point has been repeated by all of the sources we've spoken to.

"This whole thing is one giant arms race," Doud says. "It's the guy who has the best equipment."

After talking with many sources, we've heard there are probably around eight HFT firms operating in America that take the lion's share of the market. "Right now we're trading US equities – trading about one per cent of the equity market volume daily," Tradeworx says. The company sells its tech to other firms as well, meaning its trading technology alone handles around 5 per cent of US equities. "Two hundred to three hundred million shares a day," Mahjouri estimates.

HFT criticisms

But for all its advanced technology, qualified staff, and direct link between developing clever tech and making tons and tons of money, high-frequency trading is prone to mistakes that wipe billions off the market.

"Even as the machine trades, what the machine does is still created by a human. And as we humans haven't been able to program a full replacement of ourselves, any machine implementation must be a simplification of a human – and if you want to make it fast, it has to be hugely simplified," says Lakeview Capital's Kleef.

And with this simplification comes the weird flocking behaviour of HFTs: once the algorithms notice that someone has started to unwind their position, they will all do so in turn, as happened with the flash crash.

This leads to a downward spiral in which, very quickly, liquidity in the market can dry up at key times as the algorithms divest themselves of their positions and try to wait out the rough patches. A recent report found that in at least one case the top eight HFTs removed liquidity from the market 59 per cent of the time – so they actually prevent the market from recovering as quickly as it would if institutional investors kept an oar in during price storms.
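That flocking dynamic can be caricatured in a few lines. The panic thresholds and the one-point gap-down per withdrawn bid are invented for illustration: each algorithm pulls its bid once the price drops past its bail-out level, and every withdrawn bid pushes the price down far enough to spook the next one.

```python
def cascade(price, panic_levels):
    """Toy liquidity cascade: each 'algo' pulls its bid when the price
    falls below its panic threshold, and each withdrawn bid knocks the
    price one point lower (gap size invented for illustration).

    Returns (final_price, bids_still_in_the_market).
    """
    active = sorted(panic_levels, reverse=True)  # most nervous algos first
    while active and price < active[0]:
        active.pop(0)   # this algo withdraws its bid...
        price -= 1      # ...and the thinner order book gaps down
    return price, len(active)

# A small shock from 100 down to 99 is enough to trip most thresholds:
print(cascade(99, panic_levels=[99.5, 98.5, 97.8, 97.2, 90]))  # (95, 1)
```

A one-point shock wipes four of the five bids out of the book – the algorithms don't coordinate, yet they behave as a flock because their thresholds sit close together.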

"The market is not physics, it has no laws of physics. It's more human interaction, more game theory than physics," says Kleef.

Ultimately the problems of a market populated by HFTs are a reflection of problems that have dogged people since time immemorial – namely, getting along in a sensible, reasonable way when every individual is encouraged to be as selfish as possible to get ahead.

We would note that game theory is about maximising the benefit of the individual operator in an aggressive kill-'em-all market – HFTs are not built to help each other, but to drive each other into the ground. ®