
Chasing the holy grail: the algorithmic arms race

The finance industry's secret weapon


Column Investment banks and major brokerage firms may or may not lead innovation in algorithms for securities trading, but they certainly take the lead in publicising their innovations in this field.

With flashy names reminiscent of comic books, such as Dagger, Nighthawk, Cobra and Razor, they are rolling these innovations out to their client base. Deploying successful algorithms in electronic trading is seen as an important service for generating deal flow from trading clients.

Equally, it is yet another component of prime brokerage services, through which major investment banks and brokerage firms generate deal flow from their hedge fund clients. It should be remembered that exchanges, ECNs and multilateral trading facilities (MTFs) also provide algorithmic trading capabilities integrated within their order management systems.

These institutions invest substantially in the quantitative and mathematical skills needed to create the algorithms, as well as the programming skills to turn them into commercial applications and integrate them into order management and execution systems. More importantly, these firms are responsible for the gradual adoption of algorithmic trading technology by buy-side firms. With the exception of a few highly specialised funds with their own research and development resources, it is unlikely the buy-side would have adopted algorithmic trading technology at all.
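To make concrete what the simplest class of these execution algorithms does, here is a minimal sketch of TWAP (time-weighted average price) order slicing, a standard technique for reducing market impact by trading evenly over time. All names and parameters below are illustrative, not any firm's actual implementation:

```python
from dataclasses import dataclass

@dataclass
class ChildOrder:
    slot: int          # index of the time slot in which to send the order
    quantity: int      # shares to trade in that slot

def twap_schedule(total_qty: int, slots: int) -> list[ChildOrder]:
    """Split a parent order into near-equal child orders, one per time slot.

    This is the core of a TWAP strategy: rather than hitting the market
    with one large order, trade evenly across the trading window.
    """
    base, remainder = divmod(total_qty, slots)
    # Spread the remainder over the first `remainder` slots so the
    # child quantities sum exactly to the parent quantity.
    return [ChildOrder(slot=i, quantity=base + (1 if i < remainder else 0))
            for i in range(slots)]

# Example: 10,000 shares over 8 slots yields eight child orders of 1,250 each.
schedule = twap_schedule(10_000, 8)
```

Real implementations layer far more on top of this skeleton (volume curves, limit-price logic, anti-gaming randomisation), which is precisely where the firms' quantitative investment goes.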

A number of investment banks and brokerage houses claim to be leaders in the field of algorithmic trading. The criteria for declaring an algorithm successful are somewhat self-selecting: investment and trading performance, ease of use, accessibility, and breadth of use across instruments and asset classes.

How durable are the algorithmic models? Are they ephemeral? Does wide-scale adoption erode their value? Do the innovators keep the "best inventions" to themselves, deploying them strictly on their proprietary desks and/or for favoured clients?

Conscious and concerted development of algorithmic trading models is about five years old, though the activity has been undertaken since the early 1990s, without the technology then available to give full rein to its commercial scope. As might be expected, comparative measurement of the durable success of algorithmic trading models across different market and trading environments is elusive; if it has been conducted, it has been conducted in private. Some organisations claim sustained success for their models. It may reasonably be conjectured that "unsuccessful" algorithmic models are quietly dropped, while successful models are maintained and, where appropriate, enhanced. Rather like Official Secrets, it is unlikely that any scientific research on the matter will be published until it is merely a matter of historical record.

What can be said is that no model has failed so badly that it has had to be publicly disclosed as the cause of substantial trading losses.

The "Algorithmic Arms Race" has now reached a stage where two distinct business models drive the development of algorithmic trading capabilities. The first is driven by the quest for cost reduction and for operational and trading efficiencies. These are themselves shaped by competition between exchanges, ECNs and MTFs, and between investment banks and brokerage organisations, as they enhance the efficiency and reduce the cost of access to their electronic trading services. This is the publicised side of the Algorithmic Arms Race: institutions will publicise the service element of this model, though the "trading" or "investment" performance will remain elusive.

The second is "the original model": development of complex and sophisticated trading technology that encompasses a greater range of events and criteria on which to base trading strategies, not only short-term but taking a longer-term perspective on the issues that influence markets and economies. The former is likely to encompass the majority of financial institutions; the latter will be confined to specialist funds and parts of the proprietary trading desks of major global investment banks. This remains the secret side of the Arms Race.

Copyright © 2007, IT-Analysis.com
