In the wild, speed is the difference between killing and being killed – between feeding and going hungry. Now more than ever, the same is true in the shadowy concrete jungles of the world's financial markets. Anyone currently trading who isn't a high-frequency trader – floor traders on exchanges, day traders at home, traders at other institutions – is giving money away to high-frequency traders, who now account for about half of all market volume.
"By the time the ordinary investor sees a quote, it's like looking at a star that burned out 50,000 years ago," Sal Arnuk, co-author of a book critical of high-frequency trading entitled Broken Markets, told Wired.
On a basic level, high-frequency traders use a combination of hardware and software to see how much someone else is willing to buy or sell a given security for fractions of a second before their competition does. They can then trade accordingly. It's almost like being able to bet on a horse race from the future – you already know which nag has crossed the finish line first.
But the advantage never lasts. It takes ever-increasing amounts of hardware and software sophistication, power, and money to stay ahead of the game and keep making millions from your rivals' comparative lack of information.
"There's a huge technology arms race to drive out latency," says Dave Lauer, a former high-frequency trader, "because when you're talking about latencies in nanoseconds and microseconds, a millisecond is an eternity."
Lauer arguably knows more than anyone about high-frequency trading (HFT) and the race to zero latency. He's worked as a hardware engineer building low-latency equipment, as a quantitative research analyst for high-frequency traders, and as a contract worker in Goldman Sachs' tech group. He's written high-frequency trading algorithms, done mathematical modelling, and been a high-frequency trader himself on prop-trading desks at firms such as Citadel in New York and Chicago.
Lauer quit HFT after trading for about two years. He then wrote an article about it that garnered enough attention to see him testifying before the US Senate and sitting on an SEC tech roundtable about high-frequency trading. So what's his view on HFT?
High-frequency trading only really entered the public consciousness in mid-2009, in the aftermath of the financial crisis, when the New York Times published one of the first reports on the subject. HFT actually started in 1999, when the SEC authorised electronic exchanges, although back then it was more about software than hardware.
That said, the basis of HFT is still using proprietary computer algorithms to rapidly trade securities, moving into and out of positions in fractions of a second. In 2005 traders' systems generally ran on gigabit networking using stock networking equipment. Linux was frequently used although there was nothing really highly tuned to speak of, and the focus was on quantitative research. But when some traders started discovering the opportunity to profit from what is known as "latency arbitrage," or being able to profit from executing trades before others, they kicked off the mother of all races to reach zero latency.
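As a toy illustration of latency arbitrage (all prices and sizes here are invented, and a real system would be far more involved), the edge reduces to trading against a quote you know is stale:

```python
# Toy illustration of latency arbitrage (not a real trading system).
# One venue's quote lags another's; a trader who sees the price update
# on the fast venue first can trade against the slow venue's stale price.

def stale_quote_profit(new_price, stale_price, shares):
    """Profit from buying at the stale price and selling at the new one."""
    return (new_price - stale_price) * shares

# The price on venue A ticks up to 10.02 while venue B still shows 10.00:
profit = stale_quote_profit(new_price=10.02, stale_price=10.00, shares=1000)
print(f"${profit:.2f}")  # → $20.00
```

Twenty dollars per race sounds small, but the whole point of the arms race is that the fastest party can run this race thousands of times a day.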
When it comes to software and algorithms, the statistical arbitrage models used are pretty simple. "Because latency is so critical you can't do too much," explains Lauer. High-frequency traders take a massive amount of market data and generally use a parallel processing cluster to analyse it. Technologies such as Hadoop with a MapReduce structure are very popular in the industry.
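Hadoop itself is a Java framework, but the map-reduce pattern it implements can be sketched in a few lines of Python – here computing per-symbol average trade size from made-up tick data:

```python
from collections import defaultdict

# Minimal sketch of the map-reduce pattern (illustrative only; real
# deployments run frameworks like Hadoop over clusters of machines).
ticks = [("AAPL", 100), ("MSFT", 200), ("AAPL", 300), ("MSFT", 400)]

# Map + shuffle: emit (symbol, size) pairs and group them by symbol.
groups = defaultdict(list)
for symbol, size in ticks:
    groups[symbol].append(size)

# Reduce: aggregate each group into an average trade size.
averages = {s: sum(sizes) / len(sizes) for s, sizes in groups.items()}
print(averages)  # → {'AAPL': 200.0, 'MSFT': 300.0}
```

The appeal for quantitative research is that the map and reduce steps are independent per key, so the same analysis parallelises across as many machines as you can afford.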
High-frequency traders look for mathematical structure and perform time-series analysis, looking for something they can forecast. Several years ago, a very popular program looked at supply and demand imbalances in the order book and simply attempted to forecast what the next tick (upward or downward movement in the price of a security) would be. Thanks to the business model of the exchanges, all you needed to be able to do was break even on a trade and you could collect a rebate. Multiply that rebate by tens or hundreds of times a second – let alone doing better than breaking even on a trade – and you're talking real money, real fast.
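A minimal sketch of that kind of imbalance signal, with an invented threshold and an invented rebate rate (real exchange fee schedules and real models differ):

```python
# Hedged sketch: forecast the next tick's direction from order-book
# imbalance, and tally maker rebates. The 0.3 threshold and the
# $0.0020/share rebate are illustrative assumptions, not any
# exchange's actual schedule.

def imbalance(bid_size, ask_size):
    """Normalised supply/demand imbalance in [-1, 1]."""
    return (bid_size - ask_size) / (bid_size + ask_size)

def forecast_next_tick(bid_size, ask_size, threshold=0.3):
    imb = imbalance(bid_size, ask_size)
    if imb > threshold:
        return "up"      # heavy bid side: price likely to tick up
    if imb < -threshold:
        return "down"    # heavy ask side: price likely to tick down
    return "flat"

print(forecast_next_tick(8000, 2000))  # → up

# Even break-even trading collects the maker rebate:
rebate_per_share = 0.0020               # assumed rebate, dollars/share
fills_per_sec, shares_per_fill = 100, 200
print(f"${fills_per_sec * shares_per_fill * rebate_per_share:.2f}/s")  # → $40.00/s
```

The arithmetic at the bottom is the "real money, real fast" point: at these invented rates, flat trades alone would earn $40 a second in rebates.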
High-frequency trading is at the forefront of hardware acceleration. This means using hardware-accelerated network stacks and NPUs (network processing units), or custom-programmed FPGAs (field-programmable gate arrays). NASDAQ offers a version of its ITCH data feed processed by an FPGA instead of a software stack.
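To give a flavour of the feed-handling work an FPGA takes over, here is a software parser for a simplified, hypothetical fixed-width binary message layout – deliberately not the real ITCH specification:

```python
import struct

# Illustrative only: a simplified, hypothetical message layout, NOT the
# real NASDAQ ITCH spec. It shows the kind of tight binary parsing that
# FPGAs replace: type (1 byte), symbol (8 bytes, space-padded), price in
# ten-thousandths of a dollar (4 bytes), shares (4 bytes), big-endian.
FMT = ">c8sII"

def parse(msg: bytes):
    mtype, symbol, price_e4, shares = struct.unpack(FMT, msg)
    return mtype, symbol.rstrip(b" ").decode(), price_e4 / 10_000, shares

raw = struct.pack(FMT, b"A", b"AAPL    ", 1234500, 300)
print(parse(raw))  # → (b'A', 'AAPL', 123.45, 300)
```

In software this costs microseconds per message once you include the network stack; burned into an FPGA, the same decode happens in the network card's hardware pipeline at wire speed.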
"Some people have put market data processing into an FPGA," says Lauer, "some people have put trade logic into an FPGA, risk controls into an FPGA; some people are using GPUs to do massively parallel analysis on the fly. So there's some pretty interesting stuff going on in hardware acceleration that I think is more cutting edge than most industries."
There's actually been an interesting interplay between the video game and financial services industry in terms of GPU (graphics processing unit) technology – both are driving hard to increase GPU capabilities, albeit in different ways. Video gaming is driving rendering and shape processing; financial services are driving floating-point operations. There's also been a complementary feedback loop pushing GPUs to accelerate cores, add more cores, and to add more parallelisation. Lauer notes that "financial services, as a secondary industry to these [gaming] vendors, means that they can make more money, so they can put more money into research."
At the physical level, when you're talking about high-frequency trading you're generally talking about high-end servers such as HP G8s (an HP ProLiant DL380p Gen8 server is pictured above) sitting in a rack, co-located at an exchange with a physical cross-connect running from the exchange into your rack.
With that physical cross-connect you can "order from a menu," Lauer says. "If you want gigabit Ethernet, it costs you X. If you want 10-gigabit Ethernet, it costs you Y. A lot of these venues now offer 10-gigabit Ethernet; it'll go directly into your 10-gigabit Arista switch ($13,000, or £8,000), which is just a cut-through switch that can route that packet in nanoseconds into your server, which has a kernel-bypass mechanism right into memory, and you're looking at it within a handful of microseconds."
It's important to understand that in high-frequency trading absolute speeds are essentially meaningless – relative speed is everything. It doesn't matter whether you win by 10 seconds or by half a nanosecond; all that matters is being faster than everyone else.
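A toy model makes this winner-takes-all dynamic concrete – with invented arrival times, only the ordering matters, not the gaps:

```python
# Toy model of "relative speed is everything": whichever trader's order
# arrives first takes the resting liquidity; everyone else misses out.
arrival_us = {"firm_a": 4.2, "firm_b": 4.1, "firm_c": 950.0}  # microseconds

winner = min(arrival_us, key=arrival_us.get)
print(winner)  # → firm_b

# Shaving just 0.2 microseconds off firm_a flips the outcome entirely:
arrival_us["firm_a"] = 4.0
print(min(arrival_us, key=arrival_us.get))  # → firm_a
```

firm_c, despite being a thousand times slower, loses by exactly as much as firm_a did: both get nothing.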
For example, when Lauer was trading in HFT in 2009-2010, the fast line between Chicago and New York was the Spread Networks dark fibre: a super-low-latency fibre-optic connection a few milliseconds faster than the lines available to the public. But fibre has a fundamental handicap: light travels through glass at roughly two-thirds of its speed through air, bouncing along the cable as it goes. So now a company called McKay Brothers is building a dozen microwave relay stations (pictured below) between New York and Chicago. Microwaves move through the air at nearly the full speed of light, and a relay chain can follow a much straighter path than a buried cable, which has to route around obstacles. "So you shave a couple [more] milliseconds from Chicago to New York," explains Lauer.
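A back-of-the-envelope calculation shows where those milliseconds come from (the route lengths below are rough approximations, and a refractive index of about 1.47 is typical for optical fibre):

```python
# One-way Chicago-New York propagation times: fibre vs microwave.
# Light in glass travels at c/n, where n ~ 1.47 for fibre; microwaves
# through air travel at very nearly the vacuum speed of light.
C = 299_792.458             # speed of light in vacuum, km/s

fibre_route_km = 1_330      # approx. length of a low-latency fibre route
microwave_route_km = 1_190  # approx. near-straight-line relay path

fibre_ms = fibre_route_km / (C / 1.47) * 1000   # slowed by the glass
microwave_ms = microwave_route_km / C * 1000    # ~vacuum speed in air

print(f"fibre: {fibre_ms:.2f} ms, microwave: {microwave_ms:.2f} ms")
# → fibre: 6.52 ms, microwave: 3.97 ms
```

Both the slower medium and the longer route work against fibre, and together they account for the couple of milliseconds Lauer describes.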
Microwave transmission is already pretty close to the speed of light. But because microwave beams are susceptible to weather such as rain or fog, high-frequency traders are now looking ahead to an essentially weatherproof solution that also provides much higher bandwidth: lasers. And they're not the only ones – instead of using radio waves, a package NASA sent to the Moon a couple of weeks ago will be transmitting data back to Earth at 622Mbps via laser, much faster than any other deep-space communication system.
Back on Earth, custom-fibre and low-latency network provider Anova Technologies claims its patented hybrid laser/millimetre-wave system will deliver up to 10Gbps by 2015. Last April at the New York Battle of the Quants conference, one firm floated the idea of stringing blimps or drones across the Atlantic from New York to London. "They estimate it would cost them $300 million [£185 million]," Lauer says, "and they think it would be worth it because they would get people to pay enough."
Step back from the gee-whizz factor of drones firing lasers across the ocean, and it's hard to see how the technical advances of HFT offer any spill-over benefit to the rest of society. And giving some people access to market information ahead of others simply because they can pay more seems fundamentally corrupt: a racket, by any other name. Last month, when the Federal Reserve announced it would not be tapering its bond-buying program, several large orders were executed milliseconds before the information could officially have reached the Chicago exchange.
"I still think it's insane," says Lauer, who quit HFT because of his ethical concerns. "I think it's the craziest thing that massive amounts of capital – billions of dollars – are being put into shaving milliseconds off transmission times between sites. Find somebody who can explain the social utility that's being created by reducing New York/Chicago transit times by two milliseconds. Or New York to London. It doesn't exist. But there's still money to be made, and that's the fundamental economic disconnect in high-frequency trading, which leads me to classify it as a polluting enterprise, because if they're making money off this, there's a negative externality somewhere that is driving that money to them. It's a socialised cost."
On Wall Street, high-frequency traders are the hyenas of the wild, and they're using technology to eat everyone else's lunch.
Image Credits: Vincent Desjardins, Steve Ikam