"Moore's Law versus Murphy's Law: Algorithmic Trading and Its Discontents" (Kirilenko, Andrei A., and Andrew W. Lo) – Flashcards
What is the main idea?
A review of the emergence of algorithmic trading and its associated risks
What do Moore’s Law and Murphy’s Law state?
Moore's Law – Computing power grows exponentially over time, so computers keep getting faster, cheaper, and more efficient.
Murphy's Law – Whatever can go wrong will go wrong; and when computers are involved, it goes wrong faster and on a bigger scale.
What is Algorithmic Trading (AT)?
The use of mathematical models and computers to automate the buying and selling of financial securities.
AT improves efficiency by lowering costs, reducing human error, and increasing productivity.
What are the benefits of AT?
◦ Lower costs / Scalability
◦ Reducing human error
◦ Increasing productivity
Which 3 developments over the last two decades have driven the rise of AT?
o The financial system has become increasingly complex over time
o Breakthroughs in quantitative modelling of financial markets (formulas)
o Breakthroughs in computer technology
Which 5 major developments have fuelled the popularity of algorithmic trading?
- Quantitative Finance (Portfolio optimization, CAPM, Black & Scholes)
- Index Funds (Passive investing)
- Arbitrage Trading (uses algorithms to identify arbitrage opportunities; “statistical arbitrage” strategies are becoming more popular; see the sketch after this list)
- Automated Execution and Market Making
- High-Frequency Trading (Automated trading of millions of transactions per day, very profitable, minimal risk)
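To make the idea of a statistical arbitrage signal concrete, here is a rough, illustrative sketch of a simple pairs trade in Python; the prices and the hedge ratio are made up and are not from the paper:

```python
# Pairs trading: bet on mean reversion of the spread between two
# historically related prices. All numbers below are illustrative.
import statistics

prices_a = [100.0, 101.0, 102.5, 101.5, 103.0, 104.0, 108.0]
prices_b = [50.0, 50.4, 51.2, 50.8, 51.5, 52.0, 52.2]

hedge_ratio = 2.0  # assumed here; in practice estimated, e.g., by regression
spread = [a - hedge_ratio * b for a, b in zip(prices_a, prices_b)]

mu, sigma = statistics.mean(spread), statistics.stdev(spread)
z_score = (spread[-1] - mu) / sigma  # how stretched is the current spread?

# A large positive z-score suggests A is rich relative to B:
# sell A, buy B, and unwind once the spread reverts toward its mean.
print(round(z_score, 2))
```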
What is the recipe for an index fund?
Choose the securities, weight them by market capitalization, and add or remove securities when needed; otherwise the weights are self-adjusting
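A minimal sketch of this weighting rule in Python (the tickers and market caps below are hypothetical, not from the paper):

```python
# Market-cap ("value") weighting: weight_i = market_cap_i / total market cap.
def cap_weights(market_caps):
    """Return market-cap weights for an index portfolio."""
    total = sum(market_caps.values())
    return {ticker: cap / total for ticker, cap in market_caps.items()}

# Hypothetical market capitalizations (in billions)
caps = {"AAA": 500.0, "BBB": 300.0, "CCC": 200.0}
print(cap_weights(caps))  # {'AAA': 0.5, 'BBB': 0.3, 'CCC': 0.2}

# The weights are "self-adjusting": when a price moves, the position's value
# and the market cap move proportionally, so no rebalancing trade is needed
# until a security is added to or dropped from the index.
```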
How has the meaning of passive investing changed?
Previously, most investors and managers equated “passive” investing with low-cost, static, value-weighted portfolios.
With technological innovation, it now refers to trading on any well-defined and transparent algorithm, which may itself require active trading → active trading has been decoupled from active investing, creating demand for algorithms that execute these actively traded “passive” strategies.
What are the benefits of arbitrage strategy to the market?
Arbitrage strategies improve:
Liquidity (arbitrageurs increase trading activity -> greater liquidity)
Informational efficiency of prices (arbitrageurs trade away mispricings, so prices reflect all available information)
What is Automated Execution and Market Making?
“Execution strategy” – how to break up a large trade into smaller ones spread over time to reduce cost (buying a big block all at once drives up demand and the price)
Can be automated by computers (see the sketch at the end of this card)
Market making – intermediary participates in buying/selling securities to smooth out temporary imbalances in supply & demand
Market makers profit from the bid-ask spread, which serves as their compensation
Autoquoting – exchanges automatically update and disseminate their quotes as the order book changes (instead of manual quote updates), which paved the way for automated market making
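As promised above, a minimal sketch of an automated execution schedule, assuming a simple equal-slice (TWAP-style) split; the order size and slice count are made up, and real execution algorithms are considerably more sophisticated:

```python
# Break a large "parent" order into smaller "child" orders spread over time,
# reducing the market impact of trading one big block at once.
def twap_schedule(total_shares, n_slices):
    """Split a parent order into n roughly equal child orders."""
    base, remainder = divmod(total_shares, n_slices)
    # Spread the leftover shares one at a time over the first slices.
    return [base + (1 if i < remainder else 0) for i in range(n_slices)]

child_orders = twap_schedule(total_shares=100_000, n_slices=13)
print(child_orders)       # 13 child orders of roughly 7,692 shares each
print(sum(child_orders))  # 100000 -- the whole parent order is covered
```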
Name the 5 major incidents when AT went wrong.
August 2007: Arbitrage Gone Wild
May 6, 2010: The Perfect Financial Storm – Flash Crash
March and May 2012: Pricing Initial Public Offerings in the Digital Age
August 2012: Trading Errors at the Speed of Light
September 2012: High-Frequency Manipulation
What went wrong in August 2007: Arbitrage Gone Wild?
Large hedge funds, mostly statistical arbitrage funds, suffered very large losses within just a few days.
Unwind Hypothesis: forced liquidation of one or more large equity market-neutral portfolios (to raise cash/reduce leverage) -> prices of their holdings fell -> similar portfolios experienced losses -> those funds also liquidated/deleveraged their positions -> ...
What went wrong in May 6, 2010: The Perfect Financial Storm – Flash Crash?
33 minutes of extreme volatility and very heavy trading volume.
Apple briefly traded at $100,000 per share.
Reasons:
Automated execution algorithm on autopilot;
A game of “hot potato” among high-frequency traders;
Cross-market arbitrage trading, and a practice by market makers to keep placeholder bid-offer “stub quotes”.
What went wrong in March and May 2012: Pricing Initial Public Offerings in the Digital Age?
A NASDAQ software glitch delayed Facebook's IPO by about 30 minutes (overwhelming order traffic): the opening-cross software got stuck in an effectively infinite loop of recalculating the opening price.
In March 2012, BATS attempted to list its own IPO on its own exchange, but the listing failed due to a software bug.
Both the Facebook glitch and the BATS fiasco can be explained as regrettable software errors that extensive testing failed to catch.
What went wrong in August 2012: Trading Errors at the Speed of Light?
Due to a software glitch, Knight Capital Group began sending a stream of erroneous stock orders to the New York Stock Exchange.
The trades stood, and Knight was forced to unwind the unintended positions at a loss of roughly half a billion dollars.
Internalization – broker-dealers like Knight are permitted to fill customer orders at prices a fraction of a penny better than the prevailing quotes (e.g., bidding 100.011 instead of 100.01 and offering 100.019 instead of 100.02). The dealer pockets most of the penny-wide spread, and the customer gets a slightly better price and priority in the queue.
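A small worked example of the sub-penny arithmetic above, using the flashcard's illustrative quotes (the 10,000-share volume is just an assumption to show how tiny per-share amounts add up):

```python
# Internalization economics: quote a hair inside the public market,
# give each customer a tiny price improvement, keep most of the spread.
nbbo_bid, nbbo_ask = 100.01, 100.02        # prevailing public quotes
dealer_bid, dealer_ask = 100.011, 100.019  # improved by 1/10 of a cent

seller_improvement = dealer_bid - nbbo_bid  # seller receives ~0.001 more
buyer_improvement = nbbo_ask - dealer_ask   # buyer pays ~0.001 less
dealer_capture = dealer_ask - dealer_bid    # dealer keeps ~0.008 per matched share

print(round(seller_improvement, 3), round(buyer_improvement, 3), round(dealer_capture, 3))
# 0.001 0.001 0.008 -> both customers do slightly better than the public quote,
# while the dealer keeps most of the penny-wide spread.
print(round(dealer_capture * 10_000, 2))  # ~80.0 dollars per 10,000 matched shares
```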
What went wrong in September 2012: High-Frequency Manipulation?
An electronic broker-dealer had been involved in manipulative trading activities carried out through offshore high-frequency trading accounts.
Spoofing – manipulating prices by placing an order to buy/sell a security and then canceling it, at which point the spoofer trades in the opposite direction of the canceled order
Layering – Placing a sequence of limit orders at different prices, creating artificial demand to manipulate the price.
All of this can happen in under a second.
What are the regulatory struggles?
*The software and hardware controlling financial markets have become too complex: no one can map all the possible interactions that could occur among the various components of the financial system
*Differences in possible market structures create a trade-off between:
1) Costs to different intermediaries for maintaining a continuous presence in a market
2) Benefits to different market participants for being able to execute trades quickly (quick intermediaries = more costly)
*Technological advances have reduced the costs of intermediation, but have not increased the benefits of being quick
What is the Financial Regulation 2.0?
To bring the current financial regulatory framework into the Digital Age, the authors propose four basic design principles for “Financial Regulation 2.0”:
o Systems-engineered – should approach automated markets as complex systems of software and human personnel interacting
o Safeguard-heavy – Safeguards must exist at multiple levels of the system; risk management cannot rely on humans alone
o Transparency-rich – Must have more transparency of financial products and services, surveillance must be cyber-centric instead of human-centric
o Platform-neutral – Regulation should encourage innovation and be neutral in terms of how specific soft/hardware works