Longterm Wiki

Lawfare: Selling Spirals and AI Flash Crash


Credibility Rating

4/5 (High)

High quality. Established institution or organization with editorial oversight and accountability.

Rating inherited from publication venue: Lawfare

Relevant to AI safety discussions on systemic risk from correlated AI behavior; illustrates how near-term AI deployment in high-stakes domains like finance can produce dangerous emergent dynamics even without any single actor behaving maliciously.

Metadata

Importance: 52/100 · opinion piece · analysis

Summary

Former SEC Chair Gary Gensler warns that AI-driven algorithmic trading systems, by converging on similar models and data sources, could trigger synchronized selling spirals and market-wide flash crashes. The article examines how correlated AI behavior in financial markets poses systemic risk, and explores regulatory and technical interventions to prevent such scenarios.

Key Points

  • AI models trained on similar data may produce correlated trading decisions, amplifying market volatility rather than dampening it.
  • High-speed, synchronized AI selling could trigger flash crashes far faster than human regulators or circuit breakers can respond.
  • Gensler highlights the challenge of regulating AI systems whose decision-making is opaque and difficult to audit after the fact.
  • Proposed mitigations include diversity requirements in model design, enhanced circuit breakers, and greater regulatory oversight of AI in finance.
  • This scenario illustrates broader AI safety concerns about emergent collective behavior when many agents use similar optimization strategies.
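The correlation concern in the points above can be made concrete with a toy simulation. In this sketch (all rules and numbers are illustrative assumptions, not drawn from the article), each agent holds one unit of stock and sells when the price falls past its personal drawdown threshold; each sale pushes the price down further. When every agent uses the same threshold (a model monoculture), a small shock dumps the whole book at once; when thresholds are spread out, the same shock is absorbed.

```python
def simulate(thresholds, shock=0.02, impact_per_sale=0.01, steps=20):
    """Toy cascade: each agent sells (once) when the price falls more than
    its drawdown threshold below the starting price of 1.0. Every sale
    depresses the price by impact_per_sale, possibly triggering more sales."""
    price = 1.0 * (1 - shock)      # external shock starts the move
    sold = [False] * len(thresholds)
    for _ in range(steps):
        sales = 0
        for i, t in enumerate(thresholds):
            if not sold[i] and price <= 1.0 - t:
                sold[i] = True
                sales += 1
        if sales == 0:             # no one left with a tripped stop
            break
        price *= (1 - impact_per_sale) ** sales
    return price

# Monoculture: 10 identical stop rules, all tripped by the initial 2% shock.
mono_final = simulate([0.02] * 10)
# Diversity: stops spread from 2% to 20%; deeper stops never trigger.
diverse_final = simulate([0.02 + 0.02 * i for i in range(10)])
```

With identical thresholds the 2 percent shock triggers all ten agents at once and the price falls roughly 11 percent; with spread-out thresholds only one agent sells and the decline stays near 3 percent. The mechanism, not the specific numbers, is the point: correlated decision rules convert a small move into a large one.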

Review

The article examines the potential systemic risks posed by AI and algorithmic trading, highlighting SEC Chair Gary Gensler's prediction of a potential financial crisis triggered by AI models. The core concern is that a small number of similarly trained trading algorithms could amplify market downturns through rapid, synchronized selling, creating 'selling spirals' that could cause substantial economic damage. The piece explores various proposed mitigation strategies, ranging from SEC regulatory proposals to more technical interventions like changing trading order mechanisms. Notably, experts like Albert Kyle and Andrew Lo suggest innovative approaches such as constraining trade speeds and creating a centralized monitoring system analogous to a 'National Weather Service' for financial markets. The analysis is nuanced, acknowledging both the risks of AI-driven trading and potential counter-arguments, such as Tyler Cowen's perspective that increased AI model diversity might actually reduce crash risks.
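The "enhanced circuit breakers" and trade-speed constraints discussed above reduce to a simple rule: halt trading when the price moves too far, too fast. A minimal sketch follows, with hypothetical `window` and `max_drop` parameters (real exchange rules, such as limit-up/limit-down bands, differ in detail):

```python
from collections import deque

def halt_tick(prices, window=5, max_drop=0.07):
    """Return the first tick index at which trading would halt, or None.

    Rule (assumed for illustration): halt once the price has fallen more
    than max_drop relative to the price (window - 1) ticks earlier.
    """
    recent = deque(maxlen=window)
    for t, p in enumerate(prices):
        recent.append(p)
        if len(recent) == window and p <= recent[0] * (1 - max_drop):
            return t
    return None

# A gradual decline trips the breaker as the drop exceeds 7% within the
# window; an abrupt one-tick crash is halted only *after* the damage is
# done -- the timing gap the article worries about for machine-speed selling.
gradual = halt_tick([100, 99, 98, 90, 89, 88])
abrupt = halt_tick([100, 100, 100, 100, 60])
```

The design tension is visible even in this sketch: a breaker keyed to observed prices necessarily fires after the move, so if correlated algorithms can complete their selling within a single tick, the halt arrives once prices have already gapped down.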

Cached Content Preview

HTTP 200 · Fetched Apr 9, 2026 · 15 KB
Selling Spirals: Avoiding an AI Flash Crash | Lawfare
By Kevin Frazier
 
 A year ago Gary Gensler, then chair of the U.S. Securities and Exchange Commission (SEC), made a dire prediction: Artificial intelligence (AI) would cause a financial crisis if regulators did not act soon. His warning has largely gone unheeded. Regulators have taken little meaningful action in response, and in the time since, the scenario he described has only become more probable.

 With a new administration looming and AI still booming, now is the time to reexamine Gensler’s prediction, analyze proposed solutions, and map out specific steps forward. 

 The Prediction 

 Gensler anticipates that, when people reflect on a hypothetical, near-future economic crisis—perhaps one occurring as soon as the late 2020s—they “will say, ‘Aha! There was either one data aggregator or one model ... we’ve relied on.’” Put differently, broad reliance on a few models trained to respond in similar ways to similar situations could turn a slight market downturn into a rapid collapse.

 This prediction seems like a good bet—partially because something like it has already nearly happened. In 2010, a flash crash occurred when a number of high-frequency trading algorithms engaged in a series of rapid trades. Perhaps most troubling, this sudden dive took place on an otherwise normal trading day: a few algorithms in use simply “misread” the market. The unwarranted sell-off initiated by those mistaken models then caused other programs to respond in kind. The roughly $1 trillion in value lost in that half-hour period was eventually recovered thanks to human intervention.

 But this was not a fringe case. A similar algorithmic hiccup took place in 2016, when analysts attributed an overnight 6 percent drop in the British pound to algorithmic trading. The incident confirmed the susceptibility of algorithms to “high-speed selling spirals.”

 One such spiral took place even more recently. Earlier this year, a typo in Lyft’s earnings report (anticipating that a profitability metric would surge by 500 basis points rather than the actual 50) resulted in trading algorithms rushing to buy the stock. In just a few hours of after-hours trading, Lyft was up 60 percent. Only when humans realized the mistake did the stock come back to reality.

 Since then the use of trading algorithms has only increased—as has deference to their alleged sophistication. Algorithmic trading tools complete as much as 75 percent of all trades in some markets. The biggest trading companies are racing to adopt si

... (truncated, 15 KB total)
Resource ID: 98ab26437f379f73 | Stable ID: sid_WikF3byakc