Bubbles are hard to identify and even harder to time, but they aren’t hard to explain. A new technology or some other catalyst raises investors’ expectations, prices rise, and speculators pile in believing that they can resell the asset to someone else for even more money later. The key is that a bubble can form even if lots of investors are rational. So long as there are enough investors trading on momentum — or blind optimism — sophisticated traders will find it profitable to ride the bubble rather than to bet against it. And while bubbles pop, only some set off wider economic and financial crashes like the one the world endured in 2008.

That account comes from A Crash Course on Crises: Macroeconomic Concepts for Run-ups, Collapses, and Recoveries (Princeton University Press, 2023) by Markus Brunnermeier of Princeton and Ricardo Reis of the London School of Economics. At an admirable 111 pages, it’s a great resource — and one I’ve gone back to as talk of an AI bubble has returned in recent weeks. But Crash Course is about “macro-financial crises,” not bubbles per se. That makes it helpful not just for thinking through whether there’s an AI bubble, but for imagining what an AI crash would look like.

Crash Course sketches three models of the run-up to a macro-financial crisis, and each one suggests things to watch for with AI. The first is the classic speculative bubble already described, and in the model it’s not enough for everyone to be overly bullish about AI. The real bubble dynamic comes from the combination of naive investors extrapolating — line goes up — and investors who know valuations are out of whack but hang on for fear of selling too early.
This is what a lot of the AI bubble worry has been about so far: As my colleague Edward Harrison wrote of the AI-heavy S&P 500 in August, “We’re in the unusual situation where fund managers almost uniformly say US stocks are overvalued, yet everyone is piling in.” That sounds bubblicious, and it’s the dynamic that “gets the party going,” says Reis. But it’s not enough on its own to create the sort of crash that he and Brunnermeier write about.

The second model in the book involves misallocation: investors mistakenly backing the wrong firms in the hot sector because some policy or distortion biases what gets funded. The third concerns the role of shadow banks that are vulnerable to runs and amplify dips in asset prices.

Applied to AI, Crash Course’s brief tour of crisis economics suggests three gauges of the risk:

- Are investors buying or lending to AI firms with no conviction beyond the belief that they can unload later to the “greater fool”?
- Are there barriers — like, say, VCs investing in anything labeled AI — that prevent money from reaching the most promising firms?
- How much of the AI boom is funded by debt, and are those lenders vulnerable to runs?

“Yes” to the first question would signal a bubble. “Yes” to all three would warn of a crash.

— Walter Frick, Bloomberg Weekend