EVAN DAVIS- DECEMBER 2, 2022

EDITOR: DENYSE CHAN

What ended the Great Depression?

If you answered World War II, you’d be in agreement with most people. Even the U.S. government officially endorses this stance, claiming that the wartime economic mobilization “cured” the Depression. This is because mainstream economic theory is heavily influenced by the economist John Maynard Keynes. In his magnum opus “The General Theory of Employment, Interest, and Money,” Keynes argued that government intervention in the economy in the form of stimulus spending (including wartime spending) is the solution to economic recession.

Keynesian economists don’t argue that World War II ended the Great Depression without reason. Keynesian economics places heavy importance on a concept known as aggregate demand. In essence, aggregate demand is a measure of how much money an economy is spending at any given point. Gross Domestic Product (GDP) is the indicator used to measure it, calculated by adding consumption and investment spending by individuals and businesses to government spending, plus net exports (equivalently, subtracting the trade deficit).
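The expenditure-approach identity behind this calculation can be sketched in a few lines of Python. The figures below are made-up placeholders for illustration, not actual U.S. data:

```python
# Illustrative expenditure-approach GDP calculation.
# All numbers are hypothetical (in billions of dollars).
def gdp(consumption, investment, government, exports, imports):
    """GDP = C + I + G + (X - M), the standard expenditure identity."""
    return consumption + investment + government + (exports - imports)

total = gdp(consumption=1000, investment=300, government=400,
            exports=150, imports=200)
print(total)  # 1650: the 50-billion trade deficit subtracts from GDP
```

Note that government spending enters this identity directly, which is why wartime expenditure mechanically raises measured GDP.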

Keynesians assert that the higher aggregate demand is, the more money is changing hands. When money is spent, whoever is paid in turn has money to spend, and the spending continues: one man’s spending generates another man’s income. With more spending, more economic activity takes place. The effect compounds (or “multiplies”), resulting in what is known as the Keynesian multiplier effect. Resources are mobilized, employment rises, and the economy flourishes. Government spending can potentially boost GDP by much more than the initial monetary amount. As such, Keynesians believe that in order to mobilize the economy during a recession, the government should spend money, either by pumping it directly into the economy or through giving consumers and businesses more income to spend.
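The compounding described above follows the textbook spending-multiplier formula. A minimal sketch, using an assumed marginal propensity to consume rather than any real-world estimate:

```python
# Textbook Keynesian spending multiplier: each round of spending, a
# fraction (the marginal propensity to consume, MPC) gets re-spent.
# The parameter values below are illustrative assumptions.
def spending_multiplier(mpc):
    # Geometric series 1 + mpc + mpc^2 + ... = 1 / (1 - mpc)
    return 1 / (1 - mpc)

mpc = 0.8        # assume 80 cents of each dollar received is re-spent
stimulus = 100   # assume $100 billion of initial government spending
total_boost = round(stimulus * spending_multiplier(mpc), 2)
print(total_boost)  # 500.0: in theory, a fivefold boost to GDP
```

This is the arithmetic behind the Keynesian claim that a dollar of stimulus can generate several dollars of GDP.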

The problem with the Keynesian story is that economics isn’t that simple. The economy is made up of individuals trading money, goods, and services. Individuals are self-interested. Consequently, any given trade in a capitalist market is a result of two people acting in a way they believe will improve their own conditions and satiate their desires. However, blind spending is not inherently good; where the spending is directed matters.

Then, what was the relationship between the Great Depression, its recovery, and World War II? 

First, it’s important to review how exactly the Great Depression started. When speculating about the causes, people tend to point out symptoms such as declining consumer confidence, stock markets plunging, and bank panics and failures. However, these are just that: symptoms. It would be like blaming a plane crash on the plane falling out of the sky. To actually investigate the causes, it’s prudent to recognize that the Great Depression was an economic “bust”, or decline in economic activity, preceded by a “boom”, a strong increase in economic activity.

Malinvestment, or investment made in an unsustainable fashion, is widely considered the cause of the Great Depression. Malinvestment is usually caused by government intervention in the economy to unnaturally stimulate investment. Interest rates in equilibrium are generally determined by the intersection of demand for investment and supply of savings in an economy. However, when the central bank engages in credit expansion, interest rates drop and the supply of loanable funds increases, making investment easier without a corresponding increase in savings. Indeed, this is what happened. The money supply increased by about 28 billion dollars during the 1920s. Economists Barry Eichengreen and Kris Mitchener conclude in a paper that “the credit boom view provides a useful perspective on [The Great Depression].”
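The mechanism can be illustrated with a stylized loanable-funds market. The linear curves and parameter values below are made-up assumptions, chosen only to show how an artificial outward shift in loanable-funds supply pushes the interest rate below its natural level:

```python
# Stylized loanable-funds market with linear curves (all parameters
# are hypothetical). Investment demand: I(r) = a - b*r.
# Savings supply: S(r) = c + d*r. Equilibrium sets I(r) = S(r).
def equilibrium_rate(a, b, c, d):
    """Solve a - b*r = c + d*r for the interest rate r."""
    return (a - c) / (b + d)

# Demand I(r) = 100 - 10r meets real savings S(r) = 40 + 5r:
natural_rate = equilibrium_rate(100, 10, 40, 5)    # 4.0 percent

# Credit expansion shifts loanable-funds supply out by 30, to
# S(r) = 70 + 5r, even though real savings haven't changed:
expanded_rate = equilibrium_rate(100, 10, 70, 5)   # 2.0 percent
print(natural_rate, expanded_rate)
```

The gap between the two rates is the point of the argument: at the artificially low rate, more investment is undertaken than real savings can support.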

Monetary expansion, which creates investment, typically results in that investment being unsustainable, given the lack of savings to back it up. This eventually results in an economic crash. In 1929, instability was revealed in the grain industry, where massive investment had already taken place. After European competition lowered the profitability of wheat in the US, stock prices crashed. The economic effect compounded, bank panics took place, and “a large-scale loss of confidence led to a sudden reduction in consumption and investment spending.”

This illustrates how recessions start and end. The culprit is malinvestment; overconfidence, often driven by loose central bank monetary policy, allows unsustainable investment into certain economic projects. When monetary policy is tightened or some other event shatters confidence, the crash results.

Then, what actually ended the Great Depression? 

Economist Robert Higgs challenges the view that wartime expenditure was responsible. Instead, he argues that the economy was already in recovery from the Great Depression’s malinvestment before the war. Wartime spending disrupted this recovery, to the detriment of the average consumer. However, high expectations for the economy after the war restarted genuinely productive economic activity, allowing swift and true economic recovery.

This is not what Keynesians thought as World War II drew to a close. Many forecasted that the end of wartime spending would result in mass unemployment and economic catastrophe. Indeed, GDP fell… in terms of government spending. However, private economic activity was not deterred, and the predicted economic collapse never came about. The Great Depression of 1946 simply didn’t happen.

Wars don’t help economies. They destroy. People die, resources burn, and infrastructure topples. The economic activity created by wartime spending is wasteful. Rather than contributing to genuine economic growth in service of consumer demand, it simply shifts the economy in a random direction, using up resources in the process. 

John Maynard Keynes himself argued that any economic activity is better than none at all. He cited the Treasury burying money and letting workers dig it up as a worthy potential economic project in service of increasing employment and aggregate demand. But the fallacy in such thinking is obvious. If every worker in the United States were to be employed in the U.S. army as a foot soldier, full employment would be the result. However, no one would produce food or maintain infrastructure. Society would instantly collapse. The type of employment in an economy matters. Full employment should not be the goal; good employment should be the goal.

Unfortunately for our economy, disciples of Keynes habitually take this hyperfixation on aggregate demand to heart, and not just when it comes to World War II. Take Paul Krugman. After the deadly attacks on the World Trade Center in 2001, Krugman was quick to respond. Within three days, he had an article published in the New York Times claiming that “Ghastly as it may seem to say this, the terror attack – like [World War II], which brought an end to the Great Depression – could even do some economic good.”

His argument was simple. The economy was coming off the back of a technology-based recession (note that Krugman explicitly predicted the recession wouldn’t happen back in 1998). Economic activity and business investment were low. Then, the attack took place. In his own words, “all of the sudden, we need some new office buildings.” Voila, money is spent in the process of rebuilding, and as a result of the Keynesian multiplier, the economy improves.

The fallacy in the argument is clear: ignorance of opportunity cost. According to Keynesians, idle economic capacity is bad, and any mobilization of these resources and labor is good. In its most basic form, Krugman’s Keynesian argument is a blatant case of the broken window fallacy. Krugman assumes that the resources and labor used to recover from the attack would otherwise be sitting around uselessly. However, the resources used to repair the tremendous damage from the attacks could’ve been used elsewhere, on new buildings rather than rebuilding old ones. Even resources idle at the moment could be used in later projects. Hospital beds and resources could’ve been saved for people who already needed them, rather than filling them up with injured victims of the attacks. All of this extra economic activity has an opportunity cost: the alternative uses those resources could’ve gone towards.

Unfortunately for Krugman, the Twin Towers didn’t stimulate the economy well enough. In 2002, he wrote another New York Times article where he argued for the creation of a housing bubble to offset the technology recession. Yes, you read that right. He’s referring to the one that popped in 2008. According to Krugman, in order to get business investment going again, the Federal Reserve ought to lower interest rates to stimulate mass investment into housing. In his own words, “[Fed Chairman] Alan Greenspan needs to create a housing bubble to replace the Nasdaq bubble.”

The fallacy in his argument is once again clear. “Get resources moving; don’t worry about where they’re going. The economy can figure that out later.” But is wasteful spending really an improvement over less spending? It clearly isn’t if the wasteful activity is unsustainable malinvestment into a sector of the economy. In the case of the housing bubble, that came in the form of incentives for risky mortgage lending and excess housing construction. The issue was that the ease of investing into the housing market was not based on genuine long-term demand for housing, but on short-term confidence propped up by access to easy money.

Unfortunately for the average American, Greenspan did create the housing bubble. The result was the largest economic recession in American history at the time, other than the Great Depression itself. Yet, it wasn’t large enough for Krugman to get the memo. In 2011, he decided that in order to fix the result of reckless spending, the solution was more reckless spending, this time on an objectively useless economic project. That project was a fake alien invasion. 

In Krugman’s own words, “If we discovered that space aliens were planning to attack and we needed a massive buildup to counter the alien threat… this slump would be over in 18 months. And if we made a mistake and discovered that it was fake, we’d be a better [economy nonetheless]…” Nevermind the fact that the 2008 recession proved the unsustainability of just throwing money at some sector of the economy and stimulating mass investment into it. Two years later, in 2013, he doubled down on his earlier claim, saying “This is the kind of environment in which Keynes’s hypothetical policy of burying currency in coal mines and letting the private sector dig it up – or my version, which involves faking a threat from nonexistent space aliens – becomes a good thing… unproductive spending is still better than nothing.”

Once again, the fallacy in his argument is clear. Wasteful spending on a fake alien invasion would divert resources and investment to the fake alien invasion, where they could have otherwise been used for genuinely productive economic activity. As the U.S. economy clearly discovered in 2008, when confidence is broken (likely in this hypothetical case by the revelation that the alien invasion was not real), the consequences can be disastrous. Investment stops, but the resources have already been used up in the collapsing sector. A recession follows.

Thankfully, Congress did not initiate the spending on this fictitious crisis. Unfortunately, however, the thinking underlying such a policy recommendation is still popular among economists today. Opportunity cost and the sustainability of spending are overlooked in the quest to maximize aggregate demand. A massive housing bubble is not sustainable. Wasting resources on a fake alien invasion is not sustainable. When the propped-up demand for housing falls or the alien invasion is uncovered as fake, jobs created in the bubbles will be lost. Wasted resources will either have to be scrapped or redirected towards actual productive uses. As we saw in 2008, this process can take the form of a massive economic recession.

The ultimate lesson to be derived from all of this is that allocation matters. War is neither sustainable nor productive. Wars do not create. They ravage and annihilate. Wartime spending creates wasteful economic activity in service of death, destruction, and desolation. Wasteful stimulus spending creates malinvestments which ultimately destabilize the economy. Markets, saving, entrepreneurship, and natural investment should drive economic activity, rather than the despotic monetary planning that is Keynesian economic policy.

Featured Image: Moyers

Disclaimer: The views published in this journal are those of the individual authors or speakers and do not necessarily reflect the position or policy of Berkeley Economic Review staff, the Undergraduate Economics Association, the UC Berkeley Economics Department and faculty,  or the University of California, Berkeley in general.
