Cold War II: The U.S. is losing its economic advantage in a new era of global conflict

Authored by tabletmag.com and submitted by Strongbow85

Can a second cold war be avoided? The brutal proxy war in Ukraine between Russia and the U.S. and its allies, combined with deepening trade and military rivalries between the U.S. and its allies and China, has made the question anachronistic. We are in Cold War II now.

A cold war is a conflict among rival powers that is waged by means short of direct combat between their forces and direct attacks on their homelands. A quick survey of the methods and means that characterize such a conflict shows they are already in effect. Proxy war? Check. Arms races? Check. Trade embargoes and financial sanctions? Check. Sabotage? Somebody blew up Russia's Nord Stream pipelines in the Baltic Sea. Propaganda wars? Check.

Russia, China, and Iran claim they stand for a multipolar world no longer oppressed by U.S. imperial hegemony. Liberal internationalists in America and Europe claim that we are in a Manichean struggle between freedom and democracy on the one hand and, on the other, reactionary autocracy, symbolized by the otherwise quite different Russian, Chinese, and Iranian regimes. In both camps, critics of regime policy are smeared as apologists for the enemy, regardless of whether that enemy is NATO or Putin's Russia.

But while high-flown ideological battles between Putinism and Western liberalism have often taken center stage, the outcome of the new global conflict rests largely on economic competition. In the last cold war, the American economic system proved stronger and more robust than the Soviet-style command economy. Yet, in a strange turn, since the end of the last cold war America has rapidly pivoted away from the industrial model that proved so successful in the 20th century. The U.S. elite have wagered on transforming the country into an information and services-based economy. China, meanwhile, has adopted the older U.S. model of state-private sector cooperation, in the process becoming the world’s manufacturing base.

One side of the new cold war is led by countries like China and Russia, whose economic power rests on their control of physical goods. On the other side, the Western alliance dominates the financial and information sectors of the virtual economy. Neither side is completely independent of the other, but the division between them is unequal. Despite protests over Ukraine, Western Europe still needs to buy Russian gas to avoid freezing in the winter. The U.S. assails the Chinese government, but remains utterly reliant on China to supply Americans with critical goods like antibiotics.

As in Cold War I, the current conflict has produced a miscellaneous group of “nonaligned” nations that seek to keep their distance from both camps. Two of the world’s most populous democracies, India and Brazil, along with a majority of non-Western countries, pointedly refused to join North America and Western Europe in denouncing and sanctioning Russia for its invasion of Ukraine. And many nations, including Israel, prefer to be on good terms with both the U.S. and China.

Scholars still debate when the first Cold War started. Was it the Greek Civil War that began in 1946? The Berlin airlift in 1948? The Korean War that began in 1950? Or did the first Cold War begin even earlier, during World War II? In the same way, tomorrow’s historians can argue about when Cold War II started. The date that I nominate is 2008—when Vladimir Putin responded to the possibility of Georgian membership in NATO by invading Georgia, and when China demonstrated its anti-satellite capability to the U.S. by shooting down one of its own satellites—a metaphorical “shot across the bow.” Whatever its origins, in Cold War II the Ukraine war represents as dramatic an escalation of hostilities as the Korean War did in Cold War I. Whether the second cold war lasts nearly half a century, as the first did, remains to be seen.


In modern world wars and cold wars, manufacturing capacity both defines the great powers and shapes the outcomes of conflicts among them. In the pre-industrial era, except for a few trading posts that levied tolls on long-distance trade, the major power resources were populations of slave or serf or peasant farmers. Tribal warriors could gallop into a country, bully large numbers of farmers into paying them tribute, and replace the former landlords.

In the industrial era, however, no country can be a great military power without the ability to make most, if not necessarily all, of its own armaments and to rely chiefly on its own population for soldiers and spies. The manufacturing industries on which modern military power depends are characterized by increasing returns to scale, from steel-making to automobile manufacturing to aerospace and computer production. Increasing-returns industries benefit from large markets, preferably large internal markets in populous countries. It is no coincidence that even in today's allegedly borderless global economy, the global markets in these strategic industries tend to be dominated by firms based in the most populous developed nations—the U.S., Japan, Germany, and—increasingly—China.

To be a great power or a superpower in the industrial age, then, it is necessary but not sufficient to have a large domestic population of consumers and workers, preferably sharing a common language and a sense of national identity; secure access to agricultural and mineral resources; and, if possible, the strategic depth enabled by a continental or subcontinental territory. That these are not enough in themselves is proven by the cases of present-day India and Brazil, the latter of which, according to an old joke, is the superpower of the future and always will be.

Given the importance of large home markets for the industries that are the basis of national military power, the scramble for empire among the industrializing nations of Europe, as well as Japan, in the second half of the 19th century and the first half of the 20th made strategic sense, even if it was immoral from various perspectives. There were two ways to create a domestic consumer market and workforce large enough to support the major industries needed for great-power status. One was to create a giant nation-state with a single internal market and a culturally, if not ethnically, homogeneous majority. This is what the U.S. did. The other was to cobble together a multinational empire, as Japan did, or to hold together an existing multinational empire and modernize it, as the Soviet Union attempted.

One of the pleasures of watching James Bond movies from the 1960s is the premise that even then a tripolar world still existed, with suave and cunning Brits sharing global hegemony with uncouth Americans and boorish Russians. But after 1945, to the surprise of American and Soviet policymakers, Britain’s economic troubles, more than the loss of its colonial empire, led the U.K. to tumble down the great-power stairwell and become like France: a power of the second rank.

British thinkers had long speculated that the U.S., once it industrialized, would dwarf Britain, as Britain had earlier dwarfed the Netherlands. Proponents of "imperial federation" like Sir John Seeley argued for a customs union of Britain and its "white dominions" of Australia, Canada, and New Zealand. A common external tariff, of the kind championed by the turn-of-the-century British politician Joseph Chamberlain, could have created a home market for this Greater Britain capable of rivaling those of the United States and Imperial Germany.

The idea was not crazy. If the U.K., Canada, Australia, and New Zealand formed a single trading bloc, it would have a combined population of about 138 million (as of 2022), comparable to Russia's 146 million. Calculated according to purchasing power parity (PPP), this Greater Britain would have a GDP of more than $7 trillion, greater than the roughly $5 trillion PPP GDP of the Russian Federation. To be sure, as the French economist Jacques Sapir has argued, GDP measurements probably exaggerate the size of the financialized British economy and understate that of the Russian economy, with its real-world strengths in energy, minerals, and manufacturing.

But Greater Britain was not to be. In the late 19th century, Canada, Australia, and New Zealand insisted on protecting their own industries with their own national tariffs. And British financial interests, based in the City of London, defeated the British manufacturing interest and equated free trade with virtue and protectionism with sin—a process that was repeated later in the U.S. in the 1990s and 2000s, with similar disastrous results for U.S. industrial and military power. The result was that in spite of Britain’s success at innovation in industries ranging from television to atomic power to pharmaceuticals, the production and scaling-up happened elsewhere, in the U.S. or Europe or Asia. Britain became a military dependency of the U.S.—“Airstrip One,” as it is known in George Orwell’s 1984.

Two other great powers that suffered defeats in the world wars or the cold war of the 20th century were Japan and Russia. It is easy to forget that Japan is almost as populous as Russia (125 million people to Russia's 143 million), although the Japanese population is crammed into a few islands instead of spread across 11 Eurasian time zones. Japan's Self-Defense Forces are among the world's largest militaries. But the militarists who led Japan miscalculated in believing they could conquer Asia in the face of American, British, and Soviet opposition. A dependency of the U.S. during Cold War I, Japan has emerged as an important ally of the U.S. against China in today's Cold War II.

Russia’s power in the 20th century was weakened by its communist regime. An older generation of communist fellow travelers in the West used to claim that at least Stalin had to be given credit for his program of crash industrialization that helped the USSR survive the Nazi onslaught. Writing in 1931, Stalin was right about Russia’s relative backwardness:

One feature of the history of old Russia was the continual beatings she suffered for falling behind, for her backwardness. She was beaten by the Mongol Khans. She was beaten by the Turkish beys. She was beaten by the Swedish feudal lords. She was beaten by the Polish and Lithuanian gentry. She was beaten by the British and French capitalists. She was beaten by the Japanese barons. All beat her—for her backwardness: for military backwardness, for cultural backwardness, for political backwardness, for industrial backwardness, for agricultural backwardness. She was beaten because to do so was profitable and could be done with impunity ... We are fifty or a hundred years behind the advanced countries. We must make good this distance in ten years. Either we do it, or they crush us.

But Russia was rapidly industrializing under the late czarist regime. The communist revolution of 1917, followed by the Civil War, the exodus or imprisonment and execution of many talented Russians, the confiscation of all farmland and industry, the state-engineered Ukrainian famine, and the military and political purges of the 1930s, was a succession of economic as well as social and moral disasters that undermined Soviet military strength.

That the problem with the Soviet economy was communism itself, not Russian culture or some other local factor, is clear from the history of Marxist-Leninist state socialism elsewhere. The Cold War provided unique experiments in the form of divided countries: Germany, Korea, and China. In each case, the noncommunist section quickly surpassed the communist-controlled territory in prosperity. Add to this the poverty and stagnation of Castro's Cuba, the rapid growth of the Chinese economy after Mao's successors abandoned Marxism-Leninism for Market-Leninism, and similar growth in modern Vietnam, and the case against communist-style state socialism is closed.

The Cold War was viewed by many on both sides at the time as a contest between two ways to organize a modern industrial economy, “capitalism” and “communism” or “socialism.” Looking back, it is clear that instead Cold War I was a three-way contest between communism and two kinds of capitalism—liberal or free market capitalism, and developmentalism, characterized by a mixed economy and state-directed industrial policy in support of targeted strategic industries.

Examples of the developmental state are familiar in East Asia: Japan and the four Asian Tigers (Hong Kong, South Korea, Taiwan, and Singapore), plus post-Mao China. But the East Asians borrowed the developmentalist model from Germany and the United States, which in their successful attempts to catch up with industrial Britain in the 19th century had used their own variants of the tradition, associated with Friedrich List in Germany and Alexander Hamilton and Henry Clay in the U.S. The roots of developmentalist economics can be traced back to mercantilism and cameralism in early modern Europe and even further back to Renaissance Italy. (There was no “fascist model” of economics. Mussolini’s regime might be classified as an authoritarian developmentalist state, but the short-lived Nazi economy was based first on preparation for war and then on plunder and slavery.)

Ironically, during the Cold War, when the U.S. supposedly illustrated the virtues of free enterprise, it had its own successful developmental-state industrial policy, orchestrated by the Defense Department through the Defense Advanced Research Projects Agency (DARPA) and other agencies. In the 1990s, libertarians and neoliberals claimed that the information technology revolution proved the superiority of the free market to government when it comes to innovation. But the major tools of the computer age, from digitization to the global internet to the computer mouse, were developed by government contractors reliant on U.S. taxpayer money.

It is no coincidence that U.S. productivity and innovation sputtered in this century, when neoliberal Democrats and libertarian Republicans decided to let the free market develop the next wave of technologies. It turns out that venture capitalists and advertisers are more interested in addictive online sites like Facebook and Twitter than in robots and cures for cancer. Without exception, the major advances in basic technology during the post-1980s era of free market utopianism have been largely funded by the federal government. Think of the sequencing of the human genome, the vaccines to combat COVID, Tesla's electric cars, and the rockets of Elon Musk and Jeff Bezos. Neoliberal America, symbolized by Silicon Valley, is living on the technological capital inherited from developmentalist America, symbolized by the Pentagon.

This is critically important because world wars, hot and cold, are ultimately wars of industrial attrition. The three world wars that preceded today's second cold war—World Wars I and II and Cold War I—demonstrate the importance of industrial power for victory. Imperial Germany, Nazi Germany, and their allies were doomed once the U.S. mobilized its industrial might and entered the wars against them. When the Soviet Union under Gorbachev asked for a truce, the USSR was outmatched by the combined resources of the U.S., Western Europe, Japan, and its own former ally, communist China. In World War II, the U.S. and its allies directly bombed the factories and arsenals of their enemies. In World War I and Cold War I, the economies and morale of Imperial Germany and the Soviet Union, respectively, were strained to the breaking point without any foreign troops on the soil of the defeated power's homeland.

Great powers, then, must not only have substantial populations and resources, but must also use them to support a world-class national industrial base in a prolonged and sustainable way. Neither state socialist crash programs that peter out over time nor bubbles and booms inflated by central banks in liberal market economies are adequate. To date in the industrial era, developmental states, both authoritarian like present-day China and democratic like the midcentury United States, have been more successful than communist regimes and free market liberal regimes. China has risen to the status of a second superpower on the basis of internal development and external trade, without waging the wars of choice on which the U.S. has squandered blood and treasure for a generation. Meanwhile, Vladimir Putin’s invasion of Ukraine is a reminder of the costs of wars of foreign conquest. So is the costly and humiliating failure of America’s two-decade misadventure in Afghanistan.

Daniferd on January 8th, 2023 at 14:55 UTC »

To offer a different perspective, I think population is not discussed enough and is perhaps one of the biggest determinants of the future status of superpowers on the global stage.

I extrapolate examples from the 19th to 20th centuries during the Age of Imperialism. This period saw European countries solidify their positions as the world's most economically, militarily, and technologically dominant states. From this period up until the Cold War, Europe's population was twice that of Africa and between one-fifth and one-third of Asia's. The period after saw Europe's population growth slow significantly, while that of impoverished countries skyrocketed.

Fast-forward to today, when the superpowers and potential contenders are countries with enormous populations. The US and China hold that status now, while India, Brazil, Nigeria, and Indonesia might have a chance to be contenders one day.

My point being, economic advantage is mostly determined by the population that you can draw from. In the 1940s, China's population exceeded the US population by only several hundred million. Now, it exceeds it by roughly a billion. Therefore I believe the greatest disadvantage the US faces is that its population isn't growing fast enough to maintain its economic dominance, and the loss of that economic dominance is the loss of hegemony.

(Sorry if it’s incoherent, I wrote this while in transit)

Tricky-Astronaut on January 8th, 2023 at 05:01 UTC »

If the history of the world wars of the industrial era holds any lessons, this is one: For the small number of populous nations that can aspire to great power status in the 21st century, the ability to bomb and invade weak, backward countries will be far less consequential than government competence and superior manufacturing capability.

One century later, the same conclusion will be drawn, once again citing the past century as proof. And many leaders will say that this time we have learned the lesson.

Strongbow85 on January 8th, 2023 at 03:50 UTC »

Submission Statement: Michael Lind analyzes the economic policies that led the United States to victory during the Cold War while emphasizing the importance of industrial power during conflict. The author notes that "to date in the industrial era, developmental states, both authoritarian like present-day China and democratic like the midcentury United States, have been more successful than communist regimes and free market liberal regimes." Lind cites the United Kingdom's decline to a power of the "second rank" after financial interests (favoring free trade) defeated manufacturing interests associated with protectionism. The industrial powerhouse that led the United States to victory during the first Cold War has since transformed into an information and services-based economy. According to Lind, not only have we already entered a new Cold War, but the United States' economic policies have put the country at a disadvantage.


Key points from the article:

Looking back, it is clear that instead Cold War I was a three-way contest between communism and two kinds of capitalism—liberal or free market capitalism, and developmentalism, characterized by a mixed economy and state-directed industrial policy in support of targeted strategic industries.

In the 1990s, libertarians and neoliberals claimed that the information technology revolution proved the superiority of the free market to government when it comes to innovation. But the major tools of the computer age, from digitization to the global internet to the computer mouse were developed by government contractors reliant on U.S. taxpayer money.

It turns out that venture capitalists and advertisers are more interested in addictive online sites like Facebook and Twitter than in robots and cures for cancer. Without exception the major advances in basic technology during the post-1980s era of free market utopianism have been largely funded by the federal government. Think of the sequencing of the human genome, the vaccines to combat COVID, electric cars like Tesla, and the rockets of Elon Musk and Jeff Bezos. Neoliberal America, symbolized by Silicon Valley, is living on the technological capital inherited from developmentalist America, symbolized by the Pentagon.

This is critically important because world wars, hot and cold, are ultimately wars of industrial attrition. The three world wars that preceded today’s second cold war—World Wars I and II and Cold War I—demonstrate the importance of industrial power for victory.

Yet, in a strange turn, since the end of the last cold war America has rapidly pivoted away from the industrial model that proved so successful in the 20th century. The U.S. elite have wagered on transforming the country into an information and services-based economy. China, meanwhile, has adopted the older U.S. model of state-private sector cooperation, in the process becoming the world’s manufacturing base.

In the same way, tomorrow’s historians can argue about when Cold War II started. The date that I nominate is 2008—when Vladimir Putin responded to the possibility of Georgian membership in NATO by invading Georgia, and when China demonstrated its anti-satellite capability to the U.S. by shooting down one of its own satellites—a metaphorical “shot across the bow.” Whatever its origins, in Cold War II the Ukraine war represents as dramatic an escalation of hostilities as the Korean War did in Cold War I.