Feature

The Dustbin of History: Mutual Assured Destruction

According to Darwinism, species that adapt to their environment thrive; those that fail to evolve face extinction. The same is true for ideas. Marxism evolved from the primordial swamp of the Industrial Revolution but lies gasping for relevance after the collapse of the Soviet Union. Asian values -- fashionable when South Korea and Thailand were economic success stories and the West was mired in recession -- lost their luster following the 1997 Asian financial crisis. Mutual assured destruction kept the two Cold War superpowers in check but offers little assurance to nations threatened by suicide terrorists. The Club of Rome's doomsday prophecies of global starvation are now starved for credibility. The threat of the military-industrial complex is taken seriously only in Hollywood films and on conspiracy newsgroups. Dependency theory thrived amidst a backlash against economic imperialism yet withered in a globalized era of free trade and foreign investment.

Are these ideas really doomed to oblivion? Or, for all their flaws, do they still have some relevance? Can they make a comeback? FOREIGN POLICY has invited six notable minds to sort through the dustbin of history and share what they found.


It is a clue to the eventual demise of mutual assured destruction (MAD) that the term was coined by a critic who sought to highlight how ludicrous the concept was. In the 1960s, Donald Brennan -- an analyst at the conservative Hudson Institute, who was making the case for ballistic missile defense -- used the acronym MAD to ridicule the idea that in a nuclear war, or even a large conventional conflict, each side should be prepared to destroy the other's cities and society. Of course, this objective was not sensible, but MAD proponents argued that was the point: The outcome would be so dreadful that both sides would be deterred from starting a nuclear war or even taking actions that might lead to it.

Throughout much of the Cold War, U.S. declaratory policy (i.e., what policymakers said in public) closely approximated MAD. The view, most clearly articulated by then Secretary of Defense Robert McNamara, was that there was little utility in adding strategic weapons above those needed for MAD, that nuclear superiority was meaningless, that defense was useless, and that this bizarre configuration was in everyone's interest. Indeed, the implication was that the United States should not only avoid menacing the Soviets' retaliatory capability but also help the Soviets make their weapons invulnerable -- an idea that intrigued McNamara.

Critics like military strategists Herman Kahn and Colin Gray disagreed. They argued that nuclear warheads were immensely destructive but not qualitatively different from previous weapons of warfare. Consequently, the traditional rules of strategy applied: Security policy could only rest on credible threats (i.e., those that it made sense to carry out). Adopting a policy that amounted to throwing up your hands and destroying the world if war actually broke out was not only the height of irresponsibility; MAD also failed to address the United States' main strategic concern: preventing the Soviets from invading Western Europe. The stability MAD was supposed to provide would actually have allowed U.S. adversaries to use force below the nuclear level whenever it was to their advantage. If the United States could not credibly threaten to escalate a conflict by using nuclear weapons, the Soviets would have had free rein to fight and win a conventional war in Europe.

Privately, most generals and top civilian leaders were never convinced of the utility of MAD, and that skepticism was reflected in both Soviet and U.S. war planning. Each side strove for advantage, sought to minimize damage to its society, deployed defenses when deemed practical, and sought limited nuclear options that were militarily effective. Yet, for all these efforts, it is highly probable that a conventional war in Europe or, even more likely, the limited use of nuclear weapons would have prompted a full-scale nuclear war that would have resulted in mutual destruction.

MAD's credibility plummeted even further during the last stages of the Cold War, as the Soviet military buildup convinced U.S. policymakers that the U.S.S.R. did not believe in MAD and was seeking nuclear advantage. The Soviet Union's invasion of Afghanistan and its African adventures revealed that MAD could not protect all U.S. interests. In response, U.S. leaders talked about the significance of nuclear superiority and about the possibility of surviving a nuclear war. Most dramatically, President Ronald Reagan called for missile defense, declaring in 1983 that "to look down to an endless future with both of us sitting here with these horrible missiles aimed at each other and the only thing preventing a holocaust is just so long as no one pulls this trigger -- this is unthinkable."

Proponents of Reagan's anti-MAD policies credited them with helping to bring down the Soviet empire. Even those who disagreed had little reason to resurrect MAD in the aftermath of the Cold War. When the United States emerged as the dominant military power, defense became a much more attractive option than deterrence. Why threaten to punish another country for an attack when you can beat it back? According to MAD, trying to protect yourself is destabilizing because it threatens the other side. In a world where the United States faces no peer competitor that could threaten it with complete annihilation, thinking in these terms makes no sense. That's why no U.S. president since Jimmy Carter has been willing to renounce missile defense, despite the clear lack of foolproof technology. Indeed, even the simplest missiles are difficult to intercept. Ironically, primitive warheads that tumble in flight -- the very types of missiles that might be launched by low-tech U.S. adversaries such as Iraq or North Korea -- are harder to track than are more sophisticated ones. And adversaries could deliver nuclear weapons in a variety of other ways, such as by airplanes, ships, and cargo containers.

The threat of terrorism also makes defense preferable to deterrence. How do you deter a suicide bomber? In theory, the U.S. government could concoct a minimalist form of MAD by threatening retaliation in the form of killing terrorists' families or destroying Muslim holy sites. But these options are politically unpalatable. Defense, however, may not work either. Warding off 99 terrorist attacks does little good if the 100th succeeds, especially if weapons of mass destruction (WMD) are used. A defensive strategy that could achieve even 99 percent efficiency is hard to imagine short of incredible worldwide cooperation, expense, and sacrifice of civil liberties.

Confronted with these dilemmas, the Bush administration has turned to what it calls preemption, but what is actually prevention. (The difference between the two is in the timescale: The former means an attack against an adversary that is about to strike; the latter is a move to prevent a threat from fully emerging.) An adversary who cannot be deterred and whose attacks cannot be defended against must be stopped before it gains the capability to do great harm. This strategy makes more sense in theory than in practice, however. Moving before the threat fully materializes is rational only if the government is quite certain that failing to do so will lead to a disastrous attack by an adversary. But predicting the future accurately is quite difficult. The other side of this coin is that an adversary who believes the United States is certain to attack will have nothing to lose by resorting to WMD.

This doctrine of prevention has brought the United States full circle, with the Bush administration now echoing the refrain of early MAD critics who said that nuclear weapons were not qualitatively different from other kinds. As such, the White House has rejected one of the central precepts of MAD: Nuclear weapons are good for deterrence only. Instead, the Bush administration sees (or perhaps is looking for) significant preemptive military uses for nuclear weapons, such as destroying an adversary's WMD sites (silos or weapons facilities) that are buried deep underground.

MAD does not seem appropriate for rivals in the Third World either. In certain scenarios, deterrence still works to some degree. For instance, it would be suicidal for Pakistan to attack India with nuclear weapons. Even if Pakistan were able to destroy India's nuclear stockpile, India's armed forces could still dismember Pakistan. However, a nuclear war could begin if the Indian government launched a large military incursion aimed at destroying terrorist camps or punishing Pakistan for supporting these groups. The Pakistanis might decide, in turn, to use nuclear weapons on their own soil against invading forces. Indian officials have said that they would respond with nuclear weapons, but this threat might not be sufficiently credible to deter Pakistan in what would be a desperate situation. MAD may then be in the dustbin of history, but states that employ nuclear weapons or force their adversaries to do so may find themselves there as well.

Feature

The Dustbin of History: Asian Values


About a decade ago, East Asia was hot and so were "Asian values." In explaining East Asia's extraordinary economic development -- what the World Bank termed a "miracle" -- many believed that culture played a pivotal role. After all, so many Third World countries had tried to climb their way out of poverty, and only those of East Asia had fully succeeded. Singapore's brilliant patriarch Lee Kuan Yew became a world-class pundit, explaining how the unique culture of Confucianism permeated Asian societies. Many scholars agreed, perhaps none more forcefully than Joel Kotkin, who in his fascinating 1993 book, Tribes, essentially argued that if you want to succeed economically in the modern world, be Jewish, be Indian, but above all, be Chinese.

I have to confess that I found this theory appealing at first, since I am of Indian origin. But then I wondered, if being Indian is a key to economic success, what explained the dismal performance of the Indian economy over the four decades since its independence in 1947 or, for that matter, for hundreds of years before that? One might ask the same question of China, another country with an economy that performed miserably for hundreds of years until two decades ago. After all, if all you need are the Chinese, China has had hundreds of millions of them for centuries. As for Jews, they have thrived in many places, but the one country where they compose a majority, Israel, was also an economic mess until only recently. All three countries' economic fortunes improved markedly in the last three decades. But this turnaround did not occur because they got themselves new cultures. Rather, their governments changed specific policies and created more market-friendly systems. Today, China is growing faster than India, but that has more to do with the pace of China's economic reform than with the superiority of the Confucian ethic over the Hindu mind-set.

It is odd that Lee Kuan Yew is such a fierce proponent of cultural arguments. Singapore is not so culturally different from its neighbor, Malaysia. Singapore is more Chinese and less Malay, but compared with the rest of the world, the two are quite similar societies. But more so than its neighbors, Singapore has had an effective government that has pursued wise economic policies. It's not Confucius but Lee Kuan Yew who explains Singapore's success. The simplest proof is that, as Malaysia has copied the Singaporean model, it has also succeeded economically.

The discussion about Asian values was not simply a scholarly debate. Many Asian dictators used arguments about their region's unique culture to stop Western politicians from pushing them to democratize. The standard rebuttal was that Asians prefer order to the messy chaos of democracy. But East Asia's recent political history makes a powerful case for the universality of the democratic model -- if it is done right. Unlike other Third World countries, many in the region liberalized their economies first and then democratized their politics, thereby mirroring the sequence that took place in 19th-century Europe. The result has been the creation of remarkably stable democratic systems in Taiwan and South Korea, with more mixed but still impressive results in Thailand and Malaysia.

The point is not that culture is unimportant. On the contrary, it matters greatly. Culture represents the historical experience of a people, is embedded in their institutions, and shapes their attitudes and expectations about the world. But culture can change. German culture in 1939 was much different from what it became in 1959, just 20 years later. Europe, once the heartland of hypernationalism, is now post-nationalist; its states are willing to cede power to supranational bodies in ways Americans can hardly imagine. The United States was once an isolationist republic with a deep suspicion of standing armies. Today, it is a global hegemon with garrisons around the world. The Chinese were once backward peasants. Now they are smart merchants. Economic crises, war, political leadership -- all these circumstances change culture.

A century ago, when East Asia seemed immutably poor, many scholars (most famously German sociologist Max Weber) argued that Confucian-based cultures discouraged all the attributes necessary for success in capitalism. A decade ago, when East Asia was booming, scholars turned this explanation on its head, arguing that Confucianism actually emphasized the essential traits for economic dynamism. Then the wheel turned again, and many came to see in Asian values all the ingredients of crony capitalism. Lee Kuan Yew was compelled to admit that Confucian culture had bad traits as well, among them a tendency toward nepotism and favoritism. But surely recent revelations about some of the United States' largest corporations have shown that U.S. culture has its own brand of crony capitalism.

Weber linked northern Europe's economic success to its Protestant ethic and predicted that the Catholic south would stay poor. In fact, Italy and France have grown faster than Protestant Europe over the last half century. One may use the stereotype of shifty Latins and a mañana work ethic to explain the poor performance of some countries in the Southern Hemisphere, but then how does one explain Chile? Its economy is performing nearly as well as the strongest of the Asian tigers. Indeed, Chile's success is often attributed to another set of Latin values: strong families, religious values, and determination.

The truth is that there is no simple answer to why certain societies succeed at certain times. When a society does prosper, its success often seems inevitable in retrospect. So the instinct is to examine successful societies and search within their cultures for the seeds of success. Cultures are complex; one finds in them what one wants. If one wants to find cultural traits of hard work and thrift within East Asia, they are there. If one wants to find a tendency toward blind obedience and nepotism, these too exist. Look hard enough and most cultures exhibit these traits.

One would think that the experience with the Asian values debate would have undercut these kinds of cultural arguments. Yet having discarded this one, many have moved on to another. Now it is Islam's turn, but this time as a culture of evil. Rather than faulting bad leadership, politics, and policies in Muslim countries, many in the West -- including British historian Paul Johnson, Italian journalist Oriana Fallaci, and U.S. evangelical leader Pat Robertson -- have found it more comforting to fall back on grand generalizations about Islam. They will find that the group that agrees with them most strongly is the Islamic fundamentalists, who also believe that Islam's true nature is incompatible with the West, modernity, and democracy. But history will disprove this new version of the culture theory as it has the last.