Think Again: Nuclear Weapons

President Obama’s pledge to rid the world of atomic bombs is a waste of breath. But not for the reasons you might imagine.

""Nuclear Weapons Are the Greatest Threat to Humankind."

No. But you might think so if you listen to world leaders right now. In his first address to the U.N. Security Council, U.S. President Barack Obama warned apocalyptically, "Just one nuclear weapon exploded in a city -- be it New York or Moscow, Tokyo or Beijing, London or Paris -- could kill hundreds of thousands of people. And it would badly destabilize our security, our economies, and our very way of life." Obama has put nuclear disarmament back on the table in a way it hasn't been for decades by vowing to pursue a nuclear-free world, and, with a handful of big treaty negotiations in the works, he seems to think 2010 has become a critical year.

But the conversation is based on false assumptions. Nuclear weapons certainly are the most destructive devices ever made, as Obama often reminds us, and everyone from peaceniks to neocons seems to agree. But for more than 60 years now all they've done is gather dust while propagandists and alarmists exaggerate their likelihood of exploding -- it was a certainty one would go off in 10 years, C.P. Snow authoritatively proclaimed in 1960 -- and nuclear metaphysicians spin fancy theories about how they might be deployed and targeted.

Nuclear weapons have had a tremendous influence on the world's agonies and obsessions, inspiring desperate rhetoric, extravagant theorizing, and frenetic diplomatic posturing. However, they have had very limited actual impact, at least since World War II. Even the most ingenious military thinkers have had difficulty coming up with realistic ways nukes could conceivably be applied on the battlefield; moral considerations aside, it is rare to find a target that can't be struck just as well by conventional weapons. Indeed, their chief "use" was to deter the Soviet Union from instituting Hitler-style military aggression, a chimera considering that historical evidence shows the Soviets never had genuine interest in doing anything of the sort. In other words, there was nothing to deter.

Instead, nukes have done nothing in particular, and have done that very well. They have, however, succeeded in being a colossal waste of money -- an authoritative 1998 Brookings Institution study showed the United States had spent $5.5 trillion on nukes since 1940, more than on any program other than Social Security. The expense was even more ludicrous in the cash-starved Soviet Union.

And that does not include the substantial loss entailed in requiring legions of talented nuclear scientists, engineers, and technicians to devote their careers to developing and servicing weapons that have proved to be largely unnecessary and essentially irrelevant. In fact, the only useful part of the expenditure has been on devices, protocols, and policies to keep the bombs from going off, expenditures that would, of course, not be necessary if the bombs didn't exist.

"Obama's Plan to Eliminate Nuclear Weapons Is a Good One."

Not necessarily. Obama's plan, unveiled before the world in a speech in Prague last April, represents an ambitious attempt to rid the world of nukes. Under the president's scheme, developing countries would have access to an internationally monitored bank of nuclear fuel but would be barred from producing weapons-grade materials themselves. Existing warheads would be secured, and major powers such as Russia and the United States would pledge to scale back their weapons programs. In September, the U.N. General Assembly passed a resolution in support of Obama's proposal, giving his massive project some institutional backing.

But all of this is scarcely needed. Nuclear weapons are already disappearing, and elaborate international plans like the one Obama is pushing aren't needed to make it happen. During the Cold War, painstakingly negotiated treaties did little to advance the cause of disarmament -- and some efforts, such as the 1972 SALT Agreement, made the situation worse from a military standpoint. With the easing of tensions after the Cold War, a sort of negative arms race has taken place, and the weapons have been going away more or less by themselves as policymakers wake up to the fact that having fewer useless things is cheaper than having more of them. By 2002, the number of deployed warheads in Russian and U.S. arsenals had dropped from 70,000 to around 30,000, and it now stands at less than 10,000. "Real arms control," wistfully reflected former U.S. assistant secretary of state for arms control Avis Bohlen in an essay last May, "became possible only when it was no longer necessary."

Indeed, both sides have long found that arms reductions were made more difficult if they were accomplished through explicit mutual agreements requiring that an exquisitely nuanced arrangement be worked out for every abandoned nut and bolt. In 1991, for example, the Americans announced that they were unilaterally reducing tactical nuclear weapons, and the Soviet Union soon followed, a development hailed by a close observer, Brown University scholar Nina Tannenwald, as "the most radical move to date to reverse the arms race" and a "dramatic move away from 'warfighting' nuclear postures." This "radical" and "dramatic" feat was accomplished entirely without formal agreement. For the most part, the formal arms-control process has been left trying to catch up with reality. When the U.S. Senate in 1992 ratified a nuclear arms reduction treaty, both sides had already moved to reduce their weapons even further than required by that agreement.

France has also unilaterally cut its arsenal very substantially -- though explaining why France needs any nukes is surely a problématique worthy of several impenetrable dissertations. (Perhaps to threaten former colonies that might otherwise abandon French for English?) The British, too, are under domestic political pressure to cut their nuclear arsenal as they wrestle with how many of their aging nuclear subs they need to hang on to (how about: none?), and the Chinese have built far fewer of the weapons than they could have -- they currently stock just 180.

A negative arms race is likely to be as chaotic, halting, ambiguous, self-interested, and potentially reversible as a positive one. However, history suggests that arms reduction will happen best if arms negotiators keep out of the way. Formal disarmament agreements of the kind Obama seeks are likely simply to slow and clutter the process.

But nuclear weapons are not likely to vanish entirely, no matter the method. They cannot be un-invented, and there will always be nuclear metaphysicians around, spinning dark, improbable, and spooky theoretical scenarios to justify their existence.

"A Nuclear Explosion Would Cripple the U.S. Economy."

Only if Americans let it. Although former CIA chief George Tenet insists in his memoirs that one "mushroom cloud" would "destroy our economy," he never bothers to explain how the instant and tragic destruction of three square miles somewhere in the United States would lead inexorably to national economic annihilation. A nuclear explosion in, say, New York City -- as Obama so darkly invoked -- would obviously be a tremendous calamity that would roil markets and cause great economic hardship, but would it extinguish the rest of the country? Would farmers cease plowing? Would manufacturers close their assembly lines? Would all businesses, governmental structures, and community groups evaporate?

Americans are highly unlikely to react to an atomic explosion, however disastrous, by immolating themselves and their economy. In 1945, Japan weathered not only two nuclear attacks but intense nationwide conventional bombing; the horrific experience did not destroy Japan as a society or even as an economy. Nor has persistent, albeit nonnuclear, terrorism in Israel caused that state to disappear -- or to abandon democracy.

Even the notion that an act of nuclear terrorism would cause the American people to lose confidence in the government is belied by the traumatic experience of Sept. 11, 2001, when expressed confidence in America's leaders paradoxically soared. And it contradicts decades of disaster research that documents how socially responsible behavior increases under such conditions -- seen yet again in the response of those evacuating the World Trade Center on 9/11.

"Terrorists Could Snap Up Russia's Loose Nukes."

That's a myth. It has been soberly, and repeatedly, asserted by Harvard University's Graham Allison and others that Osama bin Laden gave a group of Chechens $30 million in cash and two tons of opium in exchange for 20 nuclear warheads. Then there is the "report" that al Qaeda acquired from Central Asian sources a Russian-made suitcase nuclear bomb with a serial number of 9999 that could be exploded by mobile phone.

If these attention-grabbing rumors were true, one might think the terrorist group (or its supposed Chechen suppliers) would have tried to set off one of those things by now or that al Qaeda would have left some trace of the weapons behind in Afghanistan after it made its very rushed exit in 2001. Instead, nada. It turns out that getting one's hands on a working nuclear bomb is actually very difficult.

In 1998, a peak year for loose nuke stories, the head of the U.S. Strategic Command made several visits to Russian military bases and pointedly reported, "I want to put to bed this concern that there are loose nukes in Russia. My observations are that the Russians are indeed very serious about security." Physicists Richard Garwin and Georges Charpak have reported, however, that this forceful firsthand testimony failed to persuade the intelligence community "perhaps because it [had] access to varied sources of information." A decade later, with no credible reports of purloined Russian weapons, it rather looks like it was the general, not the spooks, who had it right.

By all reports (including Allison's), Russian nukes have become even more secure in recent years. It is scarcely rocket science to conclude that any nuke stolen in Russia is far more likely to go off in Red Square than in Times Square. The Russians seem to have had no difficulty grasping this fundamental reality.

Setting off a stolen nuke might be nearly impossible anyway, outside of TV's 24 and disaster movies. Finished bombs are routinely outfitted with devices that will trigger a nonnuclear explosion to destroy the bomb if it is tampered with. And, as Stephen Younger, former head of nuclear weapons research and development at Los Alamos National Laboratory, stresses, only a few people in the world know how to cause an unauthorized detonation of a nuclear weapon. Even weapons designers and maintenance personnel do not know the multiple steps necessary. In addition, some countries, including Pakistan, store their weapons disassembled, with the pieces in separate secure vaults.

"Al Qaeda Is Searching for a Nuclear Capability."

Prove it. Al Qaeda may have had some interest in atomic weapons and other weapons of mass destruction (WMD). For instance, a man who defected from al Qaeda after he was caught stealing $110,000 from the organization -- "a lovable rogue," "fixated on money," who "likes to please," as one FBI debriefer described Jamal al-Fadl -- has testified that members tried to purchase uranium in the mid-1990s, though they were scammed and purchased bogus material. There are also reports that bin Laden had "academic" discussions about WMD in 2001 with Pakistani nuclear scientists who did not actually know how to build a bomb.

But the Afghanistan invasion seems to have cut any schemes off at the knees. As analyst Anne Stenersen notes, evidence from an al Qaeda computer left behind in Afghanistan when the group beat a hasty retreat indicates that only some $2,000 to $4,000 was earmarked for WMD research, and that was mainly for very crude work on chemical weapons. For comparison, she points out that the Japanese millennial terrorist group, Aum Shinrikyo, appears to have invested $30 million in its sarin gas manufacturing program. Milton Leitenberg of the Center for International and Security Studies at the University of Maryland-College Park quotes Ayman al-Zawahiri as saying that the project was "wasted time and effort."

Even former International Atomic Energy Agency inspector David Albright, who is more impressed with the evidence found in Afghanistan, concludes that any al Qaeda atomic efforts were "seriously disrupted" -- indeed, "nipped in the bud" -- by the 2001 invasion of Afghanistan and that after the invasion the "chance of al Qaeda detonating a nuclear explosive appears on reflection to be low."

"Fabricating a Bomb Is 'Child's Play.'"

Hardly. An editorialist in Nature, the esteemed scientific journal, did apply that characterization to the manufacture of uranium bombs, as opposed to plutonium bombs, last January, but even that seems an absurd exaggeration. Younger, the former Los Alamos research director, has expressed his amazement at how "self-declared 'nuclear weapons experts,' many of whom have never seen a real nuclear weapon," continue to "hold forth on how easy it is to make a functioning nuclear explosive." Uranium is "exceptionally difficult to machine," he points out, and "plutonium is one of the most complex metals ever discovered, a material whose basic properties are sensitive to exactly how it is processed." Special technology is required, and even the simplest weapons require precise tolerances. Information on the general idea for building a bomb is available online, but none of it, Younger says, is detailed enough to "enable the confident assembly of a real nuclear explosive."

A failure to appreciate the costs and difficulties of a nuclear program has led to massive overestimations of the ability to fabricate nuclear weapons. As the 2005 Silberman-Robb commission, set up to investigate the intelligence failures that led to the Iraq war, pointed out, it is "a fundamental analytical error" to equate "procurement activity with weapons system capability." That is, "simply because a state can buy the parts does not mean it can put them together and make them work."

For example, after three decades of labor and well over $100 million in expenditures, Libya was unable to make any progress whatsoever toward an atomic bomb. Indeed, much of the country's nuclear material, surrendered after it abandoned its program, was still in the original boxes.

"Iranian and North Korean Nukes Are Intolerable."

Not unless we overreact. North Korea has been questing after nuclear capability for decades and has now managed to conduct a couple of nuclear tests that seem to have been mere fizzles. It has also launched a few missiles that have hit their presumed target, the Pacific Ocean, with deadly accuracy. It could do far more damage in the area with its artillery.

If the Iranians do break their solemn pledge not to develop nuclear weapons (perhaps in the event of an Israeli or U.S. airstrike on their facilities), they will surely find, like all other countries in our nuclear era, that the development has been a waste of time (it took Pakistan 28 years) and effort (is Pakistan, with its enduring paranoia about India and a growing jihadi threat, any safer today?).

Moreover, Iran will most likely "use" any nuclear capability in the same way all other nuclear states have: for prestige (or ego-stoking) and deterrence. Indeed, as strategist and Nobel laureate Thomas Schelling suggests, deterrence is about the only value the weapons might have for Iran. Such devices, he points out, "should be too precious to give away or to sell" and "too precious to 'waste' killing people" when they could make other countries "hesitant to consider military action."

If a nuclear Iran brandishes its weapons to intimidate others or get its way, it will likely find that those threatened, rather than capitulating or rushing off to build a compensating arsenal, will ally with others (including conceivably Israel) to stand up to the intimidation. The popular notion that nuclear weapons furnish a country with the ability to "dominate" its area has little or no historical support -- in the main, nuclear threats over the last 60 years have either been ignored or met with countervailing opposition, not with timorous acquiescence. It was conventional military might -- grunts and tanks, not nukes -- that earned the United States and the Soviet Union their respective spheres of influence during the Cold War.

In his 2008 campaign, Obama pointedly pledged that, as president, he would "do everything in my power to prevent Iran from obtaining a nuclear weapon … everything." Let us hope not: The anti-proliferation sanctions imposed on Iraq in the 1990s probably led to more deaths than the bombs dropped on Hiroshima and Nagasaki, and the same can be said for the ongoing war in Iraq, sold as an effort to root out Saddam Hussein's nukes. There is nothing inherently wrong with making nonproliferation a high priority, so long as it is topped with a somewhat higher one: avoiding policies that can lead to the deaths of tens or hundreds of thousands of people under the obsessive sway of worst-case-scenario fantasies.

Obama achieved much on foreign policy in his first year as president by toning down rhetoric, encouraging openness toward international consultation and cooperation, and helping revise America's image as a threatening and arrogant loose cannon. That's certainly something to build on in year two.

The forging of nuclear arms reduction agreements, particularly with the Russians, could continue the process. Although these are mostly feel-good efforts that might actually hamper the natural pace of nuclear-arms reductions, there is something to be said for feeling good. Reducing weapons that have little or no value may not be terribly substantive, but it is one of those nice gestures that can have positive atmospheric consequences -- and one that can appear to justify certain Nobel awards.

The confrontations with Iran and North Korea over their prospective or actual nukes are more problematic. Obama and Secretary of State Hillary Clinton have already contributed big time to the hysteria that has become common coin within the foreign-policy establishment on this issue. It is fine to apply diplomacy and bribery in an effort to dissuade those countries from pursuing nuclear weapons programs: We'd be doing them a favor, in fact. But, though it may be heresy to say so, the world can live with a nuclear Iran or North Korea, as it has lived now for 45 years with a nuclear China, a country once viewed as the ultimate rogue. If push eventually comes to shove in these areas, the solution will be a familiar one: to establish orderly deterrent and containment strategies and avoid the temptation to lash out mindlessly at phantom threats.


Think Again: Sovereignty

The idea of states as autonomous, independent entities is collapsing under the combined onslaught of monetary unions, CNN, the Internet, and nongovernmental organizations. But those who proclaim the death of sovereignty misread history. The nation-state has a keen instinct for survival and has so far adapted to new challenges -- even the challenge of globalization.

"The Sovereign State Is Just About Dead"

Very wrong. Sovereignty was never quite as vibrant as many contemporary observers suggest. The conventional norms of sovereignty have always been challenged. A few states, most notably the United States, have had autonomy, control, and recognition for most of their existence, but most others have not. The polities of many weaker states have been persistently penetrated, and stronger nations have not been immune to external influence. China was occupied. The constitutional arrangements of Japan and Germany were directed by the United States after World War II. The United Kingdom, despite its rejection of the euro, is part of the European Union.

Even for weaker states -- whose domestic structures have been influenced by outside actors, and whose leaders have very little control over transborder movements or even activities within their own country -- sovereignty remains attractive. Although sovereignty might provide little more than international recognition, that recognition guarantees access to international organizations and sometimes to international finance. It offers status to individual leaders. While the great powers of Europe have eschewed many elements of sovereignty, the United States, China, and Japan have neither the interest nor the inclination to abandon their usually effective claims to domestic autonomy.

In various parts of the world, national borders still represent the fault lines of conflict, whether it is Israelis and Palestinians fighting over the status of Jerusalem, Indians and Pakistanis threatening to go nuclear over Kashmir, or Ethiopia and Eritrea clashing over disputed territories. Yet commentators nowadays are mostly concerned about the erosion of national borders as a consequence of globalization. Governments and activists alike complain that multilateral institutions such as the United Nations, the World Trade Organization, and the International Monetary Fund overstep their authority by promoting universal standards for everything from human rights and the environment to monetary policy and immigration. However, the most important impact of economic globalization and transnational norms will be to alter the scope of state authority rather than to generate some fundamentally new way to organize political life.

"Sovereignty Means Final Authority"

Not anymore, if ever. When philosophers Jean Bodin and Thomas Hobbes first elaborated the notion of sovereignty in the 16th and 17th centuries, they were concerned with establishing the legitimacy of a single hierarchy of domestic authority. Although Bodin and Hobbes accepted the existence of divine and natural law, they both (especially Hobbes) believed the word of the sovereign was law. Subjects had no right to revolt. Bodin and Hobbes realized that imbuing the sovereign with such overweening power invited tyranny, but they were predominately concerned with maintaining domestic order, without which they believed there could be no justice. Both were writing in a world riven by sectarian strife. Bodin was almost killed in religious riots in France in 1572. Hobbes published his seminal work, Leviathan, only a few years after parliament (composed of Britain's emerging wealthy middle class) had executed Charles I in a civil war that had sought to wrest state control from the monarchy.

This idea of supreme power was compelling, but irrelevant in practice. By the end of the 17th century, political authority in Britain was divided between king and parliament. In the United States, the Founding Fathers established a constitutional structure of checks and balances and multiple sovereignties distributed among local and national interests that were inconsistent with hierarchy and supremacy. The principles of justice, and especially order, so valued by Bodin and Hobbes, have best been provided by modern democratic states whose organizing principles are antithetical to the idea that sovereignty means uncontrolled domestic power.

If sovereignty does not mean a domestic order with a single hierarchy of authority, what does it mean? In the contemporary world, sovereignty primarily has been linked with the idea that states are autonomous and independent from each other. Within their own boundaries, the members of a polity are free to choose their own form of government. A necessary corollary of this claim is the principle of nonintervention: One state does not have a right to intervene in the internal affairs of another.

More recently, sovereignty has come to be associated with the idea of control over transborder movements. When contemporary observers assert that the sovereign state is just about dead, they do not mean that constitutional structures are about to disappear. Instead, they mean that technological change has made it very difficult, or perhaps impossible, for states to control movements across their borders of all kinds of material things (from coffee to cocaine) and not-so-material things (from Hollywood movies to capital flows).

Finally, sovereignty has meant that political authorities can enter into international agreements. They are free to endorse any contract they find attractive. Any treaty among states is legitimate provided that it has not been coerced.

"The Peace of Westphalia Produced the Modern Sovereign State"

No, it came later. Contemporary pundits often cite the 1648 Peace of Westphalia (actually two separate treaties, Münster and Osnabrück) as the political big bang that created the modern system of autonomous states. Westphalia -- which ended the Thirty Years' War against the hegemonic power of the Holy Roman Empire -- delegitimized the already waning transnational role of the Catholic Church and validated the idea that international relations should be driven by balance-of-power considerations rather than the ideals of Christendom. But Westphalia was first and foremost a new constitution for the Holy Roman Empire. The preexisting right of the principalities in the empire to make treaties was affirmed, but the Treaty of Münster stated that "such Alliances be not against the Emperor, and the Empire, nor against the Publick Peace, and this Treaty, and without prejudice to the Oath by which every one is bound to the Emperor and the Empire." The domestic political structures of the principalities remained embedded in the Holy Roman Empire. The Duke of Saxony, the Margrave of Brandenburg, the Count Palatine, and the Duke of Bavaria were affirmed as electors who (along with the archbishops of Mainz, Trier, and Cologne) chose the emperor. They did not become or claim to be kings in their own right.

Perhaps most important, Westphalia established rules for religious tolerance in Germany. The treaties gave lip service to the principle (cuius regio, eius religio) that the prince could set the religion of his territory -- and then went on to violate this very principle through many specific provisions. The signatories agreed that the religious rules already in effect would stay in place. Catholics and Protestants in German cities with mixed populations would share offices. Religious issues had to be settled by a majority of both Catholics and Protestants in the diet and courts of the empire. None of the major political leaders in Europe endorsed religious toleration in principle, but they recognized that religious conflicts were so volatile that it was essential to contain rather than repress sectarian differences. All in all, Westphalia is a pretty medieval document, and its biggest explicit innovation -- provisions that undermined the power of princes to control religious affairs within their territories -- was antithetical to the ideas of national sovereignty that later became associated with the so-called Westphalian system.

"Universal Human Rights Are an Unprecedented Challenge to Sovereignty"

Wrong. The struggle to establish international rules that compel leaders to treat their subjects in a certain way has been going on for a long time. Over the centuries the emphasis has shifted from religious toleration, to minority rights (often focusing on specific ethnic groups in specific countries), to human rights (emphasizing rights enjoyed by all or broad classes of individuals). In a few instances states have voluntarily embraced international supervision, but generally the weak have acceded to the preferences of the strong: The Vienna settlement following the Napoleonic wars guaranteed religious toleration for Catholics in the Netherlands. All of the successor states of the Ottoman Empire, beginning with Greece in 1832 and ending with Albania in 1913, had to accept provisions for civic and political equality for religious minorities as a condition for international recognition. The peace settlements following World War I included extensive provisions for the protection of minorities. Poland, for instance, agreed to refrain from holding elections on Saturday because such balloting would have violated the Jewish Sabbath. Individuals could bring complaints against governments through a minority rights bureau established within the League of Nations.

But as the Holocaust tragically demonstrated, interwar efforts at international constraints on domestic practices failed dismally. After World War II, human, rather than minority, rights became the focus of attention. The United Nations Charter endorsed both human rights and the classic sovereignty principle of nonintervention. The 20-plus human rights accords that have been signed during the last half century cover a wide range of issues including genocide, torture, slavery, refugees, stateless persons, women's rights, racial discrimination, children's rights, and forced labor. These U.N. agreements, however, have few enforcement mechanisms, and even their provisions for reporting violations are often ineffective.

The tragic and bloody disintegration of Yugoslavia in the 1990s revived earlier concerns with ethnic rights. International recognition of the Yugoslav successor states was conditional upon their acceptance of constitutional provisions guaranteeing minority rights. The Dayton accords established externally controlled authority structures in Bosnia, including a Human Rights Commission (a majority of whose members were appointed by the Western European states). NATO created a de facto protectorate in Kosovo.

The motivations for such interventions -- humanitarianism and security -- have hardly changed. Indeed, the considerations that brought the great powers into the Balkans following the wars of the 1870s were hardly different from those that engaged NATO and Russia in the 1990s.

"Globalization Undermines State Control"

No. State control could never be taken for granted. Technological changes over the last 200 years have increased the flow of people, goods, capital, and ideas -- but the problems posed by such movements are not new. In many ways, states are better able to respond now than they were in the past.

The impact of the global media on political authority (the so-called CNN effect) pales in comparison to the havoc that followed the invention of the printing press. Within a decade after Martin Luther purportedly nailed his 95 theses to the Wittenberg church door, his ideas had circulated throughout Europe. Some political leaders seized upon the principles of the Protestant Reformation as a way to legitimize secular political authority. No sovereign monarch could contain the spread of these concepts, and some lost not only their lands but also their heads. The sectarian controversies of the 16th and 17th centuries were perhaps more politically consequential than any subsequent transnational flow of ideas.

In some ways, international capital movements were more significant in earlier periods than they are now. During the 19th century, Latin American states (and to a lesser extent Canada, the United States, and Europe) were beset by boom-and-bust cycles associated with global financial crises. The Great Depression, which had a powerful effect on the domestic politics of all major states, was precipitated by an international collapse of credit. The Asian financial crisis of the late 1990s was not nearly as devastating. Indeed, the speed with which countries recovered from the Asian flu reflects how a better working knowledge of economic theories and more effective central banks have made it easier for states to secure the advantages (while at the same time minimizing the risks) of being enmeshed in global financial markets.

In addition to attempting to control the flows of capital and ideas, states have long struggled to manage the impact of international trade. The opening of long-distance trade for bulk commodities in the 19th century created fundamental cleavages in all of the major states. Depression and plummeting grain prices made it possible for German Chancellor Otto von Bismarck to prod the landholding aristocracy into a protectionist alliance with urban heavy industry (this coalition of "iron and rye" dominated German politics for decades). The tariff question was a basic divide in U.S. politics for much of the last half of the 19th and first half of the 20th centuries. But, despite growing levels of imports and exports since 1950, the political salience of trade has receded because national governments have developed social welfare strategies that cushion the impact of international competition, and workers with higher skill levels are better able to adjust to changing international conditions. It has become easier, not harder, for states to manage the flow of goods and services.

"Globalization Is Changing the Scope of State Control"

Yes. The reach of the state has increased in some areas but contracted in others. Rulers have recognized that their effective control can be enhanced by walking away from issues they cannot resolve. For instance, beginning with the Peace of Westphalia, leaders chose to surrender their control over religion because it proved too volatile. Keeping religion within the scope of state authority undermined, rather than strengthened, political stability.

Monetary policy is an area where state control expanded and then ultimately contracted. Before the 20th century, states had neither the administrative competence nor the inclination to conduct independent monetary policies. The mid-20th-century effort to control monetary affairs, which was associated with Keynesian economics, has now been reversed due to the magnitude of short-term capital flows and the inability of some states to control inflation. With the exception of Great Britain, the major European states have established a single monetary authority. Confronting recurrent hyperinflation, Ecuador adopted the U.S. dollar as its currency in 2000.

Along with the erosion of national currencies, we now see the erosion of national citizenship -- the notion that an individual should be a citizen of one and only one country, and that the state has exclusive claims to that person's loyalty. For many states, there is no longer a sharp distinction between citizens and noncitizens. Permanent residents, guest workers, refugees, and undocumented immigrants are entitled to some bundle of rights even if they cannot vote. The ease of travel and the desire of many countries to attract either capital or skilled workers have increased incentives to make citizenship a more flexible category.

Although government involvement in religion, monetary affairs, and claims to loyalty has declined, overall government activity, as reflected in taxation and government expenditures, has increased as a percentage of national income since the 1950s among the most economically advanced states. The extent of a country's social welfare programs tends to go hand in hand with its level of integration within the global economy. Crises of authority and control have been most pronounced in the states that have been the most isolated, with sub-Saharan Africa offering the largest number of unhappy examples.

"NGOs Are Nibbling at National Sovereignty"

To some extent. Transnational nongovernmental organizations (NGOs) have been around for quite a while, especially if you include corporations. In the 18th century, the East India Company possessed political power (and even an expeditionary military force) that rivaled many national governments. Throughout the 19th century, there were transnational movements to abolish slavery, promote the rights of women, and improve conditions for workers.

The number of transnational NGOs, however, has grown tremendously, from around 200 in 1909 to over 17,000 today. The availability of inexpensive and very fast communications technology has made it easier for such groups to organize and make an impact on public policy and international law -- the international agreement banning land mines being a recent case in point. Such groups prompt questions about sovereignty because they appear to threaten the integrity of domestic decision making. Activists who lose on their home territory can pressure foreign governments, which may in turn influence decision makers in the activists' own nation.

But for all of the talk of growing NGO influence, their power to affect a country's domestic affairs has been limited when compared to governments, international organizations, and multinational corporations. The United Fruit Company had more influence in Central America in the early part of the 20th century than any NGO could hope to have anywhere in the contemporary world. The International Monetary Fund and other multilateral financial institutions now routinely negotiate conditionality agreements that involve not only specific economic targets but also domestic institutional changes, such as pledges to crack down on corruption and break up cartels.

Smaller, weaker states are the most frequent targets of external efforts to alter domestic institutions, but more powerful states are not immune. The openness of the U.S. political system means that not only NGOs, but also foreign governments, can play some role in political decisions. (The Mexican government, for instance, lobbied heavily for the passage of the North American Free Trade Agreement.) In fact, the permeability of the American polity makes the United States a less threatening partner; nations are more willing to sign on to U.S.-sponsored international arrangements because they have some confidence that they can play a role in U.S. decision making.

"Sovereignty Blocks Conflict Resolution"

Yes, sometimes. Rulers as well as their constituents have some reasonably clear notion of what sovereignty means -- exclusive control within a given territory -- even if this norm has been challenged frequently by inconsistent principles (such as universal human rights) and violated in practice (the U.S.- and British-enforced no-fly zones over Iraq). In fact, the political importance of conventional sovereignty rules has made it harder to solve some problems. There is, for instance, no conventional sovereignty solution for Jerusalem, but it doesn't require much imagination to think of alternatives: Divide the city into small pieces; divide the Temple Mount vertically with the Palestinians controlling the top and the Israelis the bottom; establish some kind of international authority; divide control over different issues (religious practices versus taxation, for instance) among different authorities. Any one of these solutions would be better for most Israelis and Palestinians than an ongoing stalemate, but political leaders on both sides have had trouble delivering a settlement because they are subject to attacks by counterelites who can wave the sovereignty flag.

Conventional rules have also been problematic for Tibet. Both the Chinese and the Tibetans might be better off if Tibet could regain some of the autonomy it had as a tributary state within the traditional Chinese empire. Tibet had extensive local control, but symbolically (and sometimes through tribute payments) recognized the supremacy of the emperor. Today, few on either side would even know what a tributary state is, and even if the leaders of Tibet worked out some kind of settlement that would give their country more self-government, there would be no guarantee that they could gain the support of their own constituents.

If, however, leaders can reach mutual agreements, bring along their constituents, or are willing to use coercion, sovereignty rules can be violated in inventive ways. The Chinese, for instance, made Hong Kong a special administrative region after the transfer from British rule, allowed a foreign judge to sit on the Court of Final Appeal, and secured acceptance by other states not only for Hong Kong's participation in a number of international organizations but also for separate visa agreements and recognition of a distinct Hong Kong passport. All of these measures violate conventional sovereignty rules since Hong Kong does not have juridical independence. Only by inventing a unique status for Hong Kong, which involved the acquiescence of other states, could China claim sovereignty while simultaneously preserving the confidence of the business community.

"The European Union Is a New Model for Supranational Governance"

Yes, but only for the Europeans. The European Union (EU) really is a new thing, far more interesting in terms of sovereignty than Hong Kong. It is not a conventional international organization because its member states are now so intimately linked with one another that withdrawal is not a viable option. It is not likely to become a "United States of Europe" -- a large federal state that might look something like the United States of America -- because the interests, cultures, economies, and domestic institutional arrangements of its members are too diverse. Widening the EU to include the former communist states of Central Europe would further complicate any efforts to move toward a political organization that looks like a conventional sovereign state.

The EU is inconsistent with conventional sovereignty rules. Its member states have created supranational institutions (the European Court of Justice, the European Commission, and the Council of Ministers) that can make decisions opposed by some member states. The rulings of the court have direct effect and supremacy within national judicial systems, even though these doctrines were never explicitly endorsed in any treaty. The European Monetary Union created a central bank that now controls monetary affairs for three of the union's four largest states. The Single European Act and the Maastricht Treaty provide for majority or qualified majority, but not unanimous, voting in some issue areas. In one sense, the European Union is a product of state sovereignty because it has been created through voluntary agreements among its member states. But, in another sense, it fundamentally contradicts conventional understandings of sovereignty because these same agreements have undermined the juridical autonomy of its individual members.

The European Union, however, is not a model that other parts of the world can imitate. The initial moves toward integration could not have taken place without the political and economic support of the United States, which was, in the early years of the Cold War, much more interested in creating a strong alliance that could effectively oppose the Soviet Union than it was in any potential European challenge to U.S. leadership. Germany, one of the largest states in the European Union, has been the most consistent supporter of an institutional structure that would limit Berlin's own freedom of action, a reflection of the lessons of two devastating wars and the attractiveness of a European identity for a country still grappling with the sins of the Nazi era. It is hard to imagine that other regional powers such as China, Japan, or Brazil, much less the United States, would have any interest in tying their own hands in similar ways. (Regional trading agreements such as Mercosur and NAFTA have very limited supranational provisions and show few signs of evolving into broader monetary or political unions.) The EU is a new and unique institutional structure, but it will coexist with, not displace, the sovereign-state model.