How War Will End in Afghanistan -- Even if Conflict Does Not

If war has not addressed threats in Afghanistan, then the United States needs to address threats without war.

Here's the problem. How do you deal with an untrustworthy dictatorship threatening U.S. national interests?

The 20th-century solution was security through total victory: destroy the enemy's military and replace the dictator with democracy. When the defeat of Germany in World War I failed to prevent World War II, American leadership learned that lasting peace required both military victory and political transformation. As U.S. President Franklin D. Roosevelt's aide Breckinridge Long put it in 1942, "We are fighting this war because we did not have an unconditional surrender at the end of the last one." The failure of 1919 -- embodied in the looming White House portrait of President Woodrow Wilson, an architect of the Treaty of Versailles -- shadowed FDR. This time, Roosevelt declared, allied forces "must not allow the seeds of the evils we shall have crushed to germinate and reproduce themselves in the future."

It worked. Total victory over Germany, Italy, and Japan in 1945 transformed them into peaceful, prosperous, democratic allies of the United States, in what was the greatest foreign-policy accomplishment in U.S. history. Winning war went hand in hand with eliminating threats.

Fast-forward to 2001. As in 1941, the United States faced an untrustworthy, threatening dictatorship, this time a Taliban-ruled Afghanistan. The only solution seemed to be total military victory, an overthrow of the Taliban, and the installation of a democratic government. Two years later, the same approach was employed against Saddam Hussein's Iraq.

Now, to the present. What happened to the formula? Defeat of military forces, check. Attempted installation of democratic institutions, check. Achievement of peace, prosperity, and democratic maturity -- hold on. Eight years later, Afghanistan has neither stability nor democracy, much less prosperity. The Barack Obama administration is in the throes of a major debate about how to right the sinking ship of its Afghanistan strategy, with no clearly attractive options. Six years later, Iraqi violence is finally down to a dull roar, allowing U.S. forces to begin withdrawal. But Iraq still faces major hurdles, including designing institutions for provincial elections and allocating oil resources. Iraq in 2009 does not look much like Germany or Japan in 1951, six years after military defeat. Clearly, the United States took a wrong turn somewhere.

What lessons should Washington policymakers draw from the Afghanistan and Iraq experiences? An obvious lesson is that winning the war is the easy part, and the hard part comes after. The United States has become very, very good at dominating conventional army-against-army battles. But it's not so good at successfully installing democracy and defeating an insurgency when it erupts.

Less obvious, perhaps, is that failing to install democracy or avoid insurgency actually undermines the original goal: eliminating threats to U.S. national security. Military resources have been sucked into the two long conflicts. Counterinsurgency operations in Iraq have especially made the United States appear as a brutal occupier, fueling anti-Americanism and support for terrorist groups worldwide. And, the festering sore of Afghanistan has spilled over into Pakistan, threatening to destabilize a nuclear-armed state ruling a restive and increasingly radical Muslim population.

Where does that leave the United States? If war does not eliminate threats, then perhaps those threats need to be addressed without war.

Take, for example, two more states ruled by threatening, anti-American, nuclear-aspirant dictators: North Korea and Iran. Indisputably, the World War II option of invasion followed by imposed democratization is off the table. Iran is much larger and more difficult to conquer than Iraq or Afghanistan, and North Korea can defend itself with nuclear weapons it already possesses.

Some might suggest limited airstrikes against Iranian or North Korean nuclear facilities, echoing the 1981 Israeli attack on an Iraqi nuclear reactor. However, the success rate of such attacks, including the 1981 strike, is not encouraging. Future airstrikes are likely to be even less successful than past efforts, as new nuclear states now go to great lengths to disperse, conceal, and harden their nuclear facilities. The recent disclosure of Iran's Qom uranium enrichment facility probably does not provide a complete account of Iranian nuclear locations; additional secret facilities likely exist, as Nima Gerami and James Acton have recently argued.

If airstrikes are too risky, what's left? Everything other than war: diplomacy, negotiation, inspections, economic sanctions, and military deterrence. The bad news is that this cluster of approaches is unsatisfying for those who demand a rapid and decisive end to these threats. However, it does promise two advantages. First, these policies are low cost. Americans do not die in the course of diplomacy or inspections. The United States will not spend a trillion dollars executing economic sanctions. And, these approaches do not spur global anti-Americanism -- and, like it or not, there is an important "popularity contest" element to the war on terror.

Second, this cluster of approaches does work. Nuclear deterrence has a perfect record in preventing the use of nuclear weapons by one state against another. As John Mearsheimer and Stephen Walt argued in the pages of Foreign Policy just prior to the 2003 Iraq war, deterrence can work against states like Iraq, North Korea, and Iran. North Korea has not attacked another state in more than half a century, including during the 15 or so years in which it has possessed nuclear weapons. Iran has been ruled by Islamist mullahs for 30 years and during that time has not attacked any of its neighbors, and certainly not any U.S. allies. The Soviet Union and China, two other nuclear-armed anti-American dictatorships, were deterred from attacking U.S. allies for decades.

Even the much-maligned combination of diplomacy, inspections, and sanctions has worked in the past, preventing nuclear proliferation in some states (Iraq, Libya, Argentina, and Brazil), persuading some states to give up their nuclear arsenals (South Africa, Ukraine, Belarus, and Kazakhstan), and slowing the acquisition and expansion of nuclear arsenals in other states (North Korea and Pakistan). The United States seems to be making progress on the Iranian program, as the solidification of Western resolve and the rising threat of greater economic sanctions seem to be pushing Iran to make concessions, including most recently the decision to permit inspection of the Qom uranium facility and to discuss sending its enriched uranium abroad to be processed into reactor fuel.

Nuclear-armed states do additionally threaten to distribute nuclear weapons, materials, or know-how to other states and nonstate actors. The United States is concerned about, among other things, North Korean assistance to Burma and Iranian assistance to Venezuela. Fortunately, there are nonviolent tools that have been and can be used to deal with these problems, including international institutions such as the Proliferation Security Initiative, the International Atomic Energy Agency, the Nuclear Suppliers Group, and the Missile Technology Control Regime. And, states appear unwilling to give such aid to terrorist groups, perhaps out of fear of building a nuclear Frankenstein that might one day turn on its sponsor. Certainly this was one reason why Saddam stayed away from al Qaeda.

The existence of dictatorships aspiring to acquire nuclear weapons is something to abhor. But aside from instances when another country has directly attacked the United States or supported groups that have launched major attacks on U.S. interests, war is not always the best means of dealing with them. Total victory, in the sense of complete military success followed by the complete elimination of a threat, is not a viable U.S. policy option in the 21st century. War is unlikely to bring security. But security can be had without war.



Power to the People

Why it's the poor -- not the experts -- who can best solve the food crisis.

Every non-governmental organization has a mission statement. For example, CARE, one of the world’s largest and best-funded NGOs, explains its mission as serving "individuals and families in the poorest communities in the world. Drawing strength from our global diversity, resources and experience, we promote innovative solutions and are advocates for global responsibility." Indeed, CARE has teams of experts with years of experience in more than 70 countries, and its efforts to tackle the "underlying causes of poverty" are impressive. Implicit in its mission statement, like those of most NGOs, is the notion that CARE is exceptionally knowledgeable about how to meet the needs of the world's poor. But does it know best?

Take one of the most confounding global problems today: the skyrocketing cost of food. Prices for staple crops such as rice and wheat have more than doubled since 2006, putting an enormous strain on the 1.2 billion people living on a dollar a day or less. In 2004, a typical poor farmer in Udaipur, India, was already spending more than half his daily dollar of income on food -- and that was before grain prices went through the roof.

NGOs and relief agencies are on the front lines of this global crisis, distributing food and other forms of assistance to the hardest-hit victims. But food handouts may be the last thing that poor countries need right now. In many of the worst-stricken places, agriculture is the top employer. High food prices are offering a rare opportunity for farmers in these countries to make a tidy profit. Dumping imported food on the market will cut into many farmers' incomes and thus might do more harm than good. Low-wage work programs could help people avoid hunger, but they might also take farmers away from their fields just when farming is becoming lucrative.

Priorities, moreover, vary from person to person and from place to place. A West African farmer might choose to forgo next season’s seeds and fertilizer to put food on the table today. A garbage collector in Jakarta might sacrifice trips to the doctor to keep from going hungry. Mexican parents might keep their kids home from school as the cost of education gets priced out of the family budget. Aid agencies can’t always predict what the poor value most.

The first step in truly addressing the food crisis, therefore, is abandoning the idea that the donor knows best. Instead of more advice or another bag of rice, the poor should be given relief vouchers. The basic premise is simple: Give poor people a choice about what type of assistance they receive. Vouchers, backed by major donor countries, could be distributed to needy recipients in the areas hardest hit by the food crisis. The recipients could then redeem the vouchers in exchange for approved goods (such as food or fertilizer) or services (such as healthcare or job training). Relief vouchers would allow families to meet their most pressing needs without harming the very markets that can bring about permanent solutions. At the same time, they would give firms and NGOs an incentive to provide a wider array of services.

Relief vouchers could also save NGOs millions of dollars that victims never see. Figuring out what people need is hard enough during a natural disaster, when a helicopter flyover can reveal the physical damage. But the effects of the food crisis are much harder to diagnose. Each NGO must conduct household surveys, hire experts, meet with local government officials and foreign donors, and then write grant applications and raise funds before it can ever help its first victim. Meanwhile, monitoring these efforts eats up precious resources. With vouchers, agencies would simply follow the invisible hand of the market -- in this case, the market for relief.

Relief vouchers would solve another problem: accountability. Most NGOs today answer only to the donors who fund their operations, not to their actual clients -- the poor. Most major donors do their utmost to make sure their money is spent as promised. But even donors whose hearts are in the right place cannot anticipate the exact needs of so many different communities. With no mechanism for the poor to communicate their priorities, nonprofits and their donors are only accountable to themselves. A system of relief vouchers would change that.

Such a radical shift in accountability will have major ramifications. The development world is littered with projects that keep getting funded long after they are no longer useful. Under a voucher system, if an NGO delivered a product that no one needed, or failed to deliver what it promised, beneficiaries would stop coming to it for relief. This is why nonprofits funded through vouchers wouldn't have to waste money on expensive evaluations. After all, Pepsi does not have to prove whether its soda makes its customers better off. Products that people aren't willing to buy typically don't survive long. It is time to expose the nonprofit sector to the same market feedback.

If that scares some NGOs, it shouldn't. Too often, they must cater to the whims of donors when they would prefer to serve those in need. Without financial support, they would never be able to conduct their important work. But if a significant share of NGOs' financing came through voucher redemption, they would be able to focus their attention on the poor without worrying as much about pleasing large foundations and government agencies, which often have their own agendas.

Vouchers, of course, aren't a silver bullet. Corruption and fraud will be a concern. Moreover, some needs are best met at the community level, such as clean water, or at the national level, such as public-health campaigns. And in countries with well-developed national safety nets, such as South Africa, there may be no need to bypass functioning institutions by introducing vouchers. In some cases, relief vouchers would simply be impractical: aid workers are fortunate if they can even reach those in need in a failed state like Somalia or a dictatorship like Burma.

Voucher schemes have already shown promise. Catholic Relief Services pioneered their use in 2000 by setting up "seed fairs" for farmers. In Ethiopia in 2004, the organization successfully introduced livestock vouchers for sheep, goats, and even veterinary services. The Red Cross distributed vouchers to vulnerable families in the West Bank in 2002 and 2003; the program was discontinued only for political reasons. Governments have long used other types of vouchers on larger scales: for schools in many developing countries, and in the form of food stamps in the United States. Vouchers, in short, can work -- and it's time to extend their logic to a much wider array of problems. It's time to give the poor the power of choice.