Obama's Illegal War

Libya is important, but the U.S. Constitution is ultimately what we're fighting for.

The bombing campaign in Libya continues into its 72nd day without the consent of the U.S. Congress -- breaking the 60-day limit for unilateral presidential war-making. With the Justice Department providing no public explanation for this breach, lawmakers are beginning to take matters into their own hands.

On Wednesday, June 1, the House of Representatives delayed a vote on a resolution insisting that President Barack Obama bring the Libya mission to a speedy close; but expectations are that the measure will be reconsidered on Friday. The Senate, for its part, will soon take up a bipartisan measure supporting the war. Meanwhile, with no U.S. domestic debate, NATO has announced it will continue its operations in Libya for another 90 days.

We are at a constitutional crossroads, similar to the one the United States confronted in 1973 when Congress enacted the War Powers Resolution, which set the 60-day limit, over Richard Nixon's veto. The Constitution famously grants Congress the power to declare war, but Nixon continued to fight in Vietnam for three years after Congress had repealed the Gulf of Tonkin Resolution authorizing the conflict.

Faced with this plain constitutional violation, Congress acted decisively to restore the system of checks and balances. For centuries, the president and Congress had wrangled over the kind of actions that counted as a "war" for constitutional purposes, with presidents exploiting legal ambiguities to cut Congress out of key decisions. The act broke this impasse by imposing a time limit on all "hostilities" -- a functional term meant to eliminate legalistic evasions the White House had developed over what counted as "war." Henceforward, the 60-day deadline would apply whenever the president began "hostilities," and if he failed to gain congressional approval, the act gave him 30 days to terminate the military operation.

This clear and simple 60/30-day setup is especially important at a time when other restraints on presidential war-making have atrophied. During the era of the Founding Fathers, Congress could back up its constitutional authority with its power of the purse. For example, when President George Washington responded to military defeats on the frontier by escalating the conflict, he got Congress to give him $532,449.76 and 2/3 cents for his war -- note the 2/3 cents!

It's a lot harder to do so now. The Libya campaign has already cost three-quarters of a billion dollars, and yet Obama hasn't had to ask Congress for a dime. He has funded the war entirely out of the Defense Department's general appropriation of roughly $600 billion.

This leaves the time limit as the only effective mechanism for preserving the Founders' commitment to congressional control. Unlike in many other areas of law, the courts can't be counted on to translate abstract principles into concrete rules. So far as war-making is concerned, they have left it to the political branches to work the matter out -- which is precisely the purpose of the War Powers Resolution.

The Justice Department explicitly endorsed the constitutionality of the time-limit provisions in 1980, and presidents have abided by them ever since. When Ronald Reagan's multinational peacekeeping operation in Lebanon broke out into clear "hostilities," Congress passed -- and Reagan signed -- the first legislation expressly invoking the War Powers Resolution authorizing U.S. troops to remain for 18 months. Similarly, Bill Clinton gained a special appropriation from Congress within the first 60 days of his bombing campaign in Kosovo. And George W. Bush gained explicit congressional consent before launching the Afghanistan and Iraq wars -- as did his father at the time of the first Gulf War.

Obama's action is unprecedented. After notifying Congress that he had begun "hostilities," the president did absolutely nothing to gain congressional consent until Friday, May 20 -- just hours before the 60-day clock ran out. He then sent a letter to the House and Senate asking for their support, leaving it to Jay Carney, his press secretary, to explain that he "believes that he has acted … consistent with the War Powers Resolution … and that's all I'm going to say about it."

But actions speak louder than press secretaries. Even though the time limit set by the War Powers Resolution has expired, Secretary of State Hillary Clinton recently admitted that "the United States continues to fly 25 percent of all sorties. We continue to provide the majority of intelligence, surveillance, and reconnaissance assets. We continue to support all of our allies in their efforts." The United States continues, in short, to be involved in precisely the kind of "hostilities" that the War Powers Resolution is meant to control.

The president has the constitutional responsibility to "take care that the Laws" -- all the laws -- "be faithfully executed." The fate of Libya is important, but that is no excuse for ignoring the U.S. Constitution and the rule of law. Obama should belatedly heed the War Powers Resolution and press Congress to move quickly so that he can legalize the war within the 30-day deadline for terminating military action established by the act. This is the only way to keep faith with the Founders' commitment to checks and balances and to protect Americans against future presidential war-making without broad popular support.



You Can't Always Get What You Want

What Amazon and Netflix can teach us about fighting poverty.

We usually think of "rich" as the opposite of "poor," and in some ways that makes a lot of sense. It's true, anyway, that rich people aren't poor, and poor people aren't rich. Thinking of rich and poor as opposites also feels natural because it introduces an obvious yardstick -- money -- for measuring how wealthy people are, and how much separates one person from another. Money is a convenient yardstick because it follows a simple rule: All other things equal, we'd always like to have another dollar. Think here of a line from The Simpsons, spoken by Montgomery Burns after Barney expresses admiration for his incredible wealth: "Yes, but I'd trade it all... for just a little more."

Unfortunately, the word "poor" has lots of opposites, and not all of them have to do directly with money. "Healthy," "well-educated," "having access to clean water," and "nourished" are among the many opposites of "poor," and when we think about the relative merits of antipoverty programs, we have to weigh each of these things -- and more -- against each other. But how do we compare the importance of, say, health versus education versus housing? And how do we make tradeoffs between them? One approach is to apply our own values and priorities, but this ignores the preferences of the very people for whose benefit these programs are designed. This happens often in the world of development aid; a donor focusing on education, for example, might care more about classroom quality than hospital beds. But wouldn't it be better if we could instead ask the people receiving our help what they want?

This isn't just about trying to please. Development aid lore is rife with stories of well-intentioned outsiders missing the mark, offering people goods and services they don't really want. Recipients sometimes manage to extract some value from unwanted items by trading them for things they actually do want, or by jury-rigging them to serve other purposes (often with limited success). A mosquito net may get swapped for a machete, for example, or a kitchen set might be sold just to buy food. If we want to avoid these outcomes, we must answer the question: How can we best understand people's priorities and tastes?

Outside the poverty field, there are a growing number of ways of ascertaining and predicting what people like, and all of them are imperfect -- but they're getting better. Think about the ubiquitous taste-based suggestions on Amazon and Netflix, for instance: "Customers who liked this also liked _____." This "taste-matching" approach looks for other people whose preferences are similar to yours, then recommends things those people like that you haven't tried yet. It stands to reason that taste-matching methods improve over time; they look at thousands of consumers' feedback about thousands of products, and see what patterns emerge. As more wide-ranging data is amassed from more consumers to inform these suggestions, they become increasingly accurate.

A hazard of taste-matching, however, is that it has a hard time dealing with quirks. Suppose you like Mediterranean food but hate olives. If that quirk isn't shared by many other lovers of Mediterranean food, it's unlikely that it will be reflected in the food recommendations you would get from a taste-matching approach.

When it comes to fighting poverty, missing those quirks can be a deal breaker. Here's a real example from central Kenya, where farmers of Gichugu Division, at the foot of Mount Kenya, grew impressive crops but were hamstrung by isolation. Without information about foreign markets or access to exporters, most grew products like maize and kale, which they sold to local consumers. DrumNet, an ambitious program designed by the nonprofit organization PRIDE AFRICA, sought to help by setting up an export supply chain and encouraging the farmers to adopt crops that were both well-suited to the weather and soil of Gichugu Division and in high demand elsewhere. Specifically, the program pushed for French beans, a favorite of European consumers.

When the program was in its infancy, Dean visited the area and found that most of the farmers weren't growing the beans. The farmers expressed concerns over the risks of the export markets. In particular, they feared that exporters would claim the French beans they had grown were of inferior quality and would offer them too low a price. If that happened, Dean asked, could the farmers just sell the beans locally, or even eat them? (These were delicious French beans, being sent to Europe to be served with fine wine and fancy gourmet meals!) They looked back as if they had been told to eat dirt. "French beans? Eat them? At best we would feed them to our pigs and goats."

If taste-matching doesn't work, what about dispensing with democracy and giving some people's preferences -- usually those of experts -- greater weight than others? Take the restaurant site Opinionated About Dining, for example, which considers every individual's opinion, but weights them according to the amount of experience that individual has dining in the finest venues. The reviews of people who eat frequently at highly rated restaurants (i.e. foodies) carry the most weight, while reviews from people who seldom eat out except for an occasional meal at, say, Denny's count for very little.
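The expertise-weighting idea behind Opinionated About Dining amounts to a weighted average. A minimal sketch, with entirely hypothetical scores and experience counts (the site's actual weighting formula is not public):

```python
def weighted_rating(reviews):
    """Average review scores, weighting each by the reviewer's experience.

    `reviews` is a list of (score, experience) pairs; experience might be
    the number of meals the reviewer has logged at highly rated venues.
    """
    total_weight = sum(exp for _, exp in reviews)
    if total_weight == 0:
        return None  # no weighted opinions to average
    return sum(score * exp for score, exp in reviews) / total_weight

# Two seasoned foodies rate a restaurant highly; one occasional
# Denny's diner pans it. The foodies dominate the weighted result.
reviews = [(9.0, 120), (8.5, 80), (4.0, 1)]
```

With these numbers the weighted rating lands near 8.8, barely moved by the low score, whereas a plain average would be pulled down to about 7.2.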

This kind of approach has the advantage of helping to sift the best opinions from the rest -- provided there really is an objective answer to "which is better?" Again, though, when navigating the preferences of the poor, it's often difficult to rank things cleanly. A few years ago, for example, Jacob, an American working for Innovations for Poverty Action and living in Ghana, tried to take a friend, Oti, out for a special birthday dinner. For weeks beforehand, Jacob asked Oti to choose a restaurant or a cuisine that he wanted to try, but he always replied that he would be happy to eat chicken and rice at Papaye, the Ghanaian equivalent of a McDonald's Big Mac and fries. On the big day Jacob asked one last time: Wouldn't he like to try one of the good Thai or Indian or Chinese restaurants? Finally Oti buckled and chose Indian. To his credit, Oti gave the meal a fair shot. He tried every dish. But, despite his noble efforts to be polite, he just didn't like it. Fortunately, the waiter had been one step ahead all along. Soon after he brought out the last Indian dish -- and without being asked -- he returned to the table with a heaping plate of fried chicken gizzards, a local favorite. Oti was thrilled and relieved.

If, as outsiders, we have such a hard time navigating between French beans and kale, or daal and chicken gizzards, what hope do we have to make the right tradeoffs between food and shelter, or health and education?

Often the default approach is to consider all our options with the common denominator of dollars and cents. For instance, we could think about the value of a health program as the savings from reduced medical bills plus the value of additional working days which, absent the health program, would have been missed due to sickness. We could similarly think about the value of a scholarship program as the aggregate additional income earned by participants who can attract higher wages because they have more schooling. And then we could compare the dollar values of the two programs to decide which one is better.
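That dollar-denominated comparison can be made concrete with a short sketch. Every number below is invented purely for illustration, and the education calculation shows the "ancillary assumption" problem: its answer hinges on the discount rate chosen for valuing future income.

```python
def health_program_value(medical_savings, workdays_recovered, daily_wage):
    """Dollar value of a health program: avoided bills plus recovered wages."""
    return medical_savings + workdays_recovered * daily_wage

def education_program_value(extra_income_per_year, years, discount_rate):
    """Present value of the extra annual income a scholarship generates."""
    return sum(
        extra_income_per_year / (1 + discount_rate) ** t
        for t in range(1, years + 1)
    )

# Hypothetical numbers: $40 in avoided bills plus 10 workdays at $3/day,
# versus $50/year of extra income over a 20-year working life.
health = health_program_value(40, 10, 3)
education_patient = education_program_value(50, 20, 0.05)
education_impatient = education_program_value(50, 20, 0.30)
```

Here the scholarship looks far more valuable at a 5 percent discount rate than at 30 percent; how much participants value future versus current income can swing the ranking, which is exactly why these comparisons are fragile.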

But there are many instances in which the alternatives in front of us stubbornly resist description in dollar terms, or where key ancillary assumptions, like how much participants value future versus current income, can flip the answer around. As DrumNet and Oti's Indian dinner illustrate, this can be tricky even when we're trying our best to accommodate people's preferences.

The stakes here are high. When a charity pushes particular choices (rather than handing out cash, for example), and gets recipients' preferences wrong, something is lost immediately: Either the things given simply go to waste, providing no benefit to anybody, or else recipients are effectively taxed by having to trade for the things they want or repurpose the aid to make it useful. The longer-term consequence of misreading -- or, worse, ignoring -- recipients' preferences is to demonstrate that charities aren't paying attention to the real needs and desires of the very people they claim to be serving.

No aid organization or government has yet figured out how to perfectly understand the preferences of their recipients, though they do try. Charities often ask community members questions like: "What are the biggest challenges you or your community face?" and "How can we help?" But of course, the process can go awry. If the residents of a rural village decided that what they really wanted was a deluxe swimming pool instead of a school or a health clinic, a charity would be forced to think about taste. Would they buckle and accept the residents' stated preferences? Here we suspect they would quickly become fans of the expertise-oriented Opinionated About Dining model. To twist a phrase from Orwell, they might argue that all preferences are created equal, but some preferences are better than others.

Looking at all the pitfalls of specific aid projects, an increasing cadre of experts has argued in recent years that it's better to just give recipients cash. That way, every individual can make a choice about what it is he or she needs most.

There are cases, however, when this might not be ideal.

First, in situations that economists call "market failures," handing out cash does not lead to socially optimal investments. Take contagious disease as an example: The personal benefit a small-scale farmer gets from taking preventative measures may not actually exceed what it costs him to do so. But since disease spreads from person to person, each individual who chooses not to protect himself imposes an additional cost on others. That's the market failure: With each person rationally choosing not to take preventative action, we end up with a suboptimal outcome for everyone as a whole -- a greater risk of an epidemic. Another type of market failure comes when individuals yield to temptation; the planner in them may want a mosquito net, but they may be tempted when cash is in hand to buy something for the moment, like meat for dinner or even alcohol or cigarettes. Cases like this cry out for public action, and individuals may themselves ask for interventions to help resist their own future temptations.
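The externality arithmetic behind that example can be spelled out with hypothetical numbers: a bed net that isn't worth it to the individual farmer can still be well worth it to the village once the avoided infections of neighbors are counted.

```python
# All figures invented for illustration: a $6 bed net gives the buyer $5
# of personal protection, and also removes $2 of infection risk from each
# of 3 neighbors the buyer might otherwise infect.
NET_COST = 6
PRIVATE_BENEFIT = 5
SPILLOVER_PER_NEIGHBOR = 2
NEIGHBORS = 3

# The farmer's rational private choice: skip the net.
buys_privately = PRIVATE_BENEFIT > NET_COST

# Society's accounting includes the spillovers, so the net is worth buying.
social_benefit = PRIVATE_BENEFIT + SPILLOVER_PER_NEIGHBOR * NEIGHBORS
socially_worth_it = social_benefit > NET_COST
```

With these numbers, `buys_privately` is false while `socially_worth_it` is true; when each person reasons this way, the village as a whole under-invests in prevention, which is the market-failure case for subsidizing or simply handing out the nets.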

Second, if no market failure exists, some may return to the question of tastes, and simply impose their own as so-called experts. The Opinionated About Dining approach suggests this may be useful. In the context of development, a charity could push for more spending on education and health simply because it believes that this would serve the recipients well.

When neither of the two cases above applies, there is a strong argument for simply providing cash. But donors also have preferences, and they often choose to support programs that align with them, whether that means spending on education, health, food, or something else. As researchers concerned with poverty, when one person asks us "What can I do to help increase the income of impoverished women in Zambia?" and another asks "What can I do to help improve the health of impoverished children in Peru?" we're reluctant to try to convince either to change their question. Donors are entitled to their own tastes, too. We are happy to see people respond actively to the plight of the poor, and urge donors to look at rigorous research on aid effectiveness to learn how to maximize the good they can do for the people and issues they care about most.