Think Again

Think Again: Microfinance

Small loans probably won't lift people out of poverty or empower women. But that doesn't mean they're useless.


"Microcredit Is a Proven Weapon Against Poverty."

Alas, no. Microcredit, the strategy of lending sums as small as $100 to help poor people start tiny businesses, has won acclaim like few other recent ideas in economic development, drawing plaudits from political leaders, titans of industry, and celebrities. Bill Clinton and Tony Blair love microcredit. So do Queen Rania and Natalie Portman. More than 100 million people in more than 100 countries have received microloans, thanks in no small part to billions of dollars from foreign aid agencies, philanthropists, and "social investors" looking to do well while doing good. In 2006, microcredit pioneer Muhammad Yunus and the Grameen Bank he founded in Bangladesh shared the Nobel Peace Prize. Microcredit has gained a global reputation for lifting people out of poverty and empowering women.

What has made so many so sure of microcredit? The ideas are powerful: a blend of self-reliance and liberation that appeals across the political spectrum. Microfinance promoters told compelling stories of individual men and women whose successes embodied those ideas, and papers in prestigious journals offered convincing evidence that the loans, especially when they went to women, made borrowers less poor.

But the old studies are now discredited. Newer, better ones have found that microloans rarely make an impact on bottom-line indicators of poverty, such as how much a household spends each month and whether its children are in school.

The reversal of this academic verdict is a sign of a larger shift in development economics, toward randomizing in order to pin down cause and effect. If you observe that less-poor people are more likely to have taken microcredit, it is hard to know what caused what: Did the microcredit make them better off, or did being better off make them readier to borrow? If you instead flip a coin to decide who in a village will be offered microcredit and who will not -- randomizing -- and then observe that the fates of the two groups diverge over time, you can more accurately observe what effect the loans are having on those who receive them.
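The coin-flip logic can be sketched in a few lines of code. This is an illustrative simulation only, not any study's actual data or method; the village size, spending figures, and zero treatment effect are all invented for the example:

```python
import random
import statistics

random.seed(0)

# Hypothetical village of 1,000 households; every number here is made up
# purely to illustrate the logic of randomization.
HOUSEHOLDS = range(1000)
TRUE_EFFECT = 0.0  # dollars/month; the recent studies estimate roughly zero

# Step 1: flip a coin for each household -- "treated" means offered a loan.
treated_flag = {h: random.random() < 0.5 for h in HOUSEHOLDS}

# Step 2: observe an outcome later, such as monthly household spending.
def monthly_spending(h):
    baseline = random.gauss(60, 15)  # made-up baseline spending in dollars
    return baseline + (TRUE_EFFECT if treated_flag[h] else 0.0)

spending = {h: monthly_spending(h) for h in HOUSEHOLDS}

# Step 3: because assignment was random, the two groups started out
# statistically identical, so any gap in mean outcomes estimates the
# causal effect of being offered microcredit -- no reverse causation.
treated = [spending[h] for h in HOUSEHOLDS if treated_flag[h]]
control = [spending[h] for h in HOUSEHOLDS if not treated_flag[h]]
gap = statistics.mean(treated) - statistics.mean(control)
print(f"estimated effect: ${gap:+.2f} per month")
```

In an observational study, by contrast, the borrowers choose themselves, so the two groups differ before any loan is made and the gap in outcomes no longer isolates the loan's effect.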

Recent randomized studies in India, Mongolia, Morocco, and the Philippines have found that access to microcredit does stimulate microbusiness start-ups -- raising chickens, say, or sewing saris. But across the 12-18 months over which progress was tracked, the loans did not reduce poverty. So today the best estimate of the impact of microcredit on poverty is zero. (In retrospect, reverse causation cannot be ruled out as the source of the more upbeat findings of earlier, nonrandomized studies.)

This finding clashes with the microcredit mythology. But it comports with common sense. If you're reading this article online, you probably belong to the global middle class, the billion or so people who earn steady wages and lead lives of material comfort. What in your family history lifted you to your enviable perch? It probably wasn't tiny loans to your indigent ancestors so they could raise goats. Then, as now, most poor people's best hope for escaping poverty lies in graduating from tenuous self-employment to steady employment -- to jobs, which are the fruit of industrialization.



"Microfinance Is Useless."

No. It would be wrong to overreact to the hype about microloans and dismiss the entire enterprise as a waste of money and effort. Twenty years ago, journalist Helen Todd spent a year following the lives of 62 women in two Bangladeshi villages served by Yunus's famous Grameen Bank. Of the 40 who took microcredit from Grameen, all stated business plans in order to get the loans: They would buy cows to fatten or rice to husk and resell. A few actually did those things, but most used the money to buy or lease land, repay other loans, stock up on rice for the family, or finance dowries and weddings.

That's probably just fine. As the book Portfolios of the Poor shows, the people said to live on $2 a day actually don't. They live on $3 one day, $1 the next, and $2.50 the day after. Or they are farmers who earn money once a season. But their children need to be fed every day, and husbands don't fall ill on convenient schedules. The need to match an unpredictable income to spending needs with different rhythms generates an intense demand among poor people for financial services that help them set aside money in good times, when they need it less, and draw it down in bad.

All financial services help meet this demand, however imperfectly: loans, savings accounts, insurance, money transfers. A mother can pay the doctor for treating her daughter by getting an emergency loan from a friend, depleting savings, persuading her brother in the city to send money, or even -- if she is very lucky -- using health insurance. That is why the microcredit movement became the microfinance movement and today supports other services along with loans.

Poor people have less money than the rich, but they aren't dumber; in fact they are generally more resourceful out of necessity. If a woman uses a microloan to buy rice or repair a roof instead of starting a business, I hesitate to second-guess her. People in wealthy countries see fit to buy everything from food to houses on credit. Should we expect the poor to differ?



"Muhammad Yunus Invented Microcredit."

Yes, just as Henry Ford invented the car. Where Ford had the assembly line, Yunus's breakthrough innovation was joint liability, the practice of making small groups of borrowers -- the women of a particular village, for instance -- collectively responsible for each other's loans. This mutual vouching substituted for collateral and produced astonishingly high repayment rates.

Joint liability was not new, however. Proverbs 11:15 warns, "A foolish man hands over his bounty which he pledges for his neighbor as security." A similar concept was also at the core of the credit cooperatives that sprouted across Germany starting in the 1850s, in which groups of poor people would band together, borrow from outside benefactors, and then divvy up the credit among themselves. Around 1900, seeking to quell unrest, the British introduced credit groups into colonial India, which included the territory of modern Bangladesh. In the late 1970s, these already functioning cooperatives inspired Yunus and his students as they built their own microcredit method by trial and error.

Yet the comparison to the carmaker is apt. Truly, Yunus is the Henry Ford of microfinance. Over the course of 28 years, until Bangladesh's prime minister forced him out in 2011 in an act of political spite, Yunus built a bank with thousands of employees delivering useful services to millions of customers. He inspired competition within Bangladesh and imitation beyond, which led to a steady stream of new innovations in the name of serving the poor, including savings accounts and more flexible loans. He was the first leader of the modern microcredit movement to operate in a relatively businesslike way: to mass-produce and charge the poor enough interest to cover most operating costs so that the bank could expand to serve more people.



"Microcredit Empowers Women."

Not so much. The microcredit movement began in the 1970s. In sync with the global movement for gender equality that began at the same time, microcredit has focused mainly on women. Promoters have asserted that the loans "empower" female borrowers. Women who came home with loans, it was said, gained more leverage vis-à-vis their husbands in household decisions about whether to buy food or beer, to invest or consume. Meanwhile, women who had been traditionally confined by their culture to the domestic sphere, as in Bangladesh, found liberation in being able to conduct business in public at the weekly meetings where loan installments were paid. Some nonprofit microfinance programs include classes about such subjects as basic accounting and prenatal nutrition.

But though credit is a source of possibilities, it is also a bond -- potentially an oppressive one when enforced through peer pressure. Indeed, greater sensitivity to social pressure helps explain why microlenders have favored women: In many cases, they have paid back more reliably, putting up less argument than men.

Anthropological studies have found a mix of stories about the link between credit and empowerment. In some cases, women gain increments of liberation, just as hoped. After studying female microcredit users in Bangladesh in the mid-1990s, Syed Hashemi, Sidney Schuler, and Ann Riley concluded in the academic journal World Development that the Grameen Bank had empowered female borrowers on average. They wrote:

Several of the women … told the field investigators that through Grameen Bank they had "learned to talk," and now they were not afraid to talk to outsiders. In both programs some members have the opportunity to play leadership roles. One woman told the researchers, "I have been made the [borrowing group] Chief. Now all of the other women listen to me and give me their attention. Grameen Bank has made me important."

But there are also sad stories. Anthropologist Lamia Karim has documented how in Bangladesh, where most borrowers are female, women who defaulted have had their possessions -- in extreme cases, their houses -- carted off by their jointly liable peers to be sold to repay their loans.

From what I can tell from the fragmentary evidence, the most famous form of microcredit -- group-based credit as pioneered by Grameen -- is the least empowering and most fraught with risk, because of the way it marshals peer pressure to enforce loan repayment. Individual microloans, given one-on-one, without the burdens of weekly group meetings and peer pressure, appear to have less of a dark side. If microbank staff can't outsource loan decisions to the group, though, they must spend more time vetting customers, making the whole enterprise less profitable and less likely to focus on the neediest.



"Microcredit Is Immune to the Irrationalities of Mainstream Finance."

Absolutely not. The hype made it seem as if more money for microcredit were always better. But microcredit is actually more prone than conventional credit to overheating and bubbles. It suffers from two vulnerabilities: a general lack of credit bureaus to track the indebtedness of low-income people, which leaves creditors flying blind; and the irrational exuberance about microcredit as a way to help the poor, which has unleashed a flood of capital from well-meaning people and institutions.

Most of this cross-border capital flow -- some $3 billion in 2010 -- has gone straight into microloans rather than business-building activities such as training and computer purchases. The stock of outstanding microdebt has grown 30 percent or more per year in many countries. The pace has proved faster than some lenders and borrowers could safely manage. In Nicaragua, after a nationwide debtors' revolt won backing from President Daniel Ortega, the tide of defaults destroyed one of the largest microcreditors, Banex. In the last five years, bubbles have also inflated and popped in Bosnia-Herzegovina, Morocco, and parts of Pakistan. In the short run, that has been good news for borrowers who took loans and then defaulted. (After all, if the lenders lost a lot of money, that money went somewhere!) But in the long view, damaging the industry reduces access to finance.
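A quick compounding sketch shows why 30 percent annual growth is hard to manage. The 30 percent rate comes from the text above; the starting stock is normalized to 1 for illustration:

```python
# At 30 percent annual growth, outstanding debt more than triples in
# five years and grows nearly 14-fold in ten.
debt = 1.0  # outstanding microdebt, in units of the initial stock
for year in range(1, 11):
    debt *= 1.30
    if year in (5, 10):
        print(f"after year {year}: {debt:.1f}x the initial stock")
```

At that pace, credit officers, borrowers, and (mostly nonexistent) credit bureaus must absorb a market that keeps redoubling every two and a half years.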

Then there is the Indian state of Andhra Pradesh, where, before the overheated market could implode on its own, the state government in 2010 essentially shut down the industry overnight. Visiting shortly afterward, I learned of villages where microcreditors were so plentiful that they were known by the day of the week on which their clients gathered to get loans and make payments. Some women had loans for every day of the week.

The bottom line: Microfinance is no silver bullet for poverty, but it does have things to offer. The strength of the movement is not in reducing poverty or empowering women, but in building dynamic institutions that deliver inherently useful services to millions of poor people. Imagine your life without financial services: no bank account, no insurance, no loans for a house or an education; just cash in your pocket or under your mattress. Poor people transact in smaller denominations, but they have to solve financial problems at least as tough as yours. They need and deserve such services too, just as they do clean water and electricity. The microfinance movement is about building businesses and business-like nonprofits that mass-produce financial services for the poor -- not just microcredit, but microsavings, microinsurance, and micro money transfers too.

The well-meaning flood of money into microcredit distorts the industry toward overreliance on this one, risky service. It is the greatest threat to the greatest strength of microfinance as a whole. That is why the hype about microcredit has been not merely misleading but destructive. And that is why less money should go into microcredit, not more.


Think Again

Think Again: Intelligence

I served in the CIA for 28 years and I can tell you: America's screw-ups come from bad leaders, not lousy spies.

"Presidents Make Decisions Based on Intelligence."

Not the big ones. From George W. Bush trumpeting WMD reports about Iraq to this year's Republican presidential candidates vowing to set policy in Afghanistan based on the dictates of the intelligence community, Americans often get the sense that their leaders' hands are guided abroad by their all-knowing spying apparatus. After all, the United States spends about $80 billion on intelligence each year, which provides a flood of important guidance every week on matters ranging from hunting terrorists to countering China's growing military capabilities. This analysis informs policymakers' day-to-day decision-making and sometimes gets them to look more closely at problems, such as the rising threat from al Qaeda in the late 1990s, than they otherwise would.

On major foreign-policy decisions, however, whether going to war or broadly rethinking U.S. strategy in the Arab world (as President Barack Obama is likely doing now), intelligence is not the decisive factor. The influences that really matter are the ones that leaders bring with them into office: their own strategic sense, the lessons they have drawn from history or personal experience, the imperatives of domestic politics, and their own neuroses. A memo or briefing emanating from some unfamiliar corner of the bureaucracy hardly stands a chance.

Besides, one should never underestimate the influence of conventional wisdom. President Lyndon B. Johnson and his inner circle received the intelligence community's gloomy assessments of South Vietnam's ability to stand on its own feet, as well as comparably pessimistic reports from U.S. military leaders on the likely cost and time commitment of a U.S. military effort there. But they lost out to the domino theory -- the idea that if South Vietnam fell to communism, a succession of other countries in the developing world would as well. President Harry Truman decided to intervene in Korea based on the lessons of the past: the Allies' failure to stand up to the Axis powers before World War II and the West's postwar success in firmly responding to communist aggression in Greece and Berlin. President Richard Nixon's historic opening to China was shaped by his brooding in the political wilderness about great-power strategy and his place in it. The Obama administration's recent drumbeating about Iran is largely a function of domestic politics. Advice from Langley, for better or worse, had little to do with any of this.


"Bad Intelligence Led to the Iraq War."

No, bad leadership did. Intelligence may have figured prominently in Bush's selling of the invasion of Iraq, but it played almost no role in the decision itself. If the intelligence community's assessments pointed to any course of action, it was avoiding a war, not launching one.

When U.S. Secretary of State Colin Powell went before the United Nations in February 2003 to make the case for an invasion of Iraq, he argued, "Saddam Hussein and his regime are concealing their efforts to produce more weapons of mass destruction," an observation he said was "based on solid intelligence." But in a candid interview four months later, Deputy Defense Secretary Paul Wolfowitz acknowledged that weapons of mass destruction were simply "the one issue that everyone could agree on." The intelligence community was raising no alarms about the subject when the Bush administration came into office; indeed, the 2001 edition of the community's comprehensive statement on worldwide threats did not even mention the possibility of Iraqi nuclear weapons or any stockpiles of chemical or biological weapons. The administration did not request the (ultimately flawed) October 2002 intelligence estimate on Iraqi unconventional weapons programs that was central to the official case for invasion -- Democrats in Congress did, and only six senators and a handful of representatives bothered to look at it before voting on the war, according to staff members who kept custody of the copies. Neither Bush nor Condoleezza Rice, then his national security advisor, read the entire estimate at the time, and in any case the public relations rollout of the war was already under way before the document was written.

Had Bush read the intelligence community's report, he would have seen his administration's case for invasion stood on its head. The intelligence officials concluded that Saddam was unlikely to use any weapons of mass destruction against the United States or give them to terrorists -- unless the United States invaded Iraq and tried to overthrow his regime. The intelligence community did not believe, as the president claimed, that the Iraqi regime was an ally of al Qaeda, and it correctly foresaw any attempt to establish democracy in a post-Saddam Iraq as a hard, messy slog.

In a separate prewar assessment, the intelligence community judged that trying to build a new political system in Iraq would be "long, difficult and probably turbulent," adding that any post-Saddam authority would face a "deeply divided society with a significant chance that domestic groups would engage in violent conflict with each other unless an occupying force prevented them from doing so." Mentions of Iraqis welcoming U.S. soldiers with flowers, or the war paying for itself, were notably absent. Needless to say, none of that made any difference to the White House.


"Intelligence Failures Have Screwed Up U.S. Foreign Policy."

Hardly. The record of 20th-century U.S. intelligence failures is a familiar one, and mostly indisputable. But whether these failures -- or the successes -- mattered in the big picture is another question.

The CIA predicted both the outbreak and the outcome of the 1967 Six-Day War between Israel and neighboring Arab states, a feat impressive enough that it reportedly won intelligence chief Richard Helms a seat at President Johnson's Tuesday lunch table. Still, top-notch intelligence couldn't help Johnson prevent the war, which produced the basic contours of today's intractable Israeli-Palestinian conflict, and U.S. intelligence completely failed to predict Egypt's surprise attack on Israel six years later. Yet Egypt's nasty surprise in 1973 didn't stop Nixon and Secretary of State Henry Kissinger from then achieving a diplomatic triumph, exploiting the conflict to cement relations with Israel while expanding them with Egypt and the other Arab states -- all at the Soviets' expense.

U.S. intelligence also famously failed to foresee the 1979 Iranian revolution. But it was policymakers' inattention to Iran and sharp disagreements within President Jimmy Carter's administration, not bad intelligence, that kept the United States from making tough decisions before the shah's regime was at death's door. Even after months of disturbances in Iranian cities, the Carter administration -- preoccupied as it was with the Egypt-Israel peace negotiations and the Sandinistas' revolution in Nicaragua -- still had not convened any high-level policy meetings on Iran. "Our decision-making circuits were heavily overloaded," Zbigniew Brzezinski, Carter's national security advisor, later recalled.

Imperfect intelligence analysis about another coming political upheaval -- the collapse of the Soviet Union -- did not matter; the overriding influence on U.S. policy toward the USSR in the 1980s was Ronald Reagan's instincts. From the earliest days of his presidency, the notion that the Soviet Union was doomed to fail -- and soon -- was an article of faith for the 40th president. "The Russians could never win the arms race," he later wrote. "We could outspend them forever."


"U.S. Intelligence Underestimated al Qaeda Before 9/11."

No, it didn't. Like any terrorist attack, Sept. 11, 2001, was by definition a tactical intelligence failure. But though intelligence officials missed the attack, they didn't miss the threat. Years before 9/11, the intelligence community, especially the CIA, devoted unusually intense attention and effort to understanding Osama bin Laden's organization. The CIA created a special bin Laden-focused unit in early 1996, when al Qaeda was just beginning to take shape as the anti-American, transnational terrorist group we now know. President Bill Clinton stated in 1998 that "terrorism is at the top of the American agenda." He also launched a covert-action program against al Qaeda that included developing plans to capture bin Laden, even before the 1998 bombings of U.S. embassies in Africa.

When Clinton's national security officials handed over duties to their Bush administration successors, they emphasized the threat that would materialize on 9/11. Sandy Berger, the outgoing national security advisor, told Rice, "You're going to spend more time during your four years on terrorism generally and al Qaeda specifically than [on] any other issue." If more was not done in advance of 9/11 to counter the threat, it was because rallying public support for anything like a war in Afghanistan or costly, cumbersome security measures at home would have been politically impossible before terrorists struck the United States.

The most authoritative evidence of the intelligence community's pre-9/11 understanding of the subject is that same February 2001 worldwide threat statement that never mentioned Iraqi nukes or stockpiles of unconventional weapons. Instead it identified terrorism, and al Qaeda in particular, as the No. 1 threat to U.S. security -- ahead of weapons proliferation, the rise of China, and everything else. Bin Laden and his associates, the report said, were "the most immediate and serious threat" and were "capable of planning multiple attacks with little or no warning." It was all too correct.


"Hidebound Intelligence Agencies Refuse to Change."

You'd be surprised. Criticism of U.S. intelligence agencies -- at least the non-paranoid kind -- tends to portray them as stodgy bureaucracies that use their broad mandate for secrecy to shield themselves from the oversight that would make them do their jobs better. But the great majority of effective intelligence reforms have come from inside, not outside.

The organizational charts of the CIA and other U.S. intelligence agencies have undergone frequent and sometimes drastic revision, a recognition of the need to adapt to the rapidly changing world the agencies monitor and analyze. The CIA merged its analytic units covering East and West Germany in expectation of German reunification well before German unity was achieved in 1990. Other measures, such as developing greater foreign-language ability or training analysts in more sophisticated techniques, have been the focus of concentrated attention inside the agencies for years. The most effective, and probably most revolutionary, change in the intelligence community's work on terrorism was the creation of the CIA's Counterterrorist Center in 1986 -- a successful experiment that broke bureaucratic crockery, gathering previously separated collectors, analysts, and other specialists together to work side by side.

Reforms pursued from outside have received more public attention but have accomplished far less. After 9/11, the intelligence community underwent a reorganization when Congress acted on the 9/11 Commission's recommendation to make all spy agencies answerable to a single director of national intelligence. But the move has not, as hoped, unified the intelligence community, instead creating yet another agency sitting precariously atop 16 others. Because both the new director's office and the National Counterterrorism Center -- another commission recommendation -- added to, rather than replaced, existing government functions, they have further confused lines of responsibility. This much was made clear when would-be terrorist Umar Farouk Abdulmutallab tried to blow up a Detroit-bound passenger jet on Christmas Day 2009. The incident led to the same sorts of recriminations as those after 9/11, about information not being collated and dots not being connected -- only this time they were aimed at the 9/11 Commission's own creations.


"Intelligence Has Gotten Better Since 9/11."

Yes, but not for the reasons you think. Having a veritable blank check for a decade makes a difference, of course. The big post-9/11 boom in the intelligence budget -- which has doubled since 2001, according to the Senate Intelligence Committee -- has at least marginally improved the odds of discovering the next nugget of information that will enable the United States to roll up a major terrorist plot or take down a bad guy.

But it was the dramatic and obvious change in U.S. priorities following 9/11 that made the most difference. Counterterrorism, more than any other intelligence mission, depends on close collaboration with other governments, which have the critical firsthand knowledge, local police, and investigative powers that the United States usually lacks. Prior to 9/11, those governments' willingness to cooperate was often meager, especially when it meant discomfiting local interests. After 9/11, however, U.S. officials could pound on the desks of their foreign counterparts and say, "This time we really mean it." Some results of this sea change -- successes in freezing or seizing terrorists' financial assets, for example -- have been visible. Many others have been necessarily less so. Future success or failure in tracking threats such as anti-U.S. extremism in South Asia will similarly depend more on the state of U.S.-Pakistan relations than on the performance of the bureaucracy back in Washington.

Cooperation among governments' counterterrorism services has often continued despite political differences between governments themselves. Ultimately, however, such cooperation rests on the goodwill the United States enjoys and the health of its relationships around the world. As 9/11 recedes into history, states' willingness to share information is a depleting asset. We appropriately think of intelligence as an important aid to foreign policy, but we also need to remember how much foreign policy affects intelligence.


"Good Intelligence Can Save Us From Bad Surprises."

We wish. Early last February, barely a week before the Arab Spring ended the three-decade presidency of Egypt's Hosni Mubarak, Sen. Dianne Feinstein, chair of the Senate Intelligence Committee, grilled a CIA official in a Capitol Hill hearing room. "The president, the secretary of state, and the Congress are making policy decisions on Egypt, and those policymakers deserve timely intelligence analysis," Feinstein told Stephanie O'Sullivan, then the CIA's associate deputy director. "I have doubts whether the intelligence community lived up to its obligations in this area."

Feinstein was hardly the only one to criticize U.S. intelligence agencies' inability to predict the speed at which the fire lit by Tunisian fruit vendor Mohamed Bouazizi, who immolated himself on Dec. 17, 2010, would spread throughout the Arab world. But all the bureaucratic overhauls and investigative commissions in the world can't change one incontrovertible fact: Many things we would like our intelligence services to know are too complex to model or predict. What the community should be expected to provide -- and, based on the limited publicly available evidence, apparently did provide -- is a strategic understanding of conditions and attitudes that, given the right spark, could ignite into a full-blown revolution.

The most recent recriminations and inquiries are only the latest in a long line dating back to the 1941 surprise attack on Pearl Harbor. The resources devoted to intelligence have increased substantially over the past seven decades, and intelligence agencies are continually looking for ways to improve how they do their business. But no amount of moving around boxes on a flowchart can eliminate unpleasant surprises, and there will always be new challenges -- especially in an age of endlessly proliferating information.

Intelligence can help manage uncertainty, defining its scope and specifying what is known and what is likely to stay unknown. It can distinguish true uncertainty from simple ignorance by systematically assembling all available information, but it cannot eliminate uncertainty and it cannot prevent all surprises, including some big ones. Leaders must accept this reality; they must expect -- and prepare -- to be surprised.

With due acknowledgment to Donald Rumsfeld, it also means expecting unknown unknowns. Not only will we not know all the right answers -- we will not even be asking all the right questions.
