Argument

It's the Occupation, Stupid

Extensive research into the causes of suicide terrorism proves Islam isn't to blame -- the root of the problem is foreign military occupations.

Although no one wants to talk about it, 9/11 is still hurting America. That terrible day inflicted a wound of public fear that easily reopens with the smallest provocation, and it continues to bleed the United States of money, lives, and goodwill around the world. Indeed, America's response to its fear has, in turn, made Americans less safe and has inspired more threats and attacks.

In the decade since 9/11, the United States has conquered and occupied two large Muslim countries (Afghanistan and Iraq), compelled a huge Muslim army to root out a terrorist sanctuary (Pakistan), deployed thousands of Special Forces troops to numerous Muslim countries (Yemen, Somalia, Sudan, etc.), imprisoned hundreds of Muslims without recourse, and waged a massive war of ideas involving Muslim clerics to denounce violence and new institutions to bring Western norms to Muslim countries. Yet Americans still seem strangely mystified as to why some Muslims might be angry about this situation.

In a narrow sense, America is safer today than on 9/11. There has not been another attack on the same scale. U.S. defenses have improved: immigration controls and airport security are tighter, and potentially devastating domestic plots have been disrupted.

But in a broader sense, America has become perilously unsafe. Each month, there are more suicide terrorists trying to kill Americans and their allies in Afghanistan, Iraq, and other Muslim countries than in all the years before 2001 combined. From 1980 to 2003, there were 343 suicide attacks around the world, and at most 10 percent were anti-American in inspiration. Since 2004, there have been more than 2,000, over 91 percent of them against U.S. and allied forces in Afghanistan, Iraq, and other countries.

Yes, these attacks are overseas and mostly focused on military and diplomatic targets. So too, however, were the anti-American suicide attacks before 2001. It is important to remember that the 1995 and 1996 bombings of U.S. troops in Saudi Arabia, the 1998 bombings of the U.S. embassies in Kenya and Tanzania, and the 2000 bombing of the USS Cole in Yemen were the crucial dots that showed the threat was rising prior to 9/11. Today, such dots are occurring by the dozens every month. So why is nobody connecting them?

U.S. military policies have not stopped the rising wave of extremism in the Muslim world. The reason has not been lack of effort, or lack of bipartisan support for aggressive military policies, or lack of funding, or lack of genuine patriotism.

No. Something else is creating the mismatch between America's effort and the results.

For nearly a decade, Americans have been waging a long war against terrorism without much serious public debate about what is truly motivating terrorists to kill them. In the immediate aftermath of the 9/11 attacks, this was perfectly explicable -- the need to destroy al Qaeda's camps in Afghanistan was too urgent to await sober analyses of root causes.

But the absence of public debate did not stop the great need to know or, perhaps better to say, to "understand" the events of that terrible day. In the years before 9/11, few Americans gave much thought to what drives terrorism -- a subject long relegated to the fringes of the media, government, and universities. And few were willing to wait for new studies, the collection of facts, and the dispassionate assessment of alternative causes. Terrorism produces fear and anger, and these emotions are not patient.

A simple narrative was readily available, and a powerful conventional wisdom began to exert its grip. Because the 9/11 hijackers were all Muslims, it was easy to presume that Islamic fundamentalism was the central motivating force driving the 19 hijackers to kill themselves in order to kill Americans. Within weeks after the 9/11 attacks, surveys of American attitudes showed that this presumption was fast congealing into a hard reality in the public mind. Americans immediately wondered, "Why do they hate us?" and almost as immediately came to the conclusion that it was because of "who we are, not what we do." As President George W. Bush said in his first address to Congress after the 9/11 attacks: "They hate our freedoms: our freedom of religion, our freedom of speech, our freedom to vote and assemble and disagree with each other."

Thus was unleashed the "war on terror."

The narrative of Islamic fundamentalism did more than explain why America was attacked and encourage war against Iraq. It also pointed toward a simple, grand solution. If Islamic fundamentalism was driving the threat and if its roots grew from the culture of the Arab world, then America had a clear mission: to transform Arab societies -- with Western political institutions and social norms as the ultimate antidote to the virus of Islamic extremism.

This narrative had a powerful effect on support for the invasion of Iraq. Opinion polls show that for years before the invasion, more than 90 percent of the U.S. public believed that Saddam Hussein was harboring weapons of mass destruction (WMD). But this belief alone was not enough to push significant numbers to support war.

What really changed after 9/11 was the fear that anti-American Muslims desperately wanted to kill Americans and so any risk that such extremists would get weapons of mass destruction suddenly seemed too great. Although few Americans feared Islam before 9/11, by the spring of 2003, a near majority -- 49 percent -- strongly perceived that half or more of the world's 1.4 billion Muslims were deeply anti-American, and a similar fraction also believed that Islam itself promoted violence. No wonder there was little demand by congressional committees or the public at large for a detailed review of intelligence on Iraq's WMD prior to the invasion.

The goal of transforming Arab societies into true Western democracies had powerful effects on U.S. commitments to Afghanistan and Iraq. Constitutions had to be written; elections held; national armies built; entire economies restructured. Traditional barriers against women had to be torn down. Most important, all these changes also required domestic security, which meant maintaining approximately 150,000 U.S. and coalition ground troops in Iraq for many years and increasing the number of U.S. and Western troops in Afghanistan each year from 2003 on.

Put differently, adopting the goal of transforming Muslim countries is what created the long-term military occupation of Iraq and Afghanistan. Yes, the United States would almost surely have sought to create a stable order after toppling the regimes in these countries in any case. However, in both, America's plans quickly went far beyond merely changing leaders or ruling parties; only by creating Western-style democracies in the Muslim world could Americans defeat terrorism once and for all.

There's just one problem: We now know that this narrative is not true.

New research provides strong evidence that suicide terrorism such as that of 9/11 is particularly sensitive to foreign military occupation, not to Islamic fundamentalism or to any ideology independent of this crucial circumstance. Although this pattern began to emerge in the 1980s and 1990s, a wealth of new data presents a powerful picture.

More than 95 percent of all suicide attacks are in response to foreign occupation, according to extensive research that we conducted at the University of Chicago's Project on Security and Terrorism, where we examined every one of the over 2,200 suicide attacks across the world from 1980 to the present day. As the United States has occupied Afghanistan and Iraq, which have a combined population of about 60 million, total suicide attacks worldwide have risen dramatically -- from about 300 from 1980 to 2003, to 1,800 from 2004 to 2009. Further, over 90 percent of suicide attacks worldwide are now anti-American. The vast majority of suicide terrorists hail from the local region threatened by foreign troops, which is why 90 percent of suicide attackers in Afghanistan are Afghans.
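To make the arithmetic behind such figures concrete, the sketch below shows how headline shares of this kind can be computed from an attack-level dataset. It is a minimal illustration only, assuming a hypothetical record layout with analyst-coded flags; the field names and toy records are placeholders, not the Chicago project's actual data or code.

```python
# Minimal sketch (illustrative only): computing headline shares such as
# "X percent of suicide attacks occurred under foreign occupation" from an
# attack-level dataset. All field names and records below are hypothetical.

from dataclasses import dataclass

@dataclass
class Attack:
    year: int
    country: str
    under_foreign_occupation: bool  # analyst-coded flag (hypothetical)
    targets_us_or_allies: bool      # analyst-coded flag (hypothetical)

# Placeholder records standing in for a full attack-level dataset.
attacks = [
    Attack(2002, "CountryA", False, False),
    Attack(2005, "CountryB", True, True),
    Attack(2007, "CountryB", True, True),
    Attack(2008, "CountryC", True, False),
]

def share(records, predicate):
    """Fraction of records satisfying `predicate` (0.0 for an empty list)."""
    if not records:
        return 0.0
    return sum(1 for r in records if predicate(r)) / len(records)

# Share of all attacks coded as occurring under foreign occupation.
occupation_share = share(attacks, lambda a: a.under_foreign_occupation)

# Share of post-2003 attacks directed at U.S. or allied forces.
post_2003 = [a for a in attacks if a.year >= 2004]
anti_american_share = share(post_2003, lambda a: a.targets_us_or_allies)

print(f"Share under foreign occupation: {occupation_share:.0%}")
print(f"Post-2003 share against U.S./allied targets: {anti_american_share:.0%}")
```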

Israelis have their own narrative about terrorism, which holds that Arab fanatics seek to destroy the Jewish state because of what it is, not what it does. But since Israel withdrew its army from Lebanon in May 2000, there has not been a single Lebanese suicide attack. Similarly, since Israel withdrew from Gaza and large parts of the West Bank, Palestinian suicide attacks are down over 90 percent.

Some have disputed the causal link between foreign occupation and suicide terrorism, pointing out that some occupations by foreign powers have not resulted in suicide bombings -- for example, critics often cite post-World War II Japan and Germany. Our research addresses these criticisms by identifying the two factors that determine the likelihood of suicide terrorism being employed against an occupying force.

The first factor is social distance between the occupier and occupied. The wider the social distance, the more the occupied community may fear losing its way of life. Although other differences may matter, research shows that resistance to occupations is especially likely to escalate to suicide terrorism when there is a difference between the predominant religion of the occupier and the predominant religion of the occupied.

Religious difference matters not because some religions are predisposed to suicide attacks. Indeed, there are religious differences even in purely secular suicide attack campaigns, such as that of the LTTE (Hindu) against the Sinhalese (Buddhist).

Rather, religious difference matters because it enables terrorist leaders to claim that the occupier is motivated by a religious agenda that can scare both secular and religious members of a local community -- this is why Osama bin Laden never misses an opportunity to describe U.S. occupiers as "crusaders" motivated by a Christian agenda to convert Muslims, steal their resources, and change the local population's way of life.

The second factor is prior rebellion. Suicide terrorism is typically a strategy of last resort, often used by weak actors when other, non-suicidal methods of resistance to occupation fail. This is why we see suicide attack campaigns so often evolve from ordinary terrorist or guerrilla campaigns, as in the cases of Israel and Palestine, the Kurdish rebellion in Turkey, or the LTTE in Sri Lanka.

One of the most important findings from our research is that empowering local groups can reduce suicide terrorism. In Iraq, the surge's success was not the result of increased U.S. military control of Anbar province, but of the empowerment of Sunni tribes, commonly called the Anbar Awakening, which enabled Iraqis to provide for their own security. On the other hand, taking power away from local groups can escalate suicide terrorism. In Afghanistan, U.S. and Western forces began to exert more control over the country's Pashtun regions starting in early 2006, and suicide attacks dramatically escalated from this point on.

The research suggests that U.S. interests would be better served through a policy of offshore balancing. Some scholars have taken issue with this approach, arguing that keeping boots on the ground in South Asia is essential for U.S. national security. Proponents of this strategy fail to realize how U.S. ground forces often inadvertently produce more anti-American terrorists than they kill. In 2000, before the occupations of Iraq and Afghanistan, there were 20 suicide attacks around the world, and only one (against the USS Cole) was directed against Americans. In the last 12 months, by comparison, 300 suicide attacks have occurred, and over 270 were anti-American. We simply must face the reality that, no matter how well-intentioned, the current war on terror is not serving U.S. interests.

The United States has been great in large part because it respects understanding and discussion of important ideas and concepts, and because it is free to change course. Intelligent decisions require putting all the facts before us and considering new approaches. The first step is recognizing that occupations in the Muslim world don't make Americans any safer -- in fact, they are at the heart of the problem.


Argument

Five Zombie Economic Ideas That Refuse to Die

Two years after the financial crisis, the U.S. economy has steered clear of total disaster, with the Dow Jones industrial average having regained much of the ground lost in the crash. But the theories that caused it all are still out there, lurking in the shadows.

The global financial crisis that began with the collapse of the U.S. subprime mortgage market in 2007 ended by revealing that most of the financial enterprises that had dominated the global economy for decades were speculative ventures that were, if not insolvent, at least not creditworthy.

Much the same can be said of many of the economic ideas that guided policymakers in the decades leading up to the crisis. Economists who based their analysis on these ideas contributed to the mistakes that caused the crisis, failed to predict it or even recognize it when it was happening, and had nothing useful to offer as a policy response. If one thing seemed certain, it was that the dominance of the financial sector, as well as of the ideas that gave it such a central role in the economy, was dead for good.

Three years later, however, the banks and insurance companies bailed out on such a massive scale by governments (and ultimately the citizens who must pay higher taxes for reduced services) have returned, in zombie form. The same reanimation process has taken place in the realm of ideas. Theories, factual claims, and policy proposals that seemed dead and buried in the wake of the crisis are now clawing their way through the soft earth, ready to wreak havoc once again.

Five of these zombie ideas seem worthy of particular attention and, if possible, final burial. Together they form a package that may be called "market liberalism" or, more pejoratively, "neoliberalism." Market liberalism dominated public policy for more than three decades, from the 1970s to the global financial crisis. Even now, it dominates the thinking of the policymakers called on to respond to its failures. The five ideas are:

The Great Moderation: the idea that the period beginning in 1985 was one of unparalleled macroeconomic stability that could be expected to endure indefinitely.

Even when it was alive, this idea depended on some dubious statistical arguments and a willingness to ignore the crises that afflicted many developing economies in the 1990s. But the Great Moderation was too convenient to cavil at.

Of all the ideas I have tried to kill, this one seems most self-evidently refuted by the crisis. If double-digit unemployment rates and the deepest recession since the 1930s don't constitute an end to moderation, what does? Yet academic advocates of the Great Moderation hypothesis, such as Olivier Coibion and Yuriy Gorodnichenko, have stuck to their guns, calling the financial crisis a "transitory volatility blip."

More importantly, central banks and policymakers are planning a return to business as usual as soon as the crisis is past. Here, "business as usual" means the policy package of central bank independence, inflation targeting, and reliance on interest rate adjustments -- a package that failed so spectacularly in the crisis. Speaking at a symposium for the 50th anniversary of the Reserve Bank of Australia this year, European Central Bank head Jean-Claude Trichet offered the following startlingly complacent analysis:

We are emerging from the uncharted waters navigated over the past few years. But as central bankers we are always faced with new episodes of turbulence in the economic and financial environment. While we grapple with how to deal with ever new challenges, we must not forget the fundamental tenets that we have learned over the past decades. Keeping inflation expectations anchored remains of paramount importance, under exceptional circumstances even more than in normal times. Our framework has been successful in this regard thus far.

The Efficient Markets Hypothesis: the idea that the prices generated by financial markets represent the best possible estimate of the value of any investment. (In the version most relevant to public policy, the efficient markets hypothesis states that it is impossible to outperform market valuations on the basis of any public information.)
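For illustration, one common textbook formalization of this public-information version (the notation here is an assumption, not drawn from the article, and it abstracts from dividends) says that, once the required return is accounted for, prices already embody everything in the public information set, so no trading rule based on that information earns expected abnormal profits:

\[
\mathbb{E}\!\left[\,p_{t+1} \mid \Omega_t\,\right] = (1 + r_t)\,p_t
\quad\Longleftrightarrow\quad
\mathbb{E}\!\left[\,\tfrac{p_{t+1} - p_t}{p_t} - r_t \;\middle|\; \Omega_t\,\right] = 0,
\]

where \(\Omega_t\) is the set of publicly available information at time \(t\), \(p_t\) is the asset's price, and \(r_t\) is its required (risk-adjusted) rate of return.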

Support for the efficient markets hypothesis has always relied more on its consistency with free market ideas in general than on clear empirical evidence.

The absurdities of the late 1990s dot-com bubble and bust ought to have killed the notion. But, given the financial sector's explosive growth and massive profitability in the early 2000s, the hypothesis was too convenient to give up.

Some advocates developed elaborate theories to show that the billion-dollar values placed on companies delivering dog food over the Internet were actually rational. Others simply treated the dot-com bubble as the exception that proves the rule.

Either way, the lesson was the same: Governments should leave financial markets to work their magic without interference. That lesson was followed with undiminished faith until it came to the edge of destroying the global economy in late 2008.

Even now, however, when the efficient financial markets hypothesis should be discredited once and for all, and when few are willing to advocate it publicly, it lives on in zombie form. This is most evident in the attention paid to ratings agencies and bond markets in discussion of the "sovereign debt crisis" in Europe, despite the fact that it was the failure of these very institutions, as well as the speculative bubble they helped generate, that created the crisis in the first place.

Dynamic Stochastic General Equilibrium (DSGE): the idea that macroeconomic analysis should not be concerned with observable realities like booms and slumps, but with the theoretical consequences of optimizing behavior by perfectly rational (or almost perfectly rational) consumers, firms, and workers.

DSGE macro arose out of the breakdown of the economic synthesis that informed public policy in the decades after World War II, a synthesis that combined Keynesian macroeconomics with neoclassical microeconomics. In the wake of the stagflation of the 1970s, critics of John Maynard Keynes like University of Chicago economist Robert Lucas argued that macroeconomic analysis of employment and inflation could only work if it were based on the same microeconomic foundations used to analyze individual markets and the way these markets interacted to produce a general equilibrium.

The result was a thing of intellectual beauty, compared by the IMF's chief economist, Olivier Blanchard, to a haiku. By adding just the right twists to the model, it was possible to represent booms and recessions, at least on the modest scale that prevailed during the Great Moderation, and to derive support for the prevailing approach to monetary policy.

But when the crisis came, all this sophistication proved useless. It was not just that DSGE models failed to predict the crisis. They also contributed nothing to the discussion of policy responses, which has all been conducted with reference to simple Keynesian and classical models that can be described by the kinds of graphs found in introductory textbooks.

Economist Paul Krugman and others have written that the profession has mistaken beauty for truth. We need macroeconomic analysis that is more realistic, even if it is less rigorous. But the supertanker of an academic research agenda is hard to turn, and the DSGE approach has steamed on, unaffected by its failure in practice. Google Scholar lists 2,600 articles on DSGE macro published since 2009, and many more are on the way.

The Trickle-Down Hypothesis: the idea that policies that benefit the wealthy will ultimately help everybody.

Unlike some of the zombie ideas discussed here, trickle-down economics has long been with us. The term itself seems to have been coined by cowboy performer Will Rogers, who observed of U.S. President Herbert Hoover's Depression-era recovery policies: "The money was all appropriated for the top in the hopes that it would trickle down to the needy. Mr. Hoover ... [didn't] know that money trickled up."

Trickle-down economics was conclusively refuted by the experience of the postwar economic golden age. During this "Great Compression," massive reductions in inequality brought about by strong unions and progressive taxes coexisted with full employment and sustained economic growth.

Whatever the evidence, an idea as convenient to the rich and powerful as trickle-down economics can't be kept down for long. As inequality grew in the 1980s, supply-siders and Chicago school economists promised that, sooner or later, everyone would benefit. This idea gained more support during the triumphalist years of the 1990s, when, for the only time since the breakdown of Keynesianism in the 1970s, the benefits of growth were widely spread, and when stock-market booms promised to make everyone rich.

The global financial crisis marks the end of an economic era and provides a vantage point from which to survey how the benefits of economic growth have been shared since the 1970s. The answers are striking. Most of the benefits of U.S. economic growth went to those in the top percentile of the income distribution. By 2007, just one out of 100 Americans received nearly a quarter of all personal income, more than the bottom 50 percent of households put together.

The rising tide of wealth has conspicuously failed to lift all boats. Median household income has actually declined in the United States over the last decade and has been stagnant since the 1970s. Wages for males with a high school education have fallen substantially over the same period.

Whatever the facts, there will always be plenty of advocates for policies that favor the rich. Economics commentator Thomas Sowell provides a fine example, observing, "If mobility is defined as being free to move, then we can all have the same mobility, even if some end up moving faster than others and some of the others do not move at all."

Translating to the real world, if we observe one set of children born into wealthy families, with parents willing and able to provide high-quality schooling and "legacy" admission to the Ivy League universities they attended, and another set whose parents struggle to put food on the table, we should not be concerned that members of the first group almost invariably do better. After all, some people from very disadvantaged backgrounds achieve success, and there was no law preventing the rest from doing so.

Contrary to the cherished beliefs of most Americans, the United States has less social mobility than any other developed country. As Ron Haskins and Isabel Sawhill of the Brookings Institution have shown, 42 percent of American men with fathers in the bottom fifth of the income distribution remain there, compared with 25 percent in Denmark, 26 percent in Sweden, 28 percent in Finland, 28 percent in Norway, and 30 percent in Britain. The American Dream is fast becoming a myth.

Privatization: the idea that nearly any function now undertaken by government could be done better by private firms.

The boundaries between the private and public sectors have always shifted back and forth, but the general tendency since the late 19th century has been for the state's role to expand, to correct the limitations and failures of market outcomes. Beginning with Prime Minister Margaret Thatcher's government in 1980s Britain, there was a concerted global attempt to reverse this process. The theoretical basis for privatization rested on the efficient markets hypothesis, according to which private markets would always yield better investment decisions and more efficient operations than public-sector planners.

The political imperative derived from the "fiscal crisis of the state" that arose when the growing commitments of the welfare state ran into the end of the sustained economic growth on which it was premised. The crisis manifested itself in the "tax revolts" of the 1970s and 1980s, epitomized by California's Proposition 13, the ultimate source of the state's current crisis.

Even in its heyday, privatization failed to deliver on its promises. Public enterprises were sold at prices that failed to recompense governments for the loss of their earnings. Rather than introducing a new era of competition, privatization commonly replaced public monopolies with private monopolies, which have sought all kinds of regulatory arbitrage to maximize their profits. Australia's Macquarie Bank, which specializes in such monopoly assets and is known as the "millionaires' factory," has shown particular skill in jacking up prices and charges in ways not anticipated by governments undertaking privatization.

Privatization failed even more spectacularly in the 21st century. Several high-profile privatizations, including those of Air New Zealand and Britain's Railtrack, were reversed. Then, in the chaos of the global financial crisis, giants like General Motors and American International Group (AIG) sought the protection of government ownership.

Sensible proponents of the mixed economy have never argued that privatization should be opposed in all cases. As circumstances change, government involvement in some areas of the economy becomes more desirable, in others less so. But the idea that change should always be in the direction of greater private ownership deserves to be consigned to the graveyard of dead ideas.

Despite being spectacularly discredited by the global financial crisis, the ideas of market liberalism continue to guide the thinking of many, if not most, policymakers and commentators. In part, that is because these ideas are useful to rich and powerful interest groups. In part, it reflects the inherent tenacity of intellectual commitments.

Most importantly, though, the survival of these zombie ideas reflects the absence of a well-developed alternative. Economics must take new directions in the 21st century if we are to avoid a repetition of the recent crisis.

Most obviously, there needs to be a shift from rigor to relevance. The prevailing emphasis on mathematical and logical rigor has given economics an internal consistency that is missing in other social sciences. But there is little value in being consistently wrong.

Similarly, there needs to be a shift from efficiency to equity. Three decades in which market liberals have pushed policies based on ideas of efficiency and claims about the efficiency of financial markets have not produced much in the way of improved economic performance, but they have led to drastic increases in inequality, particularly in the English-speaking world. Economists need to return their attention to policies that will generate a more equitable distribution of income.

Finally, with the collapse of yet another economic "new era," it is time for the economics profession to display more humility and less hubris. More than two centuries after Adam Smith, economists have to admit the force of Socrates's observation that "The wisest man is he who knows that he knows nothing."

Every crisis is an opportunity. The global financial crisis gives the economics profession the chance to bury the zombie ideas that led the world into crisis and to produce a more realistic, humble, and above all socially useful body of thought.
