Argument

The American Mongols

To win the war against terrorism, the United States must overcome the burden of history.

An invading army is marching toward Baghdad -- again. The last time infidels conquered the City of Peace was in 1258, when the Mongol horde, led by Genghis Khan's grandson Hulegu, defeated the Arab Abbasid caliphate that had ruled for more than five centuries. And if the ripple effects of that episode through Islam's history are any guide, the latest invasion of Iraq will unleash a new cycle of hatred -- unless the United States can find ways to bolster the credibility of moderate Islamic thinkers.

Saddam Hussein, who has led Iraq's Baathist socialist regime for nearly 25 years, is no caliph. And U.S. forces have come as self-declared liberators, not as conquerors. Yet the U.S. invasion of Iraq resonates strongly with fundamentalist Muslims because they see Saddam's downfall -- and the broader humiliation of the Arab world at the hands of the latter-day Mongols -- as righteous punishment. Since the 13th century, Islamic theologians have argued that military defeat at the hands of unbelievers results when Muslims embrace pluralism and worldly knowledge. The story is drilled into Muslim children from Morocco to Indonesia: nearly 2 million people put to the sword; the caliph trampled to death; and the destruction of the great library, the House of Wisdom. The Ottoman Empire fell in 1918 for the same reason Muslims lost Baghdad in 1258: The rulers and their people had gone soft, approaching religion with tolerance and accommodation rather than viewing civilization as divided between Islam and infidels.

The U.S.-led invasion of secular Iraq is the ultimate vindication of this worldview, the capstone of a series of modern Muslim defeats that began with the first Gulf War and continued through the next decade with the Serbs' ethnic cleansing campaigns against Muslims in Kosovo and Bosnia and Herzegovina, the repression of Islamist groups in Algeria and Egypt, Russia's brutal military campaign against Chechen separatists, and the defeat of the Taliban in Afghanistan. Islamists see these cataclysmic events as opportunities to purify Muslim souls and to prepare for an ideological battle with the West.

Fundamentalists believe they have every reason to anticipate victory in this battle, because the story of the Mongol conquest of Baghdad didn't end in 1258. The Egyptian Mamluks were able to halt the tide of Mongol victories in the Battle of Ayn Jalut in Palestine two years later. In less than a century, the Mongol conquerors themselves converted to Islam, and Islamic power resurged in Turkey and India after being dislodged from the Arabian heartland. The lesson, according to Islamists, is that even the defeat of Muslims has a place in God's scheme for Islam's eventual supremacy in the world.

In addition to the historical narrative, Muslim fundamentalists have prophecies about the apocalypse attributed to the Prophet Mohammed to buttress their cause. These signs are described in hadith, the sayings of Mohammed passed down through oral tradition before being recorded at least 100 years after his death. One hadith that has recently captured the attention of fundamentalists is "The hour [of the world's end] shall not occur until the Euphrates will disclose a mountain of gold over which people will fight." The "mountain of gold" could be a metaphor for a valuable natural resource such as oil, and "the Euphrates" may refer to Iraq, where the river flows. Just as some Christian fundamentalists saw the creation of the state of Israel as fulfillment of biblical prophecy heralding the Day of Judgment, so too will some Muslim fundamentalists interpret the U.S. occupation of Iraq as setting the stage for the final battle between good, led by the Mahdi (the rightly guided), and evil, represented by the Dajjal (the deceiver).

Armed with prophecy and history, Islamist movements see the humiliation of fellow believers as an opportunity for mobilizing and recruiting dedicated followers. Muslims have often resorted to asymmetric warfare in the aftermath of military defeat. Palestinian leader Yasir Arafat and his Fatah movement captured the imagination of young Palestinians only after Arabs lost the Six-Day War and East Jerusalem in 1967. Islamic militancy in Kashmir can be traced to India's military victory over Pakistan in the 1971 Bangladesh war. Revenge, rather than willingness to compromise or submit to the victors, is the traditional response of theologically inclined Muslims to the defeat of Muslim armies. And for the Islamists, this battle has no front line and is not limited to a few years, or even decades. They think in terms of conflict spread over generations. A call for jihad against British rule in India, for example, resulted in an underground movement that lasted from 1830 to the 1870s, with remnants periodically surfacing well into the 20th century.

This fundamentalist interpretation of Islam has failed to penetrate the thinking of most Muslims, especially in recent times. But religious hard-liners can drive the political agenda in Muslim countries, just as Christian and Jewish fundamentalists have become a force to be reckoned with in secular nations such as the United States. And with more than 1 billion Muslims around the globe, the swelling of the fundamentalist ranks poses serious problems for the West. If only 1 percent of the world's Muslims accept uncompromising theology, and 10 percent of that 1 percent decide to commit themselves to a radical agenda, the recruitment pool for al Qaeda comes to 1 million.
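The back-of-the-envelope calculation above can be checked directly; the population figure and percentages are those cited in the text, not independent estimates:

```python
# Figures as stated in the article (illustrative, not demographic estimates)
muslim_population = 1_000_000_000  # "more than 1 billion Muslims"
hardliner_share = 0.01             # 1 percent accept uncompromising theology
radical_share = 0.10               # 10 percent of that 1 percent turn radical

recruitment_pool = muslim_population * hardliner_share * radical_share
print(f"{recruitment_pool:,.0f}")  # prints 1,000,000
```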

Suspicions about Western intentions date back to the British, who came as friends during World War I and ended up colonizing and dividing Arab lands. Thus, the Americans face the difficult task of overcoming Muslim mistrust. The United States must avoid any impulse to act as an imperial power, dictating its superior ways to "less civilized" peoples. It should be prepared to accept Islamic pride and Arab nationalism as factors in the region's politics, instead of backing narrowly based elites to do its bidding. Patient engagement, rather than the flaunting of military and financial power, should characterize this new phase of U.S. intervention in the heart of the Islamic world.

If U.S. President George W. Bush's promises of democracy in Iraq and a Palestinian state are not kept, and if the United States fails to demand reforms in countries ruled by authoritarian allies, the umma (community of believers) will have new reasons to distrust and hate. The dream of helping Muslims overcome their fear of modernity will then remain unfulfilled. And the world will continue to confront new jihads.

Argument

Downside Danger

Why the world's central banks must become more vigilant about falling prices.

If a single proposition unites central bankers these days, it is the belief that price stability -- in practice, a low and stable rate of inflation -- is the bedrock of sound monetary policy. To someone with only a passing knowledge of monetary and economic history, this idea may seem unprogressive, if not downright Victorian. In fact, its validity has been demonstrated, painfully, many times.

There is now a consensus among economic historians that a particular form of price instability -- deflation, or falling prices -- was a principal cause of the Great Depression. And nearly all economists agree that the inflationary surge in the United States, the United Kingdom, and several other countries from the late 1960s through the early 1980s was an important source of the economic volatility, slow growth, and high unemployment that characterized those years.

Determined to avoid a repeat of the Great Inflation of the 1970s, central bankers around the world have worked hard over the last two decades to achieve price stability. The U.S. Federal Reserve has reduced inflation from more than 13 percent in 1979 to the low single digits today. As part of the anti-inflation campaign, many central banks have adopted quantitative inflation objectives, including several banks, such as the European Central Bank (ECB), that do not formally classify themselves as "inflation targeters."

Nearly all industrialized nations currently have inflation rates of around 2 percent, the important exception being Japan, which is experiencing a mild deflation. Even regions traditionally prone to high inflation have substantially reduced their inflation rates. For instance, many countries in Latin America now boast rates well below 10 percent; very few have rates above 20 or 30 percent, a once common level.

Low and stable inflation in many countries is an important accomplishment that will continue to bring significant benefits. But de facto price stability has had another effect, which is now forcing central bankers, as well as the public, to fundamentally rethink inflation.

After a long period in which the desired direction for inflation was always downward, the industrialized world's central banks must today try to avoid major changes in the inflation rate in either direction. In central bank speak, we now face "symmetric" inflation risks. The Federal Reserve recognized the changed circumstances in a statement issued following the May 6, 2003, meeting of its policymaking arm, the Federal Open Market Committee (FOMC). The FOMC explicitly recognized that both upside and downside risks to inflation can exist and said the greater risk at this moment is on the downside. It was the first time in decades, if not ever, that the central bank had voiced concern that inflation might fall too low.

Why would very low inflation -- say, below 1 percent -- or actual deflation (negative inflation, or falling prices) possibly hurt the economy?

The potential harm of very low inflation or deflation depends on the economic environment. Deflation can be particularly dangerous when a financial system is shaky, with household and corporate balance sheets in poor shape and banks undercapitalized and heavily burdened with bad loans. Under such conditions, deflation increases the real burden of debts -- that is, it forces borrowers to repay in dollars that are worth more than the dollars they borrowed -- and may exacerbate the financial distress. (Unexpectedly low inflation has a similar effect.) This phenomenon, known as "debt deflation," factored prominently in the global economic turmoil of the 1930s and may have played an important role in Japan's recent troubles. Fortunately, the United States does not appear at risk of suffering a similar financial setback. American households and firms have done an excellent job in recent years of restructuring their balance sheets, and U.S. banks are well capitalized and profitable.
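A hypothetical loan makes the debt-deflation mechanism concrete. The 5 percent loan rate and 2 percent deflation rate below are illustrative assumptions, not figures from the article:

```python
# Debt-deflation sketch: falling prices raise the real burden of a fixed
# nominal debt, because the dollars used to repay buy more than before.
principal = 100.0
nominal_rate = 0.05   # hypothetical loan rate: owe $105 in one year
inflation = -0.02     # 2 percent deflation (prices fall)

repayment = principal * (1 + nominal_rate)      # dollars owed next year
real_repayment = repayment / (1 + inflation)    # in today's purchasing power
real_rate = (1 + nominal_rate) / (1 + inflation) - 1

print(round(real_repayment, 2))  # 107.14 -- more than the $105 agreed
print(round(real_rate, 4))       # 0.0714 -- a 5% loan costs over 7% in real terms
```

Unexpectedly low inflation works the same way, just less sharply: any shortfall of actual inflation below the rate borrowers anticipated raises the real cost of servicing their debts.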

Although the U.S. financial system is sound, circumstances exist under which deflation or very low inflation might still conceivably pose a significant threat to the economy. The potential problem could arise when aggregate spending by households and firms is insufficient to sustain strong economic growth, even when the short-term real interest rate (the market, or nominal, interest rate minus the rate of inflation) is zero or negative.

When aggregate demand is that weak, deflation or very low inflation places a lower limit on the real interest rate that can be engineered by monetary policymakers -- in other words, it hinders the ability of a central bank to stimulate growth. That is because the nominal rate of interest cannot go below zero. No one will lend at a negative interest rate; potential creditors will simply choose to hold cash, which pays zero nominal interest.
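The zero-bound arithmetic above can be sketched in a few lines. Since the real rate is the nominal rate minus inflation, and the nominal rate cannot fall below zero, the lowest achievable real rate is simply the negative of the inflation rate; the inflation figures used here are illustrative:

```python
def real_rate_floor(inflation):
    """Lowest real interest rate achievable given the zero bound on
    nominal rates: real = nominal - inflation, with nominal >= 0."""
    nominal_floor = 0.0
    return nominal_floor - inflation

# With 2% inflation, policy can push the real rate down to -2%.
print(real_rate_floor(0.02))   # -0.02
# With 1% deflation, the real rate is stuck at +1% no matter what
# the central bank does -- monetary stimulus hits a floor.
print(real_rate_floor(-0.01))  # 0.01
```

This is why deflation is asymmetrically dangerous: the deeper prices fall, the higher the floor on the real interest rate rises, precisely when the economy needs that rate to be low.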

The U.S. economy appears to be rebounding, in part because the Federal Reserve has kept interest rates low. I expect the recovery to continue. But were it to falter -- say, because firms cut back on new investment -- then the scenario just described might become relevant. Specifically, if spending and output growth next year proved insufficient to absorb the slack in labor and product markets, we might see further reductions in inflation, which would further restrict the Federal Reserve's already limited ability to lower the short-term real interest rate.

There are other ways the central bank can stimulate the economy. For example, it can purchase a broad range of financial assets, thereby pumping additional liquidity into the economy. However, the Federal Reserve has less experience using these methods and in predicting their effects, so implementing them would not be without cost. Hence, allowing inflation to fall too low -- low enough that it might morph into actual deflation -- would be highly undesirable from the point of view of the Federal Reserve, or any other central bank for that matter.

In short, inflation can be too high, but it can also be too low. So what level of inflation is just right -- what, if you will, is the "Goldilocks" level? The best-case scenario is when inflation is neither so high as to impede economic efficiency and growth nor so low that the nominal short-term interest rate routinely flirts with zero. What that ideal inflation rate is depends on the individual economy and on the views and preferences of policymakers.

Although the "just right" inflation rate for the U.S. economy remains an open question, much recent research suggests that it is around 2 percent. Japan's negative inflation rate is clearly too low for the country's economic health. Until recently, the ECB's inflation objective was a rate below 2 percent, leading some observers to worry that the bank was perhaps giving insufficient attention to the downside risks to inflation. Lately, however, the ECB has stated that it intends to keep inflation near 2 percent, which suggests that it is now taking a more symmetric view of inflation risks.

The conquest of inflation is an important victory for the world's central banks and a critical factor behind the improvement in economic performance over the last two decades. To continue to promote economic growth and stability in coming decades, monetary authorities will need to exercise the same vigilance with respect to the downside risks to inflation.