Argument

Is Afghanistan 'Medieval'?

Afghans shouldn't be insulted when Westerners say the country reminds them of the Middle Ages. The religious and political struggles of that era can offer some useful lessons.

In July 1973, Afghanistan's King Mohammed Zahir Shah was overthrown by his cousin Daud, who then abolished the monarchy and declared himself the president of a republic. The New York Times sarcastically editorialized that Afghanistan had just "leaped into the sixteenth century." Radio reports soon brought news of this slight even to provincial northern Afghanistan, where I was working at the time. Daud's government in Kabul expressed its displeasure, but an Afghan friend familiar with the region's complex history saw it differently. "We may have acted hastily," he joked. "The 15th century was pretty good around here!" Indeed, the Timurid dynasty that had its capital in Herat during that period was internationally renowned for its fine arts, monumental architecture, classical poetry -- and effective governance.

I was reminded of this story last month when the Afghan government accused Britain's new defense minister, Liam Fox, of insulting Afghanistan by describing it as a "broken 13th-century country." One Afghan official told the London Times that Fox's comments "show a lack of trust" and prove that Britain is a "colonial, orientalist, and racist country."

But Fox was hardly the first Westerner to reach for the medieval analogy when attempting to get a handle on Afghanistan. Something about Afghanistan conjures up the medieval period in the Western mind in an unreflective way, if only to express the idea that "they are not like us." For some it is a simple insult. Former Blackwater CEO Erik Prince declared that the Taliban were "barbarians" who "crawled out of the sewer" with "a 1200 A.D. mentality." (Given Prince's own fixation on the medieval Christian crusaders of the same era, perhaps the Taliban aren't the only ones with that mentality.) Yet medieval Europe, where religion still played a central role in culture and politics and state power was highly fragmented, isn't the worst analogy for understanding contemporary Afghanistan. And Europe's experience during this period might even provide some useful lessons for the country going forward.

Secular Westerners who spend any time in rural Afghanistan are struck by the continuing power of religion there. Islam still permeates all aspects of everyday social relations in rural society; nothing is separate from it. Its influence is ever present in people's ordinary conversations, business transactions, dispute resolutions, and moral judgments. There is no relationship, whether political, economic, or social, that is not validated by Islam. In such a society it is impossible to separate religion from politics. Rural Afghans cannot even conceive of the separation of religion and government because in their minds, the two are so intrinsically linked. The declaration of Afghanistan as an "Islamic Republic" upon the fall of the Taliban provoked neither domestic discussion nor concern. Any regime in Kabul that does not seize the Islamic banner for itself is vulnerable to being branded as illegitimate by its enemies, as the Soviet-backed communist government learned during the 1980s.

Christianity once played a similar all-encompassing role in medieval European life, and it took many centuries (and the emergence of rationalist secular ideologies beginning with the Enlightenment) for church and state to disentangle themselves. By the mid-20th century, Joseph Stalin could derisively ask how many divisions the pope had, but no medieval ruler could afford to be as cavalier when his legitimacy was challenged by the pontiff, even a venal one.

The Catholic Church could call on the faithful to disobey their rulers, or to crusade against non-Christian states or against declared sectarian heretics within the faith. The fear of damnation preoccupied the lives of ordinary people. It instilled a respect for the power of religious authorities and induced the faithful to donate lavishly to build churches or support monasteries. The church could also set rules prohibiting the lending of money with interest and demand religious tithes. Because the rise of the modern West was characterized by the long-term retreat of religion as the preeminent force in society, it now takes a different leap of faith to appreciate a society in which faith continues to dominate.

Rural Afghanistan remains such a place. The medieval analogy is not an exact one, of course. Afghanistan's Sunni Islam never had an institutionalized clerical hierarchy, monasteries, or religious figures with the power of a pope. But if we are talking about a cultural ethos, the analogy is not half bad -- particularly when the Taliban attempts to frame its opposition to the Kabul government as a jihad or holy war.

Western observers attempting to come to grips with Afghanistan's fragmented state authority also find themselves drawn to other aspects of the medieval era, a period in which leadership in Europe was personal rather than bureaucratic and the state's power to impose its will quite limited. In the wake of the collapse of the Western Roman Empire in the fifth century, a form of feudalism emerged that was highly decentralized. Monarchs devolved power by granting lands to regional leaders who were then obligated to provide military and political support to their superiors when asked.

The difficulty for a monarch was that the resources remained in the hands of his vassals, who then acted in their own interests. The rise of centralized states in Europe in the 16th through 18th centuries finished a process by which monarchs gradually centralized power and dispossessed their feudal nobilities. Then, at the beginning of the 19th century, the rise of the European nation-state took this process a step further and dispossessed the monarchs while keeping intact the centralized administrations they had built.

Afghanistan's history has followed a surprisingly similar pattern -- just delayed by a few centuries. Until the 18th century, the territories that comprise modern Afghanistan were peripheral parts of powerful regional empires centered in Central Asia, India, and Iran. Upon their decline or collapse in the mid-18th century, Afghanistan reverted to a feudal structure in which Afghan amirs gave land grants to vassals in exchange for military service. They also reached accommodations with various autonomous groups in hard-to-rule regions that rejected centralized rule outright.

The British dismantled this typically feudal structure when they invaded Afghanistan during the First Anglo-Afghan War (1839-1842). They lost that war, but successive Afghan amirs kept their reforms and their goal of centralized state power. By the end of the century, Amir Abdur Rahman (1880-1901) used modern weapons and a powerful army to establish the modern Afghan nation-state, one that ruled the country without intermediaries. Over the past century, however, the governments in Kabul that attempted to maintain Abdur Rahman's legacy of centralization have seen the state collapse completely at least three times: in 1929, 1992, and 2001. Power, once gained, devolved back to regional political leaders, particularly during the long civil war in the 1990s, a process the Taliban did little to reverse.

Foreigners encountering Afghanistan in the post-2001 era saw this devolution of power as an example of state failure. Many of the competitors for legitimate authority at the local level had no desire to participate in politics in a state-centered system. Autonomous tribes and ethnic groups, local militia commanders, criminal syndicates, and even blood-feuding families sought to resist state power, but they did not seek to overturn or replace it. This arrangement is analogous to medieval Europe, where kings were frequently also unable to maintain a monopoly on the legitimate use of violence, but were still able to retain their thrones.

President Hamid Karzai's weakness, which has led many journalists to dub him the "Mayor of Kabul," makes him structurally little different from those medieval European kings, who also held their capitals but did not rule their people. Similarly, Karzai's adoption of a patrimonial model of the state, in which offices and resources are redistributed on a personal basis to buy the support of existing power-holders or play them off one another, has more in common with the Holy Roman Empire than the European Union. In some ways, therefore, a thorough understanding of medieval power politics and how rulers came to centralize state authority would be of greater value to the international advisors sent to the Karzai government than a background in constitutional law or regulatory reform. At least in medieval Europe, the centralized state emerged victorious.


Argument

Hold Your Schadenfreude

If Europe's economy goes down, it's taking America's with it.

Although the crisis in the international credit markets today is not yet as severe as it was in 2007 and 2008, there are mounting signs that Europe's debt troubles are producing another global credit crunch. A top U.S. Federal Reserve official has already felt compelled to warn of dangers facing the American economy, and President Obama has said that he's monitoring the situation closely.

But what, exactly, does Europe's mess have to do with U.S. credit markets? Understanding that link is ultimately going to be the key to deciding what to do about it.

The root of the problem is the fact that Europe's banks desperately need American dollars. It's a dependency that traces back to the early 2000s, when, in the pursuit of big profits, European banks threw caution to the wind and greatly increased their holdings of dollar-denominated assets. The total value of those investments went from $10 trillion in 2000 to $34 trillion by the end of 2007.

But those assets could only be purchased through borrowing: While American banks can fund dollar-asset purchases using bank account deposits from ordinary Americans, European banks have most of their deposits in euros or other domestic currency. So they need to find greenbacks from external sources to fill this "dollar funding gap" and fuel their dollar-denominated investments.

There are several places foreign banks go to borrow dollars, but among the primary providers of dollar funding are U.S. money-market funds, those "safe-as-savings-account" investments where millions of Americans have stashed their hard-earned cash. Prior to the subprime crisis that began three years back, money markets had lent roughly $1 trillion to European banks.

Money-market funds typically offer very short-term loans to borrowers, expecting their money back (plus interest) in 30 days or less. That might have posed a problem for European banks, since they were mostly dealing in medium- to long-term dollar assets, but the Europeans got accustomed to simply "rolling over" their loans. Essentially, they borrowed to make the initial investment -- and then borrowed more to pay off the first debt when it came due, doing the same for each successive debt thereafter. Once the original investment matured, or so the theory went, the bank would cash out, pay off its last dollar debt to the money-market fund or whomever, and do with its profit what it wished.

European banks have spent the past two years reducing their stock of dollar-denominated investments, but they're still holding huge dollar assets that haven't yet matured. That's why they still rely on rolling over short-term dollar loans.

So long as credit markets are functioning normally, the "roll over" strategy is a great plan. But, if money-market funds and other dollar-funding sources become unwilling to lend, then European banks have bills due that they can't pay. That is precisely where the global financial system found itself three years ago and, unfortunately, it's where it appears to be heading today.

Beginning in the summer of 2007 and peaking in the fall of 2008, money-market funds became reluctant and then unwilling to lend to banks because they didn't know which ones might collapse under subprime losses. Today, their reluctance has nothing to do with bad mortgages; it stems instead from a general belief that the $1 trillion European bailout package announced on May 9 will not solve the problem of bank exposure to government debt markets. But the basic logic is the same: European banks have invested heavily in toxic assets that one day may not be worth the paper they are written on. The money-market folks understandably fear they may never get their dollars back if they lend to one of these exposed banks.

This renewed sense of risk has led to a feeling of déjà vu for money-market funds over the past several weeks. The amount of dollars the funds are lending has fallen to levels not seen in more than a decade. And, when they are willing to lend, they want their money back much sooner than normal. These developments are making dollar funding harder and harder to come by, particularly for European banks.

But why should the U.S. care if European banks can't get dollars? Isn't that just a European problem? Not really.

Despite efforts to reduce their exposure to Europe's debt problems, U.S. money-market funds are still owed hundreds of billions of dollars from banks across the Atlantic. All it would take is one bank default to throw the entire U.S. money market system into panic. Sound far-fetched? It's not.

A very similar scenario played out less than two years ago. During the week of September 15, 2008, Lehman Brothers infamously collapsed. At that time, Lehman owed $785 million to the Reserve Primary Fund, one of the oldest and largest money-market funds in the world. The Reserve Fund, now sitting on a massive loss, suffered the unthinkable: It "broke the buck," meaning the value of a share fell below $1, to 97 cents.

Millions of average investors panicked, facing losses on investments believed to be as safe as a savings account, and withdrew some $300 billion from the U.S. money-market system. Even worse, other funds were so spooked they stopped lending dollars to banks altogether, both domestic and foreign, depriving the global economy of a vital source of credit. For a moment, the whole system appeared to be on the edge of disaster, until the U.S. Treasury and Federal Reserve stepped in to save the day through a barrage of emergency policies that guaranteed investments and increased global dollar liquidity.

So what are the chances of a Lehman-style collapse in Europe today? No one knows exactly, but global credit market skittishness suggests a belief that such an event is a real possibility. If Greece (or another heavily debt-ridden European country, such as Spain) defaults on its debt, money-market lending could seize up entirely -- when European banks came back to the funds to try to roll over their loans, they would be rebuffed. Without access to dollar-funding markets, these banks wouldn't be able to pay back what they owe when it came due. And, as the Lehman case showed, if one money-market fund absorbs a massive loss from a defaulting European bank, investors in all such funds will run for the exits, threatening to bring down the whole system.

That's the bad news. But there's also some good news. The Fed has already taken steps to make sure the scenario portrayed above doesn't occur. How? By doing the same thing it did back in 2007 and 2008, when the global credit system faced the same problem: Three weeks ago, it opened "currency swaps," or swap lines, with major foreign central banks.

The swap lines essentially provide dollars directly to foreign central banks, which can then deliver the currency to domestic banks that are having difficulty acquiring dollars from fearful market participants, like money-market funds. In essence, the Fed is indirectly providing credit to European banks to protect U.S. money-market funds and banks that are owed money by beleaguered European financial institutions. Recent research suggests that the swap program was quite effective at reducing strains in credit markets the last time around, so there is reason for optimism.

So far, the swap lines have seen nowhere near the use they did at the height of the subprime crisis, which is also encouraging. Nonetheless, most analysts expect the situation in dollar-funding markets to get worse over the coming weeks, so their popularity may rise. If strains in credit markets spread outside of Europe, the Fed may expand the program to other economies facing difficulty.

Indeed, it appears that history might just repeat itself, as the saying goes. And while the prospect of facing Global Credit Crunch 2.0 so soon after the last one is no doubt an unpalatable scenario, at least we'll know what we're getting into.
