Where Do Bad Ideas Come From?

And why don't they go away?

We would all like to think that humankind is getting smarter and wiser and that our past blunders won't be repeated. Bookshelves are filled with such reassuring pronouncements, from the sage advice offered by Richard Neustadt and Ernest May in Thinking in Time: The Uses of History for Decision Makers to the rosy forecasts of Matt Ridley's The Rational Optimist: How Prosperity Evolves, not to mention Francis Fukuyama's famously premature claim that humanity had reached "the end of history." Encouraging forecasts such as these rest in part on the belief that we can learn the right lessons from the past and cast discredited ideas onto the ash heap of history, where they belong.

Those who think that humanity is making steady if fitful progress might point to the gradual spread of more representative forms of government, the largely successful campaign to eradicate slavery, the dramatic improvements in public health over the past two centuries, the broad consensus that market systems outperform centrally planned economies, or the growing recognition that action must be taken to address humanity's impact on the environment. An optimist might also point to the gradual decline in global violence since the Cold War. In each case, one can plausibly argue that human welfare improved as new knowledge challenged and eventually overthrew popular dogmas, including cherished but wrongheaded ideas, from aristocracy to mercantilism, that had been around for centuries.

Yet this sadly turns out to be no universal law: There is no inexorable evolutionary march that replaces our bad, old ideas with smart, new ones. If anything, the story of the last few decades of international relations can just as easily be read as the maddening persistence of dubious thinking. Like crabgrass and kudzu, misguided notions are frustratingly resilient, hard to stamp out no matter how much trouble they have caused in the past and no matter how many scholarly studies have undermined their basic claims.

Consider, for example, the infamous "domino theory," which has been kicking around in one form or another since President Dwight D. Eisenhower's 1954 "falling dominoes" speech. During the Vietnam War, plenty of serious people argued that a U.S. withdrawal from Vietnam would undermine America's credibility around the world and trigger a wave of pro-Soviet realignments. No significant dominoes fell after U.S. troops withdrew in 1975, however, and it was the Berlin Wall that eventually toppled instead. Various scholars examined the domino theory in detail and found little historical or contemporary evidence to support it.

Although the domino theory seemed to have been dealt a fatal blow in the wake of the Vietnam War, it has re-emerged, phoenix-like, in the current debate over Afghanistan. We are once again being told that if the United States withdraws from Afghanistan before achieving a clear victory, its credibility will be called into question, al Qaeda and Iran will be emboldened, Pakistan could be imperiled, and NATO's unity and resolve might be fatally compromised. Back in 2008, Secretary of State Condoleezza Rice called Afghanistan an "important test of the credibility of NATO," and President Barack Obama made the same claim in late 2009 when he announced his decision to send 30,000 more troops there. Obama also justified his decision by claiming that a Taliban victory in Afghanistan would spread instability to Pakistan. Despite a dearth of evidence to support these alarmist predictions, it's almost impossible to quash the fear that a single setback in a strategic backwater will unleash a cascade of falling dominoes.

There are other cases in which the lessons of the past -- sadly unlearned -- should have been even more obvious because they came in the form of truly devastating catastrophes. Germany's defeat in World War I, for example, should seemingly have seared into Germans' collective consciousness the lesson that trying to establish hegemony in Europe was almost certain to lead to disaster. Yet a mere 20 years later, Adolf Hitler led Germany into another world war to achieve that goal, only to suffer an even more devastating defeat.

Similarly, the French experience in Vietnam and Algeria should have taught American leaders to stay out of colonial independence struggles. In fact, French leaders warned Lyndon B. Johnson that the United States would lose in Vietnam, but the U.S. president ignored their advice and plunged into a losing war. The resulting disastrous experience in Vietnam presumably should have taught future presidents not to order the military to do "regime change" and "nation-building" in the developing world. Yet the United States has spent much of the past decade trying to do precisely that in Iraq and Afghanistan, at great cost and with scant success.

Why is it so hard for states to learn from history and, especially, from their own mistakes? And when they do learn, why are some of those lessons so easily forgotten? Moreover, why do discredited ideas come back into fashion when there is no good reason to resurrect them? Clearly, learning the right lessons -- and remembering them over time -- is a lot harder than it seems. But why?

THE LIMITS OF KNOWLEDGE

For starters, even smart people with good intentions have difficulty learning the right lessons from history because there are relatively few iron laws of foreign policy and the facts about each case are rarely incontrovertible.

And unfortunately, the theories that seek to explain what causes what are relatively crude. When a policy fails, reasonable people often disagree about why success proved elusive. Did the United States lose in Vietnam because the task was inherently too difficult, because it employed the wrong military strategy, or because media coverage undermined support back home? Interpreting an apparent success is no easier: Did violence in Iraq decline in 2007 because of the "surge" of U.S. troops, because al Qaeda affiliates there overplayed their hand, or because ethnic cleansing had created homogeneous neighborhoods that made it harder for Shiites and Sunnis to target each other? The implications for today depend on which of these interpretations you believe, which means that consensus about the "lessons" of these conflicts will be elusive and fragile.

What's more, even when past failures have discredited a policy, those who want to resurrect it can argue that new knowledge, new technology, or a clever new strategy will allow them to succeed where their predecessors failed. For more than 20 years, for example, a combination of academic economists and influential figures in the finance industry convinced many people that we had overcome the laws of economic gravity -- that sophisticated financial models and improved techniques of risk management like financial derivatives allowed governments to relax existing regulations on financial markets. This new knowledge, they argued, permitted a vast expansion of new credit with little risk of financial collapse. They were tragically wrong, of course, but a lot of smart people believed them.

Similarly, the Vietnam War did teach a generation of U.S. leaders to be wary of getting dragged into counterinsurgency wars. That cautious attitude was reflected in the so-called Powell doctrine, which dictated that the United States intervene only when its vital interests were at stake, rely on overwhelming force, and identify a clear exit strategy in advance. Yet after the U.S. military routed the Taliban in 2001, key figures in President George W. Bush's administration became convinced that the innovative use of special forces, precision munitions, and high-tech information management (together dubbed a "revolution in military affairs") would enable the United States to overthrow enemy governments quickly and cheaply and avoid lengthy occupations, in sharp contrast to past experience. The caution that inspired the Powell doctrine was cast aside, and the result was the war in Iraq, which dragged on for almost eight years, and the war in Afghanistan, where the United States seems mired in an endless occupation.

STRONG BUT FOOLISH STATES

All countries have obvious incentives to learn from past mistakes, but those that have successfully risen to the status of great powers may be less inclined to adapt quickly in the future. When it comes to learning the right lessons, paradoxically, nothing fails like prior success.

This wouldn't seem to make sense. After all, strong and wealthy states can afford to devote a lot of resources to analyzing important foreign-policy problems. But then again, when states are really powerful, the negative consequences of foolish behavior rarely prove fatal. Just as America's "Big Three" automakers were so large and dominant they could resist reform and innovation despite ample signs that foreign competition was rapidly overtaking them, strong and wealthy states can keep misguided policies in place and still manage to limp along for many years.

The history of the Soviet Union offers an apt example of this phenomenon. Soviet-style communism was woefully inefficient and brutally inhumane, and its Marxist-Leninist ideology both alarmed the capitalist world and created bitter schisms within the international communist movement. Yet the Soviet Union survived for almost 70 years and was one of the world's two superpowers for more than four decades. The United States has also suffered serious self-inflicted wounds on the foreign-policy front in recent decades, but the consequences have not been so severe as to compel a broader reassessment of the ideas and strategies that have underpinned many of these mistakes.

The tendency to cling to questionable ideas or failed practices will be particularly strong when a set of policy initiatives is bound up in a great power's ruling ideology or political culture. Soviet leaders could never quite abandon the idea of world revolution, and defenders of British and French colonialism continued to see it as the "white man's burden" or "la mission civilisatrice." Today, U.S. leaders remain stubbornly committed to the goals of nation-building and democracy promotion despite their discouraging track record with these endeavors.

Because the universal ideals of liberty and democracy are core American principles, it is hard for U.S. leaders to acknowledge that other societies cannot be readily remade in America's image. Even when U.S. leaders recognize that they cannot create "some sort of Central Asian Valhalla," as Defense Secretary Robert Gates acknowledged in 2009, they continue to spend billions of dollars trying to build democracy in Afghanistan, a largely traditional society that has never had a strong central state, let alone a democratic one.

CLOSED SOCIETIES AND CLOSED MINDS

In theory, democracies like the United States should have a built-in advantage. When governments stifle debate and restrict the public's access to information, bogus ideas and misguided policies are less likely to be exposed and either corrected or abandoned. In his masterful study of human-induced folly, Seeing Like a State: How Certain Schemes to Improve the Human Condition Have Failed, James Scott argues that great man-made disasters arise when authoritarian governments pursue radical social transformations that are based on supposedly "scientific" principles such as Marxism-Leninism or Swiss architect Le Corbusier's "urban modernism." Because such schemes epitomize a certain notion of "progress" and also enhance central control, ambitious political leaders are understandably drawn to them. But because authoritarian regimes routinely suppress dissent, these same leaders may not learn that their ambitious schemes are failing until it is too late to prevent catastrophe. In the same vein, Nobel-winning economist Amartya Sen famously argued that authoritarian regimes are more prone to mass famines because such regimes lack the accountability and feedback mechanisms that give rulers a strong incentive to identify and correct mistakes in a timely manner.

Yet democracies, though they conceal less information and hold leaders more accountable, are hardly immune to similar pathologies. We often assume that open discourse and democratic debate will winnow out foolish policies before they can do much damage, but the "marketplace of ideas" that is supposed to perform that function is far from perfect. In separate accounts of the Bush administration's successful campaign to sell the invasion of Iraq, political scientist Chaim Kaufmann and New York Times columnist Frank Rich showed how easily democratic leaders can convince skeptical publics to go to war. Bush and his colleagues built support for the invasion by framing options in deliberately biased ways, manipulating a highly deferential media, and exploiting their control over classified information. The result was a nearly nonexistent debate about the wisdom of the war, with the deck heavily stacked in the president's favor.

The same strategy works to shield leaders from accountability: By concealing information behind walls of secrecy and classification, democratic as well as nondemocratic governments can cover up embarrassing policy failures and make it difficult to learn the right lessons from past mistakes. If citizens and scholars do not know what government officials have done and what the subsequent consequences of their actions are, it is impossible for them to assess whether those hidden policies made sense. To take an obvious current example, it is impossible for outside observers to evaluate the merits of U.S. drone attacks on suspected al Qaeda leaders without detailed information about the actual success rate, including the number of missed targets and innocent civilians killed, as well as the effects of those deaths on terrorist recruitment and anti-American attitudes more generally. When Pentagon officials tell us that increased drone strikes are working, we just have to take them at their word. Maybe they're right, but if they aren't, we won't know until long after the damage has been done.

The same problem can also arise when information is widely available but a subject is considered taboo and thus outside the boundaries of acceptable public discourse. John Mearsheimer and I argue that U.S. policy in the Middle East suffers from this problem because it is nearly impossible for American policymakers and politicians to question Washington's "special relationship" with Israel or criticize Israeli policy without triggering a hostile reaction, including being smeared as an anti-Semite or a "self-hating Jew." Ironically, making it harder for U.S. officials to tell Israel when its actions are misguided is harmful to Israel; a more open discourse and a more normal relationship would be better for both countries.

A similar taboo seems to be emerging in the realm of civil-military relations. With the United States mired in two lengthy conflicts, American politicians feel a need to constantly reiterate their support for "the troops" and their respect for the generals who run our wars, especially media-savvy commanders like Gen. David Petraeus. Criticizing the military would invite others to question one's patriotism and is therefore out of bounds. This trend is not healthy: civilians who are overly deferential to the military are unlikely to question military advice, even when it might be bad for the troops as well as the country. Generals are as fallible as the rest of us and should not receive a free pass from their civilian counterparts.

In short, whenever it becomes politically dangerous to challenge prevailing orthodoxies, misguided policies are more likely to go unquestioned and uncorrected. Wouldn't it have been better if more well-placed people had objected to the U.S. decision to build massive nuclear overkill (including 30,000-plus nuclear warheads) during the Cold War, questioned the enduring fears of "monolithic communism" and Soviet military superiority, or challenged the wisdom of three decades of financial deregulation? Some did express such qualms, of course, but doing so loudly and persistently was a good way to find oneself excluded from the political mainstream and certainly from the highest corridors of power.

CUI BONO? BAD IDEAS COME FROM SOMEWHERE

Perhaps the most obvious reason why foolish ideas persist is that someone has an interest in defending or promoting them. Although open debate is supposed to weed out dubious ideas and allow facts and logic to guide the policy process, it often doesn't work that way. Self-interested actors who are deeply committed to a particular agenda can distort the marketplace of ideas.

A case in point is the long-standing U.S. embargo on Cuba, imposed in 1960 with the purpose of toppling Fidel Castro. It is hard to think of a better example of a failed policy that has remained in place for decades despite clear evidence that it is not just a failure, but actively counterproductive. If the embargo were going to bring Castro down, it surely would have happened by now, yet it is kept alive by the political influence of the Cuban-American lobby. Protectionist tariffs and farm subsidies illustrate the same problem. Every undergraduate economics major knows that these programs waste money and reduce overall consumer welfare; yet farmers, factory owners, and labor unions threatened by foreign competition routinely demand subsidies or protection, and they too often receive it. The same thing is true for costly initiatives like ballistic-missile defense, which has been assiduously promoted by aerospace and defense contractors with an obvious interest in getting the Pentagon to fill their coffers at public expense -- never mind that it might not actually work.

Even in areas where there is a clear scientific consensus, like climate change, public discourse has been distorted by well-organized campaigns to discredit the evidence and deny that any problem exists. Not surprisingly, those whose economic interests would be hurt if we significantly reduced our reliance on fossil fuels have aggressively funded such campaigns.

In the United States, this problem of self-interested individuals and groups distorting the policy process appears to be getting worse, in good part because of the growing number of think tanks and "research" organizations linked to special interests.

Organizations like the American Enterprise Institute, the Center for a New American Security, the Washington Institute for Near East Policy, and the Center for American Progress -- to name but a few -- are not politically neutral institutions, in that their ultimate purpose is to assemble and disseminate arguments that advance a particular worldview or a specific policy agenda. The people who work at these institutions no doubt see themselves as doing serious and objective analysis -- and many probably are -- but such organizations are unlikely to recruit or retain anyone whose research challenges the organization's central aims. Their raison d'être, after all, is the promotion of policies favored by their founders and sponsors.

In addition to advocating bad ideas even after they have been found wanting, many of these institutions also make it harder to hold public officials accountable for major policy blunders. For example, one would think that the disastrous war in Iraq would have discredited and sidelined the neoconservatives who dreamed up the idea and promoted it so assiduously. Once out of office, however, they returned to friendly think tanks and other inside-the-Beltway sinecures and resumed their efforts to promote the discredited policies they had favored when they were in government. When a country's foreign-policy elite is insulated from failure and hardly anyone is held accountable, it will be especially difficult to learn from the past and formulate wiser policies in the future.

The rise of the Internet and blogosphere may have facilitated more open and freewheeling public debate about controversial issues, and websites like YouTube and WikiLeaks have fostered greater transparency and made the marketplace of ideas somewhat more efficient. In the blogosphere, at least, it is no longer taboo to talk critically about the "special relationship" with Israel, even if politicians and mainstream media figures remain reticent.

Nevertheless, there is a downside to these encouraging developments. The proliferation of websites and cable news outlets encourages some people to consume only the news and analysis that reinforce their existing views. Thus, a 2010 survey by the Pew Research Center for the People and the Press found that 80 percent of those who regularly listen to radio host Rush Limbaugh or watch Fox News's Sean Hannity are conservatives, even though conservatives make up only 36 percent of the U.S. population. Similarly, the audience for MSNBC's Keith Olbermann and Rachel Maddow contains nearly twice the share of liberals found in the general public.

Moreover, competition between a growing number of news outlets seems to be fostering a media environment in which reasoned discourse matters less than entertainment value. Anyone who thinks that major issues of public policy should be dealt with on the basis of logic and evidence cannot help but be alarmed by the growing prominence of Glenn Beck and the know-nothing defiance of the Tea Party.

THE UNITED STATES OF AMNESIA

Last but not least, discredited ideas sometimes come back to life because societies simply forget important lessons about the past. Political psychologists generally agree that personal experiences have a disproportionate impact on our political beliefs, and lessons learned by older generations rarely resonate as strongly with their successors. Besides, as the years go by it becomes easier to argue that circumstances have changed and that "things are different now," encouraging the wrongheaded view that earlier lessons about how to deal with particular problems no longer hold. Of course, sometimes those arguments will be correct -- there are few timeless verities in political life -- and even seemingly unassailable truths might someday be seriously challenged if not discredited. All this just further complicates the problem of learning and retaining the right lessons from the past.

Regrettably, there is no hope of ever making the learning process work smoothly and flawlessly -- all the more reason to be wary of firmly entrenched conventional wisdoms, wherever they come from, and to relentlessly question our own judgments about the past as well.

For it just might turn out that a radically different version of events is the correct one, closer to the truth than our present reading of the past. Vigorous, unfettered, yet civil debate remains the most reliable mechanism for acquiring greater wisdom for the future. In the long run we are all dead, as John Maynard Keynes memorably quipped, but humanity could at least get something out of it.

Running the World, After the Crash

Has the era of global cooperation ended before it began?

International cooperation has stalled. From climate change and trade to nuclear nonproliferation and U.N. reform, macroeconomic rebalancing and development funding -- and the list could go on -- nearly every major initiative to solve the new century's most pressing problems has ground to a standstill amid political gridlock, summit pageantry, and perfunctory news conferences.

We're trapped in a debilitating paradox. People around the world increasingly perceive their interconnectedness and interdependence. In principle, they recognize that this implies a need for closer international cooperation. Yet governance at all levels -- public and private as well as global, national, and local -- is struggling to adapt.

For a moment after the financial crash of late 2008, humanity was seized with the transformational nature of our times. Out of the economic crisis emerged a consensus across governments and the business world that deep reforms were needed in existing systems of international cooperation. This view helped prompt the historic expansion of the G-8, which had been the world's economic steering committee, to 20 countries and the pledge of the new G-20 leaders in London to "lay the foundation for a fair and sustainable world economy."

But as the emergency has receded, so too has the appetite for fundamental reform. Numerous promises made by G-20 leaders remain unfulfilled and might be abandoned outright. They pledged to "take strong action to address the threat of dangerous climate change." They committed to "refrain from competitive devaluation of our currencies and promote a stable and well-functioning international monetary system." They promised to tackle "the social dimension of the crisis" by strengthening safety nets and implementing the International Labor Organization's global jobs agenda. And they vowed to vanquish poverty by "mobilizing all resources for development," particularly through the Millennium Development Goals. Across the board, they have either failed to do what they said they would do or come up dramatically short.

Even on financial regulatory reform, where progress has been substantial, many of the most important issues have yet to be fully addressed, from ending the notion of banks that are "too big to fail" to shining a bright light on the murky financial instruments that caused the crisis to giving the international financial system's new watchdog, the Financial Stability Board, the independent authority and capacity it needs to do its job. And on and on.

The world has already paid a severe price for its complacency about well-known systemic financial and macroeconomic risks. It would be a historic error -- a generational abdication of responsibility -- to revert to business as usual. And while we dither, other global risks are accumulating that will surely not be adequately addressed by the weak institutions we now possess, whether the challenges of water scarcity and malnutrition or those of nuclear proliferation and biodiversity loss, never mind the crises of failing education systems and stubborn unemployment or the rapidly expanding threat of chronic diseases and cyberattacks.

But we don't need a big-bang demolition and replacement of the existing international architecture, as some are advocating. Instead, we must recognize that solving problems like global warming and economic imbalances is going to require much more multifaceted solutions than are being contemplated today. The old way of tackling important issues -- holding a summit, issuing a statement, starting a new government program -- no longer cuts it.

THE MODERN INTERNATIONAL system has three main architectural features that were built in stages, one on top of the other. The first and most fundamental is the nation-state, which remains the primary instrument of international relations. The second is alliances among major nation-states, starting with the 1815 Congress of Vienna and manifested today in NATO and the Association of Southeast Asian Nations. The third is the formal multilateral U.N. system, which was largely constructed in the 1940s in the aftermath of World War II.

Nation-states and intergovernmental structures aren't going anywhere. Countries and multilateral bodies like the United Nations will continue to play a central role in global decision-making (though international institutions should certainly be reformed to better reflect the rising influence of countries like Brazil, China, and India). But they have been overtaken in key respects by the sweeping changes of the past generation. Not only have countries become more economically and environmentally interdependent, but people have, too. Around the globe, they increasingly seek ways to express themselves politically outside formal national governmental channels, whether through NGOs, business trade associations, international media outlets, or virtual professional and social networks on the Internet. They have become more aware that global problems require global trusteeship and that efforts to solve them solely through traditional means -- government to government -- might be inadequate.

We saw that clearly over the last year when we convened the World Economic Forum's Global Redesign Initiative, pulling together more than 50 global task forces to recommend solutions. What they told us varied, but one finding was unequivocal: The international system requires a new round of structural renovation, one that builds on the three existing layers. The task this time is not simply to add a new feature. Rather, it is to rewire the connections -- to make them work between old and new, and bottom up as well as top down -- so that the system overcomes the serious limitations of scale, information, and coherence that complex interdependence has exposed in it.

Let's begin with scale. Many of the most crucial collective-action problems the world faces, such as poverty, income inequality, and climate change, require the mobilization of resources on an order of magnitude that far exceeds the capability of governments and international organizations. Private financial flows from rich to poor countries, for example, are already about four times greater than foreign aid, and even this amount is but a small fraction of the funds sloshing around the world in bond and equity markets each year. But institutions like the World Bank have only begun to experiment with partnerships that tap into the far greater power of the private sector and civil society. For this to change, the bank and bilateral aid agencies such as the U.S. Agency for International Development, or even international negotiations like the U.N. climate talks, need to reposition themselves as enablers of solutions rather than direct providers of them. This will require profound changes in their skill base, management culture, and budgetary priorities, which still largely reflect the world as it was in the 1970s, before globalization and the expansion of the private sector took off.

Then there is the information gap. These days, governments and intergovernmental institutions have a hard time simply keeping up with the pace of change. An obvious example is the spread of complex financial instruments like collateralized debt obligations and credit-default swaps -- and we know how that turned out. Even billionaire investor Warren Buffett, who warned to no avail in 2003 that derivatives were "financial weapons of mass destruction," has admitted that he often doesn't understand them. Many regulators and CEOs are equally in the dark, if less candid about it. Or take the U.S. State Department, which relies on NGOs like the International Crisis Group and Human Rights Watch to report on unfolding threats to stability in undercovered parts of the world. Government officials now need to cast a much wider net, systematically embedding themselves in diverse, often informal networks of expertise. They can no longer claim to be paramount authorities in and of themselves.

Finally, coherence is a growing problem. In nearly every organization, public or private, stories abound of one department working at cross-purposes with another. But governmental bureaucracies, by their very nature, tend to put issues in discrete boxes, reflecting their organizational charts instead of the real world. This tendency often produces fragmented, partial, and sometimes even incoherent responses to problems. It is exacerbated by highly fragmented governance. Most international organizations fall under the authority of different ministers who guard their turf jealously. And unlike in a company, there is no single CEO or board to set priorities across the various bureaucracies and drive coordination, even though the issues facing the International Monetary Fund, World Trade Organization, International Energy Agency, and others are highly interconnected. In theory, G-20 leaders could propel this kind of coordination and prioritization. But so far, they haven't.

EVEN IN TODAY'S more complex political landscape, multilateral rules and organizations remain important. But three other modes of international cooperation are also crucial: high-level political commitments and objectives, involving public summit declarations in which leaders invest their personal political capital; pragmatic coalitions of the willing and able, including multiple countries or stakeholders, to make progress where it is most feasible and important; and common information metrics, like Transparency International's Corruption Perceptions Index or the U.N.'s Millennium Ecosystem Assessment, to assist with anticipating risks, shaping priorities, and benchmarking performance.

If the international community focuses only on one of these, it is much more likely to be disappointed with the results. Of course, these individual elements are not new. The novelty is in conceiving of them as a wider ecosystem of international cooperation, a set of modular building blocks that can be assembled in different combinations to improve the performance of the international system on various problems.

What does this mean in practice? Let's take climate change, perhaps the most difficult and important global problem of all, and one where the current international system's failings are on full display.

Over the last few years, the world has lost valuable time by focusing exclusively on getting a new multilateral climate treaty to replace the Kyoto Protocol, whose first set of commitments by developed countries expires in 2012. Even if countries somehow reach a breakthrough, the result likely won't be enough to stabilize the planet's average temperature at 2 degrees Celsius above preindustrial levels by midcentury, as scientists recommend. An unprecedented shift in how we harvest and use energy must take place soon -- during the next 10 to 20 years -- and based on what the political traffic is likely to bear in a 192-country U.N. negotiation, we're not going to make it.

For this reason, the post-Kyoto climate regime should look very different from its predecessor. It should be not only top-down -- a set of quantitative national commitments to reduce emissions -- but also a bottom-up set of mechanisms and initiatives capable of transforming behavior in the most relevant sectors of the world economy over the next 10 to 15 years. Governments must create clarity soon about the successor to Kyoto, but they should also be partnering furiously with business and civil society to build enabling institutions that scale up the application of efficient existing technologies (and boost research into promising new ones), create alternative livelihoods for communities inclined to convert tropical rain forests to farmland, catalyze private investment in low-carbon infrastructure, and establish common accounting and product-labeling standards that empower investors and consumers to send producers market signals that take full account of carbon-related risks and costs.

In other words, we need a results-oriented, bottom-up economic strategy to complement and enable the more procedural, top-down, political one that environment and foreign ministers have been struggling to create at the United Nations. Because about three-quarters of greenhouse gas emissions are concentrated in four sectors (power, forestry, agriculture, and transportation) and in 20 countries (essentially the G-20), a multilateral agreement is not strictly speaking necessary. But what is urgently needed is a set of public-private mechanisms to dramatically lower our reliance on carbon-intensive technologies. To build them, governments will need to organize themselves more creatively, for instance by bringing their economic teams, not just their environment ministries, to the table.

That kind of interdisciplinary, multidimensional approach could also breathe new life into other tough global fights, from the battle against unemployment and inequality to the struggle to improve education, health, and nutrition in poor countries to the race to save ocean fisheries from collapse. The key is to explore systematically how we can raise our game at every level of the emerging international system.

Here's where the G-20 comes in. Only leaders can compel the various fiefdoms of ministers and the international organizations they lord over to align strategy and systematically tap the expertise and resources of relevant nonstate actors. And only the G-20 combines enough of the most influential leaders to make this happen across the system, even as it is still struggling to contend with the aftershocks of the crisis. The global discussions about climate change, the Doha round of trade talks, IMF and World Bank reform, Millennium Development Goals funding, and macroeconomic rebalancing are all essentially blocked, unable to progress much further within their separate silos. Yet everyone -- developing, emerging, and advanced countries alike -- would be a net winner if these deals were concluded. Dwight Eisenhower, in his role as the supreme Allied commander during World War II, once said, "Whenever I run into a problem I can't solve, I always make it bigger." That's exactly what leaders need to do now: assemble a package deal whose political benefit is greater than the sum of its parts.

IN 1944, well before the end of the war but after the tide had turned, the United States hosted a series of discussions at Dumbarton Oaks in Washington, D.C., and in Bretton Woods, New Hampshire. The goal was nothing less than to design the postwar international security and economic architecture. These conferences seem almost quaint six and a half decades later. They were remarkable not only because they honed ideas that were later adopted formally by governments in the form of the United Nations and the Bretton Woods institutions, but also because of their intellectually expansive and relatively informal nature. Even as governments were busy prosecuting a world war, they assembled teams of people with governmental, academic, and business backgrounds to engage in sustained discussions aimed at drawing fundamental lessons from the failures of prewar cooperation and designing new, more robust international security and economic institutions.

Now that the worst of the economic crisis appears to be over, it's time to revive the ecumenical spirit of Dumbarton Oaks and Bretton Woods. Even as governments begin unwinding their fiscal and monetary stimulus measures, they need to do some hard thinking about how they can adapt to the sweeping changes that are rapidly passing them by.

The most important factor is going to be willpower. The 1919 Treaty of Versailles is an object lesson in what happens when countries fail to draw the appropriate lessons from history and rise above national interests in the aftermath of an international crisis. The League of Nations illustrates what happens when aspirations for a better world are not accompanied by the appropriate frameworks and resources. The diplomats, academics, and industrialists assembled at Dumbarton Oaks and Bretton Woods were haunted by these failures and determined not to repeat them. They avoided doing so by thinking in structural rather than incremental, and in systemic rather than purely parochial, terms.

The world faces the same basic challenge today. International cooperation is now everybody's business, and it's time for all of us to rise above our immediate preoccupations and consider more seriously our stake in a properly structured and resourced global system for the 21st century. We are more likely to succeed in doing so if we take a practical, multifaceted approach, focusing at least as much on the how as the what. This may prove a difficult challenge. But one thing is certain: We can't keep doing the same old thing.
