Think Again: Intelligence

I served in the CIA for 28 years and I can tell you: America's screw-ups come from bad leaders, not lousy spies.

"Presidents Make Decisions Based on Intelligence."

Not the big ones. From George W. Bush trumpeting WMD reports about Iraq to this year's Republican presidential candidates vowing to set policy in Afghanistan based on the dictates of the intelligence community, Americans often get the sense that their leaders' hands are guided abroad by their all-knowing spying apparatus. After all, the United States spends about $80 billion on intelligence each year, which provides a flood of important guidance every week on matters ranging from hunting terrorists to countering China's growing military capabilities. This analysis informs policymakers' day-to-day decision-making and sometimes gets them to look more closely at problems, such as the rising threat from al Qaeda in the late 1990s, than they otherwise would.

On major foreign-policy decisions, however, whether going to war or broadly rethinking U.S. strategy in the Arab world (as President Barack Obama is likely doing now), intelligence is not the decisive factor. The influences that really matter are the ones that leaders bring with them into office: their own strategic sense, the lessons they have drawn from history or personal experience, the imperatives of domestic politics, and their own neuroses. A memo or briefing emanating from some unfamiliar corner of the bureaucracy hardly stands a chance.

Besides, one should never underestimate the influence of conventional wisdom. President Lyndon B. Johnson and his inner circle received the intelligence community's gloomy assessments of South Vietnam's ability to stand on its own feet, as well as comparably pessimistic reports from U.S. military leaders on the likely cost and time commitment of a U.S. military effort there. But they lost out to the domino theory -- the idea that if South Vietnam fell to communism, a succession of other countries in the developing world would as well. President Harry Truman decided to intervene in Korea based on the lessons of the past: the Allies' failure to stand up to the Axis powers before World War II and the West's postwar success in firmly responding to communist aggression in Greece and Berlin. President Richard Nixon's historic opening to China was shaped by his brooding in the political wilderness about great-power strategy and his place in it. The Obama administration's recent drumbeating about Iran is largely a function of domestic politics. Advice from Langley, for better or worse, had little to do with any of this.

"Bad Intelligence Led to the Iraq War."

No, bad leadership did. Intelligence may have figured prominently in Bush's selling of the invasion of Iraq, but it played almost no role in the decision itself. If the intelligence community's assessments pointed to any course of action, it was avoiding a war, not launching one.

When U.S. Secretary of State Colin Powell went before the United Nations in February 2003 to make the case for an invasion of Iraq, he argued, "Saddam Hussein and his regime are concealing their efforts to produce more weapons of mass destruction," an observation he said was "based on solid intelligence." But in a candid interview four months later, Deputy Defense Secretary Paul Wolfowitz acknowledged that weapons of mass destruction were simply "the one issue that everyone could agree on." The intelligence community was raising no alarms about the subject when the Bush administration came into office; indeed, the 2001 edition of the community's comprehensive statement on worldwide threats did not even mention the possibility of Iraqi nuclear weapons or any stockpiles of chemical or biological weapons. The administration did not request the (ultimately flawed) October 2002 intelligence estimate on Iraqi unconventional weapons programs that was central to the official case for invasion -- Democrats in Congress did, and only six senators and a handful of representatives bothered to look at it before voting on the war, according to staff members who kept custody of the copies. Neither Bush nor Condoleezza Rice, then his national security advisor, read the entire estimate at the time, and in any case the public relations rollout of the war was already under way before the document was written.

Had Bush read the intelligence community's report, he would have seen his administration's case for invasion stood on its head. The intelligence officials concluded that Saddam was unlikely to use any weapons of mass destruction against the United States or give them to terrorists -- unless the United States invaded Iraq and tried to overthrow his regime. The intelligence community did not believe, as the president claimed, that the Iraqi regime was an ally of al Qaeda, and it correctly foresaw that any attempt to establish democracy in a post-Saddam Iraq would be a hard, messy slog.

In a separate prewar assessment, the intelligence community judged that trying to build a new political system in Iraq would be "long, difficult and probably turbulent," adding that any post-Saddam authority would face a "deeply divided society with a significant chance that domestic groups would engage in violent conflict with each other unless an occupying force prevented them from doing so." Mentions of Iraqis welcoming U.S. soldiers with flowers, or the war paying for itself, were notably absent. Needless to say, none of that made any difference to the White House.

"Intelligence Failures Have Screwed Up U.S. Foreign Policy."

Hardly. The record of 20th-century U.S. intelligence failures is a familiar one, and mostly indisputable. But whether these failures -- or the successes -- mattered in the big picture is another question.

The CIA predicted both the outbreak and the outcome of the 1967 Six-Day War between Israel and neighboring Arab states, a feat impressive enough that it reportedly won intelligence chief Richard Helms a seat at President Johnson's Tuesday lunch table. Still, top-notch intelligence couldn't help Johnson prevent the war, which produced the basic contours of today's intractable Israeli-Palestinian conflict, and U.S. intelligence completely failed to predict Egypt's surprise attack on Israel six years later. Yet Egypt's nasty surprise in 1973 didn't stop Nixon and Secretary of State Henry Kissinger from then achieving a diplomatic triumph, exploiting the conflict to cement relations with Israel while expanding them with Egypt and the other Arab states -- all at the Soviets' expense.

U.S. intelligence also famously failed to foresee the 1979 Iranian revolution. But it was policymakers' inattention to Iran and sharp disagreements within President Jimmy Carter's administration, not bad intelligence, that kept the United States from making tough decisions before the shah's regime was at death's door. Even after months of disturbances in Iranian cities, the Carter administration -- preoccupied as it was with the Egypt-Israel peace negotiations and the Sandinistas' revolution in Nicaragua -- still had not convened any high-level policy meetings on Iran. "Our decision-making circuits were heavily overloaded," Zbigniew Brzezinski, Carter's national security advisor, later recalled.

Imperfect intelligence analysis about another coming political upheaval -- the collapse of the Soviet Union -- did not matter; the overriding influence on U.S. policy toward the USSR in the 1980s was Ronald Reagan's instincts. From the earliest days of his presidency, the notion that the Soviet Union was doomed to fail -- and soon -- was an article of faith for the 40th president. "The Russians could never win the arms race," he later wrote. "We could outspend them forever."

"U.S. Intelligence Underestimated al Qaeda Before 9/11."

No, it didn't. Like any terrorist attack, Sept. 11, 2001, was by definition a tactical intelligence failure. But though intelligence officials missed the attack, they didn't miss the threat. Years before 9/11, the intelligence community, especially the CIA, devoted unusually intense attention and effort to understanding Osama bin Laden's organization. The CIA created a special bin Laden-focused unit in early 1996, when al Qaeda was just beginning to take shape as the anti-American, transnational terrorist group we now know. President Bill Clinton stated in 1998 that "terrorism is at the top of the American agenda." He also launched a covert-action program against al Qaeda that included developing plans to capture bin Laden, even before the 1998 bombings of U.S. embassies in Africa.

When Clinton's national security officials handed over duties to their Bush administration successors, they emphasized the threat that would materialize on 9/11. Sandy Berger, the outgoing national security advisor, told Rice, "You're going to spend more time during your four years on terrorism generally and al Qaeda specifically than [on] any other issue." If more was not done in advance of 9/11 to counter the threat, it was because rallying public support for anything like a war in Afghanistan or costly, cumbersome security measures at home would have been politically impossible before terrorists struck the United States.

The most authoritative evidence of the intelligence community's pre-9/11 understanding of the subject is that same February 2001 worldwide threat statement that never mentioned Iraqi nukes or stockpiles of unconventional weapons. Instead it identified terrorism, and al Qaeda in particular, as the No. 1 threat to U.S. security -- ahead of weapons proliferation, the rise of China, and everything else. Bin Laden and his associates, the report said, were "the most immediate and serious threat" and were "capable of planning multiple attacks with little or no warning." It was all too correct.

"Hidebound Intelligence Agencies Refuse to Change."

You'd be surprised. Criticism of U.S. intelligence agencies -- at least the non-paranoid kind -- tends to portray them as stodgy bureaucracies that use their broad mandate for secrecy to shield themselves from the oversight that would make them do their jobs better. But the great majority of effective intelligence reforms have come from inside, not outside.

The organizational charts of the CIA and other U.S. intelligence agencies have undergone frequent and sometimes drastic revision, a recognition of the need to adapt to the rapidly changing world the agencies monitor and analyze. The CIA merged its analytic units covering East and West Germany in expectation of German reunification well before German unity was achieved in 1990. Other measures, such as developing greater foreign-language ability or training analysts in more sophisticated techniques, have been the focus of concentrated attention inside the agencies for years. The most effective, and probably most revolutionary, change in the intelligence community's work on terrorism was the creation of the CIA's Counterterrorist Center in 1986 -- a successful experiment that broke bureaucratic crockery, gathering previously separated collectors, analysts, and other specialists together to work side by side.

Reforms pursued from outside have received more public attention but have accomplished far less. After 9/11, the intelligence community underwent a reorganization when Congress acted on the 9/11 Commission's recommendation to make all spy agencies answerable to a single director of national intelligence. But the move has not, as hoped, unified the intelligence community, instead creating yet another agency sitting precariously atop 16 others. Because both the new director's office and the National Counterterrorism Center -- another commission recommendation -- added to, rather than replaced, existing government functions, they have further confused lines of responsibility. This much was made clear when would-be terrorist Umar Farouk Abdulmutallab tried to blow up a Detroit-bound passenger jet on Christmas Day 2009. The incident led to the same sorts of recriminations as those after 9/11, about information not being collated and dots not being connected -- only this time they were aimed at the 9/11 Commission's own creations.

"Intelligence Has Gotten Better Since 9/11."

Yes, but not for the reasons you think. Having a veritable blank check for a decade makes a difference, of course. The big post-9/11 boom in the intelligence budget -- which has doubled since 2001, according to the Senate Intelligence Committee -- has at least marginally improved the odds of discovering the next nugget of information that will enable the United States to roll up a major terrorist plot or take down a bad guy.

But it was the dramatic and obvious change in U.S. priorities following 9/11 that made the most difference. Counterterrorism, more than any other intelligence mission, depends on close collaboration with other governments, which have the critical firsthand knowledge, local police, and investigative powers that the United States usually lacks. Prior to 9/11, those governments' willingness to cooperate was often meager, especially when it meant discomfiting local interests. After 9/11, however, U.S. officials could pound on the desks of their foreign counterparts and say, "This time we really mean it." Some results of this sea change -- successes in freezing or seizing terrorists' financial assets, for example -- have been visible. Many others have been necessarily less so. Future success or failure in tracking threats such as anti-U.S. extremism in South Asia will similarly depend more on the state of U.S.-Pakistan relations than on the performance of the bureaucracy back in Washington.

Cooperation among governments' counterterrorism services has often continued despite political differences between governments themselves. Ultimately, however, such cooperation rests on the goodwill the United States enjoys and the health of its relationships around the world. As 9/11 recedes into history, states' willingness to share information is a depleting asset. We appropriately think of intelligence as an important aid to foreign policy, but we also need to remember how much foreign policy affects intelligence.

"Good Intelligence Can Save Us From Bad Surprises."

We wish. Early last February, barely a week before the Arab Spring ended the three-decade presidency of Egypt's Hosni Mubarak, Sen. Dianne Feinstein, chair of the Senate Intelligence Committee, grilled a CIA official in a Capitol Hill hearing room. "The president, the secretary of state, and the Congress are making policy decisions on Egypt, and those policymakers deserve timely intelligence analysis," Feinstein told Stephanie O'Sullivan, then the CIA's associate deputy director. "I have doubts whether the intelligence community lived up to its obligations in this area."

Feinstein was hardly the only one to criticize U.S. intelligence agencies' inability to predict the speed at which the fire lit by Tunisian fruit vendor Mohamed Bouazizi, who immolated himself on Dec. 17, 2010, would spread throughout the Arab world. But all the bureaucratic overhauls and investigative commissions in the world can't change one incontrovertible fact: Many things we would like our intelligence services to know are too complex to model or predict. What the community should be expected to provide -- and, based on the limited publicly available evidence, apparently did provide -- is a strategic understanding of conditions and attitudes that, given the right spark, could ignite into a full-blown revolution.

The most recent recriminations and inquiries are only the latest in a long line dating back to the 1941 surprise attack on Pearl Harbor. The resources devoted to intelligence have increased substantially over the past seven decades, and intelligence agencies are continually looking for ways to improve how they do their business. But no amount of moving around boxes on a flowchart can eliminate unpleasant surprises, and there will always be new challenges -- especially in an age of endlessly proliferating information.

Intelligence can help manage uncertainty, defining its scope and specifying what is known and what is likely to stay unknown. It can distinguish true uncertainty from simple ignorance by systematically assembling all available information, but it cannot eliminate uncertainty and it cannot prevent all surprises, including some big ones. Leaders must accept this reality; they must expect -- and prepare -- to be surprised.

With due acknowledgment to Donald Rumsfeld, it also means expecting unknown unknowns. Not only will we not know all the right answers -- we will not even be asking all the right questions.

Think Again: Nuclear Power

Japan melted down, but that doesn't mean the end of the atomic age.

"Fukushima Killed the Nuclear Renaissance."

No. At first it looked like a natural disaster of epic proportions: shock waves rippling outward from a 9.0-magnitude earthquake off northeast Japan followed by a 30-foot tsunami, a one-two punch that all but obliterated the coastal city of Sendai and its environs. Then the electricity went off at the Fukushima Daiichi Nuclear Power Station, and a random act of natural destruction became a parable of technological society run amok. Stories of tsunami-leveled villages gave way to harrowing accounts of nuclear engineers trying, and failing, to stop the meltdown of first one, then a second, and finally a third reactor at Fukushima.

We'd seen this movie twice before, of course: first in 1979, when inexperienced operators allowed a reactor to overheat and melt down at Three Mile Island near Harrisburg, Pennsylvania, and most apocalyptically in 1986, when the reactor meltdown at Chernobyl forced the evacuation of hundreds of thousands of residents of what is now Ukraine and Belarus and all but finished off the Soviet economy. And in the wake of the March 11 Fukushima meltdown, commentators predicted the end of an industry that seemed to have finally escaped the shadows of its two earlier disasters. "All nuclear operators," Moody's Investors Service warned in an early April report, "will suffer the consequences that emerge from a post-Fukushima environment."

Indeed, in Japan, where support for nuclear power predictably, and understandably, fell from two-thirds of the public to one-third after the meltdown, plans for 14 reactors slated for construction by 2030 were soon scrapped. Fukushima also tipped the scales in Switzerland's decision to phase out nuclear power by 2034 and contributed to more than 94 percent of Italian voters rejecting Prime Minister Silvio Berlusconi's June referendum on renewing nuclear power.

But these were the exceptions rather than the rule; Japan, in fact, was the only formerly pro-nuclear country to experience a change of heart after the accident. The United States is reviewing its safety procedures for nuclear power, but not changing course on it; overall support for the energy source among Americans has hovered around 50 percent since the early 1990s. In France, which gets 78 percent of its electricity from nuclear power, President Nicolas Sarkozy said shutting down reactors was "out of the question." And as for China, India, and South Korea -- countries with a growing appetite for nuclear power that account for the bulk of active plant construction -- only the first has put any of its nuclear plans on pause, and that's just pending a safety review. India and South Korea have vowed to tighten safety standards, but have otherwise forged ahead with plans for nuclear expansion.

Outside Japan, it was Germany that reacted most emphatically to Fukushima, with hundreds of thousands of protesters taking to the streets and Chancellor Angela Merkel declaring a phaseout of the country's nine existing nuclear plants. But most Germans were already staunchly against nuclear power before 2011 -- a legacy not of Fukushima, but of Chernobyl, whose 1986 meltdown rained down contamination 850 miles away in Bavaria. And though Merkel's political coalition was battered in subsequent elections by Germany's anti-nuclear Greens, the erosion of her popularity had in fact begun months earlier. Nor was Merkel's phaseout decision an entirely new direction; Germany had committed more than a decade ago not to build new plants.

"Nuclear Power Is an Accident Waiting to Happen."

Not necessarily. In half a century of operation, the global nuclear power industry has suffered three catastrophic accidents, all dire enough to make the plant names -- Three Mile Island, Chernobyl, and now Fukushima -- synonymous with industrial disaster. But each was a failure of organizational culture as much as technology, and the lessons learned have helped keep their specific mistakes from being repeated.

Shortly after the meltdown at Three Mile Island, the U.S. nuclear industry began an ambitious overhaul of its safety practices. The commercial sector hired nuclear experts from the U.S. Navy, which has the world's longest and least blemished track record for nuclear safety, to rewrite safety standards and create a peer-review inspection body, the Institute of Nuclear Power Operations. In the decades since, the United States has not had a meltdown at any of its more than 100 reactors.

The Chernobyl accident seven years later was an outlier, inextricable from the pathologies of the late-Soviet-era system in which it took place: an antiquated, kludged-together reactor design without any containment structure to safeguard against worst-case scenarios and hubristic engineers who believed that nothing could go wrong, even as they drove the plant into the danger zone (ironically enough, by dragging out a safety test). Still, the disaster led to a worldwide transformation of safety standards similar to what the United States underwent after Three Mile Island, most notably with the creation of the World Association of Nuclear Operators, which has since inspected almost all 432 commercial reactors in the world.

Most recently, the Fukushima disaster was equal parts freakish bad luck (a massive earthquake followed by a tsunami of a size not seen in the region for centuries) and a management culture that kept problems at the plant from being addressed before the accident. Fukushima's reactors were 32 to 40 years old, and concerns had been raised about their integrity for nearly as long as they had been up and running. Tokyo Electric Power Company's management covered up such concerns and safety violations for years, executives admitted after the accident. Japan also lacked a strong regulatory agency, as well as the independent nuclear expertise that would have been needed to staff one.

As in the previous disasters, lessons have already been learned from Fukushima; South Korea's government has ordered the establishment of a strong regulatory agency to avoid a repeat of its neighbor's catastrophe. It would, of course, be best not to make these enormous mistakes in the first place, but we can take some comfort in the fact that so far, we have avoided repeating any of them.

"Nuclear Power Is Too Expensive."

Yes and no. Nuclear power plants are relatively cheap to operate. With costs averaged over the life of the plant, a safely run reactor can even be a cash cow, generating power for as little as 6 cents per kilowatt-hour, comparable to a coal-fired power plant. The problem is getting plants built in the first place. A large reactor can cost several billion dollars, and construction delays -- as well as slowdowns forced by inevitable legal challenges -- have been known to drive up construction costs by $1 million a day.

This problem is nothing new; it has plagued the industry since the 1970s. Years before the Three Mile Island disaster turned public opinion against the atom, the U.S. nuclear sector was already in trouble on account of legal and bureaucratic changes enacted under Presidents Richard Nixon, Gerald Ford, and Jimmy Carter that made new plants easier to stop with lawsuits -- usually filed by environmental and citizens' groups -- and regulations more unpredictable. That spooked investors, who in turn raised interest rates on borrowing for plant developers. The then-ongoing recession, which depressed energy demand, didn't help; neither did the plummeting price of oil and deregulation of natural gas that followed in the 1980s. Today, the industry argues that plant construction can only happen with the help of tens of billions of dollars in federal loan guarantees, which transfer financial risks onto taxpayers.

But the fact is that nuclear power has never succeeded anywhere without enormous government backing. Until 2004, the French government wholly owned Électricité de France, the utility that operates all French nuclear power plants, and the government still controls more than 80 percent of it today. The Chinese government also largely or wholly owns China's nuclear-power utilities. And nuclear is hardly the only energy source that hasn't stood up in the free market once you factor in the external costs. Consider how much of the Pentagon's $550 billion-a-year budget goes toward securing oil supplies. For a country like Japan or South Korea, with virtually no domestic energy supplies, nuclear power may be worth the upfront costs if it allows for a measure of energy security. As for the rest of us, nuclear power may also come to seem a good deal, once you factor in the risks of climate change.

"More Nuclear Power Means More Nuclear Proliferation."

Maybe. It's true that the nuclear enrichment and reprocessing facilities used to produce fuel for peaceful reactors can just as easily be used to make fissile material for bombs. For now, however, this threat starts and ends with Iran. Most of the 30 countries that use nuclear power don't build their own enrichment or reprocessing facilities, instead buying fuel for their nuclear power plants from external suppliers. The only countries with enrichment facilities that don't have nuclear weapons as well are Argentina, Brazil, Germany, Iran, Japan, and the Netherlands -- and only one of those six keeps nonproliferation hawks up at night.

The rest of the world has been willing by and large to abide by arrangements like the 2009 deal between the United States and the United Arab Emirates (UAE). Under its terms, the UAE passed a national law banning the construction of enrichment and reprocessing facilities in exchange for access to a reliable source of nuclear fuel. Such agreements could maintain the status quo as long as the same standard is enforced across the board. Unfortunately, U.S. President Barack Obama's administration is in the process of eroding this precedent in deals it is pursuing with Jordan, Saudi Arabia, and Vietnam, which could impose less strict terms -- and possibly lead the UAE to rethink its self-imposed moratorium. In April, the U.S. House Foreign Affairs Committee unanimously passed a resolution backing legislation to make terms like those in the UAE deal the norm, but it has yet to become law.

The bad news is that the threat of peaceful nukes begetting the destructive kind is going to get worse before it gets better, thanks to technological advances. Global Laser Enrichment, a North Carolina-based firm, appears to be on the verge of commercializing a process that would use laser technology to enrich uranium. A laser enrichment facility would take up relatively little space -- it could be hidden in a single nondescript warehouse in an otherwise benign industrial park -- and emit few overt signs of activity, making it far more difficult to detect than conventional centrifuge enrichment. Successful commercialization could trigger the spread of the technology despite the company's and the United States' efforts to keep it safe. The "secret" of the nuclear bomb, after all, only lasted a few years.

"Nuclear Power Can Help the World's Poorest Get on the Grid."

Not really. The two great energy challenges of the immediate future will be reducing greenhouse gas emissions worldwide and meeting the moral obligation of helping developing countries gain access to the kind of reliable energy supply that allows for transformative improvements in health, education, and overall quality of life. Expanding nuclear power, which currently provides about 14 percent of the world's electricity, may appear to offer the best means of addressing each challenge without exacerbating the other. Eight African countries, in addition to already-nuclear South Africa, are exploring plant construction. Environmental scientist James Lovelock has asserted that nuclear energy "will give civilization the chance to survive through the difficult time soon to come."

The problem is that most of the world's new electricity demand is in the developing world, and about 85 percent of today's nuclear power is limited to the most economically advanced countries. The reasons for this are easy enough to grasp: Nuclear power's start-up costs are enormous, and large plants require a robust electrical grid -- prerequisites that are by definition out of reach for the estimated 1.6 billion of the Earth's 7 billion people who have little or no reliable access to electricity. Niger may be the world's fifth-largest uranium producer, but the cost of building a reactor to make use of it would take up more than half the country's GDP.

In recent years, many in the nuclear energy industry have touted small reactors as the solution to this problem -- modular units about one-fiftieth to one-third the size of the behemoths used in today's nuclear-powered countries, which can be scaled up gradually at far lower cost. U.S. Energy Secretary Steven Chu, who says he is a "big fan" of the technology, has urged Obama to ask Congress for $39 million to jump-start its development in the United States. But small reactors cost more per kilowatt-hour to keep up and running than their bigger siblings, and they still present most of the challenges that make nuclear power logistically difficult: the need for highly trained personnel to run them safely, procedures and facilities for safely storing nuclear waste, and protection against attacks, theft of radioactive materials, and sabotage.

All of this means that for people without electricity, renewable power sources such as wind and solar will continue to offer a better hope of plugging in quickly and cleanly, as will advances in electricity storage, whether hydrogen fuel cells or some technology yet to be invented.

"Radioactive Waste Is the Achilles' Heel of Nuclear Power."

Wrong. Nuclear waste is a solvable problem, as long as you get the technology and the politics right -- and in that order. Radioactive materials can be kept from contaminating land and water supplies for tens of thousands of years if you bury them in the right geological formation, such as stable granite rock, or for at least a century if you put them in dry storage casks (a course that presumably offers enough time for scientists to figure out a more permanent solution). Germany's Morsleben facility, in a former rock-salt mine, has housed nuclear waste safely for three decades; at the Surry Power Station in Virginia, the cask method has worked without incident for a quarter-century.

When storage plans have gone badly, it's been because politics have trumped technical concerns and have been handled poorly. Perhaps the most notorious example is the Yucca Mountain nuclear waste repository, a planned containment complex in the Nevada desert that would have cost more than $50 billion but was scrapped amid controversy in 2009. The site was chosen in the 1980s not because it was geologically ideal for containing nuclear waste -- it wasn't -- but because Nevada's representatives in Washington were comparatively weak and were outmaneuvered by states that would have provided more and better storage locations, such as Texas. After more than $12 billion spent on the Yucca Mountain project, the Obama administration pulled the plug in a hasty, politically motivated manner that could cost taxpayers billions of dollars more and delay by at least 20 years the development of an alternative, according to an April 2011 report by the Government Accountability Office.

In contrast, consider Sweden's experience with the Forsmark nuclear power plant. When the Swedes set about planning their nuclear waste storage facility three decades ago, they faced significant opposition from a public that was skittish about nuclear power. But government and industry alike took the opposite tack from that of the United States, ensuring that stakeholders ranging from Greenpeace to citizens' groups to the nuclear industry were included in discussions. Many locations were up for scientific investigation and public debate, and the process of choosing one was transparent and based on the best geological information. The storage facility is planned to be fully operational in 2020 and expected to last for 100,000 years. It's the lesson of the meltdowns all over again: The biggest risks posed by nuclear power come not from the technology, but from the human institutions that govern how we use it.

"Windmills Can Replace Reactors."

Not for decades to come. In an ideal world, our energy supply wouldn't come with the asterisks of planet-imperiling climate change on one hand or waste that stays hazardous for thousands of years on the other -- and this, of course, is the promise of renewable energy. It's true that renewable technologies have made great strides in recent years; in fact, they're the fastest-growing energy sector, with solar photovoltaic capacity expanding an average of 40 percent a year since 2000 and wind power growing an average of 27 percent annually since 2004.

But context matters. These are still strictly niche sources, accounting for only about 3 percent of the world's electricity portfolio. Solar energy still requires major government subsidies to reach cheaper prices and greater economies of scale; a $535 million U.S. Energy Department loan guarantee wasn't enough to save solar panel manufacturer Solyndra, which declared bankruptcy in August. Until smart-grid technologies and energy storage systems improve and spread widely, wind and solar energy will be too intermittent to provide anything like the reliable base-load power offered by nuclear and fossil fuels. Hydropower plays a significant role in the energy mix of the United States and several other countries, but environmental concerns about the damage caused by dams have severely limited its growth.

In short, all energy supplies come with drawbacks -- not least nuclear, which since its inception has been haunted by its early boosters' starry-eyed projections of incredibly cheap and abundant energy that have yet to come to pass. As we look at all of the energy sources available to us, we need to understand and face these costs and risks honestly. Doing so is the first step toward realizing that we can no longer demand more and more energy without being willing to pay the price.
