Why the tech industry's unbridled optimism won't save the world.
Next year marks the 50th birthday of the first commercially produced silicon microchip, a device that has arguably reordered society more profoundly than any invention since the internal combustion engine. It will also be the 46th birthday of Moore's Law, the famous prediction made by Gordon Moore, the billionaire co-founder of Intel, who more or less told us this would happen.
In an April 1965 essay in the trade journal Electronics, Moore predicted that the number of transistors that could fit onto a single chip would double every year -- later revised to every 18 to 24 months -- causing the power of chips to increase, and their price to fall, exponentially over time. It was an audacious claim, and one that proved accurate enough for Moore's Law to make the leap from obscure technical rule of thumb to omnipresent nugget of pop business philosophy, the kind of thing that MBAs love to invoke alongside Sun Tzu's battlefield aphorisms and Malcolm Gladwell's tipping point.
Moore's Law was a perfect fit with the broader zeitgeist of the early '00s. When the United States invaded Iraq with "shock and awe," ABC News technology columnist Michael S. Malone crowed that Defense Secretary Donald Rumsfeld's military strategy, favoring high-tech weapons over heavy troop deployments, demonstrated how the "U.S. military has now jumped aboard Moore's Law." A 2004 editorial in the British Journal of Anaesthesia argued that a Moore's Law-like effect was at work in cardiovascular science and medicine, "both of which are advancing at a similar breakneck pace." And environmentally minded politicians like former U.S. Vice President Al Gore have made a habit of invoking it in the service of the most pressing technological struggle of the early 21st century: the effort to wean humanity off its reliance on fossil fuels.
But is the silicon chip really such a great model for solving the world's problems? Moore's prediction, formulated when its author was a 36-year-old researcher working at a Palo Alto semiconductor laboratory, was never the immutable natural principle its most ardent apostles believed it to be. "There's nothing law-like about Moore's Law," says W. Patrick McCray, a historian of technology at the University of California, Santa Barbara. What it was, first and foremost, was an economic argument.
When Moore wrote his essay, the silicon microchip was one of several possible technologies jostling in the primordial soup of midcentury computer science. Moore was arguing that an investment in silicon-based chips then under development at labs like his own would pay off faster, and more dramatically, than an investment in the alternatives. The particulars may have been dry, but Moore's Law struck a peculiarly American chord, with its promise of a rapturously unlimited future and its swaggeringly specific confidence. It had a whiff of Manifest Destiny, and a bit of Babe Ruth pointing to the bleachers at Wrigley Field.
It became a spectacularly successful self-fulfilling prophecy. By 2000, the integrated circuit had made semiconductor manufacturing into a $200 billion-plus industry and transformed the broader economy dramatically enough for U.S. Federal Reserve Chairman Alan Greenspan to remark that "information innovation lies at the root of productivity and economic growth." Moore's prediction of endless computing power and ever-shrinking prices, meanwhile, largely came to pass. A microchip in the early 1970s contained a couple thousand transistors, each of which cost just shy of a dollar. Intel's Xeon processor, introduced in 2007, is loaded with 820 million transistors, each one less than 1/800th the width of the thinnest human hair and costing less than 1/100,000th of a cent.
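The arithmetic behind that trajectory is easy to sanity-check. Here is a minimal sketch; the starting count of roughly 2,300 transistors (the figure for an early-1970s chip) and the 24-month doubling period -- the upper end of Moore's revised estimate -- are assumptions, not figures from Moore's essay itself:

```python
# Back-of-the-envelope check of Moore's Law against the figures above.
# Assumed inputs: ~2,300 transistors on an early-1970s chip (1971) and
# a 24-month doubling period.

def projected_transistors(start_count, start_year, end_year, doubling_years=2.0):
    """Project a transistor count forward under exponential doubling."""
    doublings = (end_year - start_year) / doubling_years
    return start_count * 2 ** doublings

projection = projected_transistors(2_300, 1971, 2007)
print(f"{projection:,.0f}")  # on the order of hundreds of millions
```

Under those assumptions the projection lands around 600 million transistors for 2007 -- the same order of magnitude as the Xeon's 820 million, with a slightly faster doubling period closing the gap.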
The exponential growth had a profound effect on how the Silicon Valley entrepreneurs of the 1980s and 1990s viewed the world. In the 2000s, many used their newly minted fortunes to start enterprises with the aim of tackling the world's big technological problems, the sort of things they had dreamed about in their geeky and idealistic youth. The multibillionaire founders of Amazon and Microsoft launched commercial spaceflight ventures. PayPal's Elon Musk bankrolled the electric car company Tesla Motors. Former dot-com executives founded Complete Genomics, a firm that hopes to transform genetic sequencing into an affordable consumer service, like getting an X-ray. And the venture capitalists who funded the 1990s tech boom have embraced the clean-energy business en masse, investing $12 billion from 2001 to 2009 in companies hoping to devise and market alternatives to oil and coal.
"We can already see a Moore's Law dynamic operating in the energy sector," John Denniston, a partner at Kleiner Perkins Caufield & Byers -- the venture-capital firm that bankrolled Google and is now a major clean-energy investor -- told a U.S. Senate panel in 2007, "giving us confidence [that] the rate of green-tech performance improvement and cost reduction will offer new energy solutions we can't even imagine right now." Vinod Khosla, the billionaire founder of Sun Microsystems and now among the most vocal clean-energy investors, wrote in Wired about the growth he expected in the capabilities of plant-derived fuels: "Like Moore's Law, this trajectory tracks a steady increase in performance, affordability, and, importantly, yield per acre of farmland."
Such analogies are the information economy's most seductive legacy, and its most perilous. Few other industries are so elegantly reducible to the movement of electrons; the cost and speed of sequencing the human genome, for instance, might be proceeding at a Moore's Law-like pace, but that doesn't mean that human life expectancy is, too. Photovoltaic solar cells are getting better and cheaper, but not exponentially so. And the high-powered batteries used in electric cars are rapidly approaching physical barriers to advancement.
In reality, Moore's Law had little to do with the kind of breakthroughs we typically associate with innovation: The prophecy held because the industry committed to a massive strategic investment in manufacturing, and the technology itself allowed for predictable, incremental processing gains. Even Microsoft co-founder Bill Gates has come to acknowledge the limits of the analogy. "We've all been spoiled and deeply confused by the IT model," he said this summer when asked about the relevance of Moore's Law to energy technology.
Perhaps the greatest irony of Moore's Law is that it has enabled the rise of Asian economies whose rapid ascent has reminded us that our most confounding problems are not technological, but human. By 2030, the massive expansion of China's and India's appetites, along with more prosaic growth elsewhere in the world, is expected to create a need for three times as many cars and twice as much food, plus shelter for more than one and a half times today's urban population -- and there's no Moore's Law for the oil rig, the cornfield, or the steel mill.
None of this means that progress is impossible. What it means is that progress will be difficult, and it won't look like it did in the 1990s. "Moore's Law, to a degree, has a psychological resonance," says computer scientist and futurist Jaron Lanier. "There's such a rush to it; it's a pleasure -- we want things to go on forever. But they don't."