Intelligent Machines

The Digital Utility

Nicholas Carr’s new book examines the implications of cloud computing.

In the end, as the story of the emperor’s new clothes reminds us, somebody has to break the spell. In May 2003, Nicholas Carr cast himself in the naysayer’s role by publishing an article titled “IT Doesn’t Matter” in the Harvard Business Review. In 2004 he followed that with a book, Does IT Matter? Information Technology and the Corrosion of Competitive Advantage. Thereby, he aroused the ire of the good and the great in Silicon Valley and Redmond, WA.

For that, he won a little fame. Now he has a new book, The Big Switch: Rewiring the World, from Edison to Google, which will almost certainly influence a large audience. Carr persuasively argues that we’re moving from the era of the personal computer to an age of utility computing–by which he means the expansion of grid computing, the distribution of computing and storage over the Internet, until it accounts for the bulk of what the human race does digitally. And he nicely marshals his historical analogies, detailing how electricity delivered over a grid supplanted the various power sources used during most of the 19th century. Many readers may find his conclusions unconvincingly dark. I think he could have borne in mind the old joke: predicting is hard, especially about the future. That said, I also suspect he’s right to suggest that in a decade or so, many things we now believe permanent will have disappeared.

Given that Carr’s conclusions are controversial, it’s helpful to trace his thesis in full. In “IT Doesn’t Matter,” he argued that as industries mature, the products or services they supply become commodities that compete on price alone. The information technology industry, he continued, had arrived at that phase: for most companies that did not themselves develop and sell IT, information technology offered no competitive advantage and was just another cost of doing business. It wasn’t hard to find evidence for Carr’s contention. A business school truism since Clayton Christensen’s 1997 book The Innovator’s Dilemma: When New Technologies Cause Great Firms to Fail is that you can tell a sector has been commodified when competition has created a “performance oversupply,” where almost any product differentiation is unwanted. And indeed, by the end of the 20th century, the vast majority of PCs had far more processing and storage capacity than their users needed for the most common tasks: e-mail, Web browsing, word processing. In fact, Carr pointed out, 70 percent of a typical Windows network’s storage capacity went unused.

By 2000, Carr claimed, close to 50 percent of American companies’ annual capital expenditures went to IT: every year, U.S. businesses acquired more than 100 million new PCs. The biggest IT-associated business risk that companies faced, he concluded, was overspending. It was time for businesses to “explore cheaper solutions, including open-source applications and bare-bones network PCs,” he argued. “If a company needs evidence of the kind of money that might be saved, it need only look at Microsoft’s profit margin.”

THE BIG SWITCH: REWIRING THE WORLD, FROM EDISON TO GOOGLE
Nicholas Carr
W. W. Norton, 2008
$25.95

Naturally, the industry’s chieftains poured scorn on this thesis. Microsoft’s CEO, Steve Ballmer, blustered that there was still plenty of life in l’ancien régime: “Our fundamental response is: hogwash. We look out there like kids in a candy store saying what a great world we live in.” Even Ethernet coinventor Bob Metcalfe, who might have maintained an Olympian detachment, weighed in to complain in this magazine that “Carr’s article just won’t stay debunked” (see “Why IT Matters,” June 2004). As evidence of Carr’s wrongheadedness, Metcalfe cited the expansion of Ethernet into ever newer, wider, and faster networking realms, thus arguably missing Carr’s point. [Metcalfe is a member of Technology Review’s board of directors.]

Carr was saying that, like previous technologies such as the telephone and electricity, IT no longer conferred any competitive advantage because it was now part of the general business infrastructure. Next, IT would become a simple utility, provided to users over the networks that Metcalfe had helped make possible. Today, of course, Carr’s thesis is the accepted wisdom: almost everybody agrees that IT services will eventually be delivered on a subscription basis, as a utility. As The Big Switch observes, this is why Google has been constructing gigantic server farms in rural sites in Oregon, the Carolinas, Oklahoma, Georgia, and Iowa. Elsewhere, similar data centers have been or are being built by Microsoft, IBM, Hewlett-Packard, Yahoo, Ask.com, and Salesforce.com.

The retail giant Amazon has offered the most comprehensive utility-computing services thus far. It had already introduced its EC2 (Elastic Compute Cloud, where customers run software on Amazon’s systems) and S3 (Simple Storage Service, where customers store data for a few cents per gigabyte) when it recently launched SimpleDB, a web service that provides metered database capabilities.
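To make the metered model concrete, here is a minimal sketch of storing and retrieving a file with S3, using the Python boto3 SDK (a client library that postdates the services described here); the bucket name, object key, and contents are hypothetical. The point is only that the customer pays per gigabyte stored and per request made, rather than owning any hardware.

```python
# A minimal sketch of "storage as a utility": no servers of your own,
# just metered requests against Amazon's infrastructure.
# Assumes AWS credentials are configured; bucket and key are hypothetical.
import boto3

s3 = boto3.client("s3")

# Store a small object; billing accrues per byte kept and per request made.
s3.put_object(
    Bucket="example-startup-data",        # hypothetical bucket name
    Key="reports/q1-summary.txt",
    Body=b"Quarterly metrics go here.",
)

# Retrieve the same object later, from any machine with the credentials.
response = s3.get_object(Bucket="example-startup-data", Key="reports/q1-summary.txt")
print(response["Body"].read().decode())
```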

I asked Werner Vogels, Amazon’s chief technical officer, whether we were truly in the era of the serverless Internet company that could be run through a browser. Vogels said that he took that as settled, given how many startups were happier paying cents per gigabyte to Amazon than investing in hardware costing hundreds of thousands of dollars.

In The Big Switch, Carr notes the prospective benefits of a world of utility computing, but he also plays the naysayer again. Nearly half the book describes the possible dystopian aspects of such a world. What are these, in his view?

First, the destruction of traditional businesses by the extremely lean companies that utility computing makes possible. Second, the ease with which governments and corporations will be able to track and exploit our digital behavior. Third, the emergence of a “YouTube economy” in which many will provide free information to the “cloud,” and a few aggregators will harvest most of the profits. Fourth, the deterioration of human culture as people come to rely on the Internet to know and do everything, while they know and do little themselves. Fifth, the continuing fracturing of civil society as people choose to read or hear only the news that confirms their prejudices.

Carr’s predictions vary in plausibility. Overall, though, they can be separated into two categories: on the one hand, futuristic scenarios that may or may not tip over into reality; on the other, scenarios that amount to what the great political economist Peter Drucker called “the future that has already happened.” Drucker, who died in 2005, used to maintain that while trying to predict the future was pointless, it was possible to identify ongoing trends that would have significant future effects.

Drucker described his modus operandi thus: “I look out the window at things that are going on, things that have already happened that people pay no attention to.” That methodology led Drucker to the conclusion that the Knowledge Economy was succeeding the Industrial, with the obvious corollary being the rise of the knowledge worker, a term Drucker was the first to use. When Nicholas Carr wrote “IT Doesn’t Matter,” he was doing Drucker’s kind of analysis, looking out the window and identifying a future that had already happened.

In his latest book, Carr has extrapolated similarly from ongoing trends. At many small to midsize companies, not a few executives will be thinking, “We could reduce the IT department to one or two people.” IT is a cost center, after all, not so dissimilar from janitorial and cafeteria services, both of which have long been outsourced at most enterprises. Security concerns won’t necessarily prevent companies from wholesale outsourcing of data services: businesses have long outsourced payroll and customer data to trusted providers. Much will depend on the specific company, of course, but it’s unlikely that smaller enterprises will resist the economic logic of utility computing. Bigger corporations will simply take longer to make the shift.

Though some IT managers will retrain and find work in the new data centers, such places will offer fewer jobs than they displace: for instance, informed accounts place the number of employees at Google’s flagship data center in Oregon at only around 200. Similarly, entrepreneurially inclined IT managers may join startups developing innovative technologies. Again, though, the opportunities will be limited: most aspiring entrepreneurs fail. It’s hard to avoid the conclusion that many IT managers–the emblematic category of knowledge worker, long assumed to be safe from the technologically fueled economic disruptions that have eliminated so many jobs–will probably lose their livelihoods.

Mark Williams, a contributing editor for Technology Review, lives in Oakland, CA.