A View from Gerald Faulhaber and David Farber
Are We Really Saving the Open Internet?
Want regulations to preserve the open Internet? Be careful what you wish for.
Demands for network neutrality have reached fever pitch in Washington, D.C., as many voices stress the need for the Federal Communications Commission to save our open Internet. They claim that broadband Internet service providers can block data from selected websites, charge content providers for delivering their content to customers, and establish paid “fast lanes” for some while relegating everyone else to slow lanes (see “The Right Way to Fix the Internet”). Is the Internet suddenly in great danger?
The term “network neutrality” was coined by the legal scholar Tim Wu in 2002, harking back to the seminal paper “End-to-End Arguments in System Design,” which called for network operators to be “dumb pipes” carrying the bits they are given with no changes whatsoever. After decades of pledging “hands off the Internet,” the FCC took up the network neutrality challenge and issued its first order in 2010. Although only two violations had been documented, the FCC went ahead with “prophylactic” regulations. The D.C. Circuit Court struck down that order on jurisdictional grounds, and the FCC is now back for a second round, leading to the current brouhaha.
In its first order, the FCC ruled that ISPs could not block or delay content and could not discriminate among content providers: none would be allowed to pay for priority delivery of their traffic. The regulators’ view was that these rules simply reflected current best practice among ISPs. If that is true, the FCC was at the very least freezing the status quo of a dynamic Internet, ensuring that this constantly evolving network could evolve no more. But was the FCC’s view even correct? David Clark of MIT, the Internet’s chief protocol architect during its formative years, has said that “the network is not neutral and never has been,” dismissing the assumptions of net neutrality supporters as “happy little bunny rabbit dreams.” Early Internet operators routinely discriminated in favor of latency-sensitive traffic, and similar options are available today. The phenomenal success of the Internet suggests that the technologists who have been running it really don’t need help.
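That kind of prioritization is not exotic; it is built into the Internet’s plumbing. As a minimal illustration, and not anything mandated or forbidden by the FCC order itself, an application can mark its packets as latency-sensitive by setting the DiffServ field in the IP header, and routers configured for quality of service may then queue those packets ahead of bulk transfers. The sketch below uses Python’s standard socket API; the “Expedited Forwarding” code point is a real DiffServ value from RFC 3246, while the address and port are placeholders, and whether any given network honors the marking is entirely up to its operators.

```python
import socket

# DiffServ "Expedited Forwarding" (EF) code point, defined in RFC 3246.
# The DSCP occupies the upper six bits of the old IP Type-of-Service
# byte, so EF (46) becomes 46 << 2 = 184 when written via IP_TOS.
DSCP_EF = 46
TOS_EF = DSCP_EF << 2

# A UDP socket for latency-sensitive traffic (e.g., voice or gaming).
sock = socket.socket(socket.AF_INET, socket.SOCK_DGRAM)

# Ask the OS to mark outgoing packets as expedited. Routers along the
# path may honor the marking and forward these packets ahead of bulk
# traffic, or they may ignore or rewrite it entirely.
sock.setsockopt(socket.IPPROTO_IP, socket.IP_TOS, TOS_EF)

# Placeholder destination (a reserved documentation address).
sock.sendto(b"latency-sensitive payload", ("203.0.113.1", 5004))
```

Whether a marking like this is honored across the public Internet has always depended on operator policy, which is precisely the kind of engineering discretion a strict neutrality rule would constrain.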
But what could go wrong if the FCC decides to put a few rules in place? Plenty. The history of telecommunications regulation tells a sorry story: glacial decision-making focused on yesterday’s problems, inhibition of innovation, and, worst of all, what economists call “rent-seeking,” in which businesses use the regulatory process to put their competitors at a disadvantage.
Yes, the open Internet is in danger. But not from lack of neutrality—from the prospect of the FCC regulating it like a 20th-century utility.
Gerald Faulhaber is professor emeritus at the Wharton School of the University of Pennsylvania. David Farber, former chief technologist for the FCC, is an emeritus professor at Penn.