Much of the chatter about Facebook’s recent initial public offering of stock was determinedly superficial (was the IPO bungled by its underwriters?); some of it was substantive, at least for investors (what was an appropriate valuation for the company?); and a quantum of it was clarifying (see Michael Wolff’s essay “The Facebook Fallacy,” which argues that the company lacks “the big idea”). Left unspoken amid all of it was the general assumption that the social network is monstrous.
Facebook is disquieting for several reasons: because it is the largest network ever to be controlled by a single company; because its corporate values (insofar as they can be guessed from its technologies and mission “to make the world more open and connected”) suggest a commitment to a “radical transparency” that is new to human affairs; and because, as we entrust more of our personal information to its databases (“dumb fucks,” the company’s chief executive, Mark Zuckerberg, once called his early users), we are learning that we are its principal assets. (To learn how Facebook hopes to use its 900 million users to market products and services, see “You Are the Ad,” May/June 2011.)
The planetary scale of Facebook’s network, its ideology, and its store of data have suggested an irresistible idea to the company’s technologists: that they should conduct experiments upon humanity in real time. So far, those experiments have been benign, but the fact that they are being conducted at all has the jarring, science-fictional strangeness of the truly novel.
This issue’s cover story, “What Facebook Knows,” reveals the group conducting the experiments: the Data Science Team, whose 12 researchers form “a kind of Bell Labs for the social-networking age.” The feature’s author, Tom Simonite, writes that the members “apply math, programming skills, and social science to mine our data for insights that they hope will advance Facebook’s business.”
The group is led by Cameron Marlow, who is quick to minimize the impact of his work on the larger world. “Marlow says his team wants to divine the rules of online social life to understand what’s going on inside Facebook, not to develop ways to manipulate it. ‘Our goal is not to change the pattern of communication in society,’ he says.” But Simonite is skeptical: “Some of his team’s work and the attitudes of Facebook’s leaders show that the company is not above using its platform to tweak users’ behavior. Unlike academic social scientists, Facebook’s employees have a short path from an idea to an experiment on hundreds of millions of people.”
Simonite provides some examples of the tweaks that are possible. They are blameless, if spooky. But as a publicly traded company, Facebook is now dedicated to returning value to its investors. Its interests are not those of its users.
The frontispiece to the original, 1651 edition of Thomas Hobbes’ Leviathan was an etching by Abraham Bosse showing a crowned giant emerging from behind a hill: his torso and arms were composed of more than 300 figures, turned inward. It is an apt visual metaphor for the emerging commonwealth of Facebook.
Mark Zuckerberg has sometimes compared Facebook to a nation. (If it were one, it would be the third most populous on the planet.) At the moment, we have no theory of how such a virtual commonwealth should be governed, or what obligations the crown has to its people. But write and tell me what you think at jason.pontin@technologyreview.com.