The sixth most widely used website in the world is not run anything like the others in the top 10. It is not operated by a sophisticated corporation but by a leaderless collection of volunteers who generally work under pseudonyms and habitually bicker with each other. It rarely tries new things in the hope of luring visitors; in fact, it has changed little in a decade. And yet every month 10 billion pages are viewed on the English version of Wikipedia alone. When a major news event takes place, such as the Boston Marathon bombings, complex, widely sourced entries spring up within hours and evolve by the minute. Because there is no other free information source like it, many online services rely on Wikipedia. Look something up on Google or ask Siri a question on your iPhone, and you’ll often get back tidbits of information pulled from the encyclopedia and delivered as straight-up facts.
Yet Wikipedia and its stated ambition to “compile the sum of all human knowledge” are in trouble. The volunteer workforce that built the project’s flagship, the English-language Wikipedia—and must defend it against vandalism, hoaxes, and manipulation—has shrunk by more than a third since 2007 and is still shrinking. The participants who remain seem incapable of fixing the flaws that keep Wikipedia from becoming a high-quality encyclopedia by any standard, including the project’s own. Among the significant problems that aren’t getting resolved is the site’s skewed coverage: its entries on Pokémon and female porn stars are comprehensive, but its pages on female novelists or places in sub-Saharan Africa are sketchy. Authoritative entries remain elusive. Of the 1,000 articles that the project’s own volunteers have tagged as forming the core of a good encyclopedia, most don’t earn even Wikipedia’s own middle-ranking quality scores.
The main source of those problems is not mysterious. The loose collective running the site today, estimated to be 90 percent male, operates a crushing bureaucracy with an often abrasive atmosphere that deters newcomers who might increase participation in Wikipedia and broaden its coverage.
In response, the Wikimedia Foundation, the 187-person nonprofit that pays for the legal and technical infrastructure supporting Wikipedia, is staging a kind of rescue mission. The foundation can’t order the volunteer community to change the way it operates. But by tweaking Wikipedia’s website and software, it hopes to steer the encyclopedia onto a more sustainable path.
The foundation’s campaign will bring the first major changes in years to a site that is a time capsule from the Web’s earlier, clunkier days, far removed from the easy-to-use social and commercial sites that dominate today. “Everything that Wikipedia is was utterly appropriate in 2001 and it’s become increasingly out of date since,” says Sue Gardner, executive director of the foundation, which is housed on two drab floors of a downtown San Francisco building with a faulty elevator. “This is very much our attempt to get caught up.” She and Wikipedia’s founder, Jimmy Wales, say the project needs to attract a new crowd to make progress. “The biggest issue is editor diversity,” says Wales. He hopes to “grow the number of editors in topics that need work.”
Whether that can happen depends on whether enough people still believe in the notion of online collaboration for the greater good—the ideal that propelled Wikipedia in the beginning. But the attempt is crucial; Wikipedia matters to many more people than its editors and the students who didn’t make time to read their assigned books. More of us than ever use the information found there, both directly and via other services. Meanwhile, Wikipedia has either killed off the alternatives or pushed them down the Google search results. In 2009 Microsoft closed Encarta, which was based on content from several storied encyclopedias. Encyclopaedia Britannica, which charges $70 a year for online access to its 120,000 articles, offers just a handful of free entries plastered with banner and pop-up ads.
Newcomers Unwelcome
When Wikipedia launched in 2001, it wasn’t intended to be an information source in its own right. Wales, a financial trader turned Internet entrepreneur, and Larry Sanger, a freshly minted philosophy PhD, started the site to boost Nupedia, a free online encyclopedia founded by Wales that relied on contributions from experts. After a year, Nupedia offered a strange collection of only 13 articles on such topics as Virgil and the Donegal fiddle tradition. Sanger and Wales hoped Wikipedia, where anyone could start or modify an entry, would rapidly generate new articles that experts could then finish up.
When they saw how enthusiastically people embraced the notion of an encyclopedia that anyone could edit, Wales and Sanger quickly made Wikipedia their main project. By the end of its first year it had more than 20,000 articles in 18 languages, and its growth was accelerating fast. In 2003, Wales formed the Wikimedia Foundation to operate the servers and software that run Wikipedia and raise money to support them. But control of the site’s content remained with the community dubbed Wikipedians, who over the next few years compiled an encyclopedia larger than any before. Without any traditional power structure, they developed sophisticated workflows and guidelines for producing and maintaining entries. Their only real nod to hierarchy was electing a small group of “administrators” who could wield special powers such as deleting articles or temporarily banning other editors. (There are now 635 active admins on the English Wikipedia.)
The project seemed laughable or shocking to many. Wikipedia inherited and embraced the cultural expectations that an encyclopedia ought to be authoritative, comprehensive, and underpinned by the rational spirit of the Enlightenment. But it threw out centuries of accepted methods for attaining that. In the established model, advisory boards, editors, and contributors selected from society’s highest intellectual echelons drew up a list of everything worth knowing, then created the necessary entries. Wikipedia eschewed central planning and didn’t solicit conventional expertise. In fact, its rules effectively discouraged experts from contributing, given that their work, like anyone else’s, could be overwritten within minutes. Wikipedia was propelled instead by the notion that articles should pile up quickly, in the hope that one Borgesian day the collection would have covered everything in the world.
Progress was swift. The English-language Wikipedia alone had about 750,000 entries by late 2005, when a boom in media coverage and a spike in participation pushed the project across the line from Internet oddity to part of everyday life. Around that time, Wikipedians achieved their most impressive feat of leaderless collective organization—one, it turns out, that set in motion the decline in participation that troubles their project today. Sometime in 2006, the established editors began to feel control of the site slipping from their grasp. As the number of new contributions—well-meaning and otherwise—kept growing, the task of policing them all for quality began to feel impossible. Because of Wikipedia’s higher public profile and its commitment to letting anyone contribute, even anonymously, many updates were pure vandalism. High-profile incidents such as the posting of a defamatory hoax article about the journalist John Seigenthaler raised serious questions about whether crowdsourcing an encyclopedia, or anything else, could ever work.
As is typical with Wikipedians, a response emerged from a mixture of cordial discussions, tedious arguments, and online wrestling matches—but it was sophisticated. The project’s most active volunteers introduced a raft of new editing tools and bureaucratic procedures intended to combat the bad edits. They created software that allowed fellow editors to quickly survey recent changes and reject them or admonish their authors with a single mouse click. They set loose automated “bots” that could reverse any incorrectly formatted changes or those that were likely to be vandalism and dispatch warning messages to the offending editors.
The tough new measures worked. Vandalism was brought under control, and hoaxes and scandals became less common. Newly stabilized, and still growing in scope and quality, the encyclopedia became embedded in the firmament of the Web. Today the English Wikipedia has 4.4 million articles; there are 23.1 million more in 286 other languages. But those tougher rules and the more suspicious atmosphere that came along with them had an unintended consequence. Newcomers to Wikipedia making their first, tentative edits—and the inevitable mistakes—became less likely to stick around. Being steamrollered by the newly efficient, impersonal editing machine was no fun. The number of active editors on the English-language Wikipedia peaked in 2007 at more than 51,000 and has been declining ever since as the supply of new ones got choked off. This past summer only 31,000 people could be considered active editors.
“I categorize from 2007 until now as the decline phase of Wikipedia,” says Aaron Halfaker, a grad student at the University of Minnesota who has worked for the Wikimedia Foundation as a contractor and this year published the most detailed assessment of the problem. “It looks like Wikipedia is strangling itself for this resource of new editors.”
Halfaker’s study, which he conducted with a Minnesota colleague and researchers from the University of California, Berkeley, and the University of Washington, analyzed Wikipedia’s public activity logs. The results paint a numerical picture of a community dominated by bureaucracy. Since 2007, when the new controls began to bite, the likelihood of a new participant’s edit being immediately reverted has steadily climbed. Over the same period, the proportion of those reversions made by automated tools rather than humans has grown. Unsurprisingly, the data also indicate that well-intentioned newcomers are far less likely to still be editing Wikipedia two months after their first try.
In their paper on those findings, the researchers suggest updating Wikipedia’s motto, “The encyclopedia that anyone can edit.” Their version reads: “The encyclopedia that anyone who understands the norms, socializes him or herself, dodges the impersonal wall of semi-automated rejection and still wants to voluntarily contribute his or her time and energy can edit.”
Because Wikipedia has failed to replenish its supply of editors, its skew toward technical, Western, and male-dominated subject matter has persisted. In 2011, researchers from the University of Minnesota and three other schools showed that articles worked on mostly by female editors—which presumably were more likely to be of interest to women—were significantly shorter than those worked on mostly by male editors or by men and women equally. Another 2011 study, from the University of Oxford, found that 84 percent of entries tagged with a location were about Europe or North America. Antarctica had more entries than any nation in Africa or South America.
The Upgrade
When asked about the decline in the number of editors, Gardner carefully explains that she is addressing it only as a precaution, because there’s no proof it is harming Wikipedia. But after a few minutes discussing the issue, it is clear that she believes Wikipedia needs help. A career journalist who headed the Canadian Broadcasting Corporation’s online operations before taking her current position, Gardner reaches for an analogy from the newsroom to explain why the trend matters. “The Wikipedians remind me of the crusty old desk guy who knows the style guide backwards,” she says. “But where are the eager cub reporters? You don’t get the crusty old desk guy out at three in the morning to cover a fire. That’s for the new guy, who’s got a lot of energy and potential. At Wikipedia we don’t have a sufficient influx of cub reporters.”
In 2012 Gardner formed two teams—now called Growth and Core Features—to try to reverse the decline by making changes to Wikipedia’s website. One idea from the researchers, software engineers, and designers in these groups was the “Thank” button, Wikipedia’s answer to Facebook’s ubiquitous “Like.” Since May, editors have been able to click the Thank button to quickly acknowledge good contributions by others. It’s the first time they have been given a tool designed solely to deliver positive feedback for individual edits, says Steven Walling, product manager on the Growth team. “There have always been one-button-push tools to react to negative edits,” he says. “But there’s never been a way to just be, like, ‘Well, that was pretty good, thanks.’” Walling’s group has focused much of its work on making life easier for new editors. One idea being tested offers newcomers suggestions about what to work on, steering them toward easy tasks such as copyediting articles that need it. The hope is this will give people time to gain confidence before they break a rule and experience the tough side of Wikipedia.
These might seem like small changes, but it is all but impossible for the foundation to get the community to support bigger adjustments. Nothing exemplifies this better than the effort to introduce the text editing approach that most people are familiar with: the one found in everyday word processing programs.
Since Wikipedia began, editing has required using “wikitext,” a markup language painful to the untrained eye. It makes the first sentence of Wikipedia’s entry for the United States look like this:
The '''United States of America''' ('''USA''' or '''U.S.A.'''), commonly referred to as the '''United States''' ('''US''' or '''U.S.''') and '''America''', is a [[federal republic]]<ref>{{cite book |title=The New York Times Guide to Essential Knowledge, Second Edition: A Desk Reference for the Curious Mind |year=2007 |publisher=St. Martin's Press |isbn=978-0312376598 |page=632}}</ref><ref>{{cite book|last=Onuf|first=Peter S.|title=The Origins of the Federal Republic: Jurisdictional Controversies in the United States, 1775–1787|year=1983|publisher=University of Pennsylvania Press |location= Philadelphia |isbn=978-0812211672}}</ref> consisting of 50 [[U.S. state|states]] and a [[Federal district (United States)|federal district]].
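All of that markup boils down to one rendered sentence: the triple apostrophes produce bold type, the double square brackets become links to other entries, and each <ref> tag collapses into a footnote number, so the reader simply sees “The United States of America (USA or U.S.A.), commonly referred to as the United States (US or U.S.) and America, is a federal republic consisting of 50 states and a federal district.”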
After years of planning, the foundation finally unveiled Visual Editor, an interface that hides the wikitext and offers “what you see is what you get” editing. It rolled out in a site-wide trial in July, with the expectation that it would soon become a permanent fixture.
But in the topsy-turvy world of the encyclopedia anyone can edit, it’s not a fringe opinion that making editing easier is a waste of time. The characteristics of a dedicated volunteer editor—Gardner lists “fussy,” “persnickety,” and “intellectually self-confident”—are not the kind that make a person eager to embrace changes like Visual Editor.
After the foundation made Visual Editor the default way to edit entries, Wikipedians rebelled and complained of bugs in the software. In September, a Request for Comment, a survey of the community, concluded that the new interface should be hidden by default. The foundation initially refused, but after a community-elected administrator released a modification to Wikipedia’s code to hide Visual Editor, it gave in and made the new interface opt-in rather than opt-out—meaning that the flagship project to help newcomers is in fact invisible to newcomers, unless they dig through account settings to switch it on.
Many opponents of Visual Editor dispute the idea that it will help Wikipedia. “I don’t think this is the cure the foundation’s looking for,” says Oliver Moran, an Irish software engineer who has made thousands of edits since 2004 and is a top administrator. Like some other vocal Wikipedians, he considers it patronizing to say that wikitext keeps out certain people. “Look at something like Twitter,” he says. “People pick up the hashtags and @ signs straight away.” Much criticism of Visual Editor is also underpinned by a feeling that it proves the foundation is happy to make unilateral changes to a supposedly collaborative project. Moran says Visual Editor was rolled out without enough input from the people providing the voluntary labor Wikipedia is built on.
When asked to identify Wikipedia’s real problem, Moran cites the bureaucratic culture that has formed around the rules and guidelines on contributing, which have become labyrinthine over the years. The page explaining a policy called Neutral Point of View, one of the “five pillars” fundamental to Wikipedia, is almost 5,000 words long. “That is the real barrier: policy creep,” he says. But whatever role that plays in Wikipedia’s travails, any effort to prune its bureaucracy is hard to imagine. It would have to be led by Wikipedians, and the most active volunteers have come to rely on bureaucratic incantations. Citing “WP:NPOV” (the neutral point of view policy) or threatening to take a matter to ARBCOM (the arbitration committee for dispute resolution) in a way that suggests you know a lot about such arcana is easier than having a more substantive discussion.
This is not to say all Wikipedians disagree with the Wikimedia Foundation’s assessment of the site’s problems and its ideas for addressing them. But even grassroots initiatives to help Wikipedia can’t escape the community’s tendency to get bogged down in navel-gazing arguments.
In July 2012, some editors started a page called WikiProject Editor Retention with the idea of creating a place to brainstorm ideas about helping newcomers and fostering a friendlier atmosphere. Today the most vibrant parts of that project’s discussion page have gripes about “bullying done by administrators,” debates over whether “Wikipedia has become a bloody madhouse,” and disputes featuring accusations such as “You registered an account today just to have a go at me?”
Public Good
Even though Wikipedia has far fewer active editors than it did in its heyday, the number and length of its articles continue to grow. This means the volunteers who remain have more to do, and Gardner says she can sense the effects: “Anecdotally, the editing community has a sense of feeling a little bit beleaguered and overworked.” A 2011 survey by the Wikimedia Foundation suggested that being an active editor already required a significant time commitment. Of 5,200 Wikipedians from all language editions of the project, 50 percent contributed more than one hour a day, and 20 percent edited for three or more hours a day. Wikipedia’s anti-abuse systems are probably effective enough to keep vandalism in check, says Halfaker, but the more complex work of improving, expanding, and updating articles may suffer: “When there’s fewer people working, less work gets done.”
When the topic of quality comes up, people affiliated with Wikipedia are quick to point out that it is “a work in progress.” But such caveats aren’t very meaningful when the project’s content is put to use. When Google’s search engine puts Wikipedia content into a fact box to answer a query, or Apple’s Siri uses it to answer a question, the information is presented as authoritative. Google users are invited to report inaccuracies, but only if they spot and then click an easy-to-miss link to “feedback/more info.” Even then, the feedback goes to Google, not to Wikipedia itself.
Jimmy Wales, now just a regular Wikipedian but still influential with editors and the Wikimedia Foundation, dismisses suggestions that the project will get worse. But he believes it can’t get significantly better without an influx of new editors who have different interests and emphases. “When you look at the article on the USB standard, you see it is really amazing and core to our competency as a tech geek community, but look at an entry about somebody famous in sociology, or Elizabethan poets, and it is quite limited and short and could be improved,” he says. “That’s not likely to happen until we diversify the community.” Wales hopes Visual Editor will do that by attracting people who are similar to those already editing the site but have interests beyond the male- and tech-centric—as he puts it, “geeks who are not computer geeks.” But he admits to worrying that making Wikipedia simpler to edit could instead confirm that the project doesn’t appeal to people who are not computer geeks.
Indeed, larger cultural trends will probably make it a challenge to appeal to a broader section of the public. As commercial websites have risen to prominence, online life has moved away from open, self-governed crowdsourcing communities like the one that runs Wikipedia, says Clay Shirky, a professor in the Interactive Telecommunications Program at New York University. Shirky was one of the biggest boosters of an idea, popular during the previous decade, that the Web encouraged strangers to come together and achieve things impossible for a conventional organization. Wikipedia is proof there was some truth to that notion. But today’s Web is dominated by sites such as Facebook and Twitter, where people maintain personal, egocentric feeds. Outside specific settings like massively multiplayer games, relatively few people mingle in shared virtual space. Instead, they use mobile devices that are unsuited to complex creative work and favor neatly self-contained apps over messier, interconnected Web pages. Shirky, who is an advisor to the Wikimedia Foundation, says people steeped in that model will struggle to understand how and why they should contribute to Wikipedia or any project like it. “Facebook is the largest participatory culture today, but their mode of participation is different,” he says. “It’s aggregating rather than collaborating.”
Gardner agrees that today’s Web is hostile to self-organized collective efforts, likening it to a city that has lost its public parks. “Our time is spent on an increasingly small number of increasingly large corporate sites,” she says. “We need more public space online.” In fact, Gardner is leaving the foundation at the end of the year in search of new ways to work on that very problem. She contends that even with all its troubles, Wikipedia is one of the Web’s few public parks that won’t disappear.
She is surely right that Wikipedia isn’t going away. On Gardner’s watch, the funds the Wikimedia Foundation has raised each year to support the site have grown from $4 million to $45 million. Because the encyclopedia has little competition, Web developers will continue to build services that treat its content as fact, and ordinary people will rely on Wikipedia for information.
Yet it may be unable to get much closer to its lofty goal of compiling all human knowledge. Wikipedia’s community built a system and resource unique in the history of civilization. It proved a worthy, perhaps fatal, match for conventional ways of building encyclopedias. But that community also constructed barriers that deter the newcomers needed to finish the job. Perhaps it was too much to expect that a crowd of Internet strangers would truly democratize knowledge. Today’s Wikipedia, even with its middling quality and poor representation of the world’s diversity, could be the best encyclopedia we will get.