The plan: Facebook has announced it will hire part-time contractors to fast-track posts for fact-checking, as part of a pilot program in the US over the coming months. The idea is that this will let Facebook’s existing fact-checkers find and debunk false claims more quickly.
How it’s meant to work: Facebook’s machine-learning system spots potential misinformation using various signals. For example, comments on the post might express disbelief, or the page sharing it might have a history of sharing incorrect information. These posts will be flagged for this new group of contractors (Facebook calls them “community reviewers”), who will do some research to find other sources to either support or debunk the claim. For example, if a post says a celebrity has died, they can check whether any reliable news sources have reported the story. Their conclusions will then be shared with Facebook’s fact-checkers, in an effort to signal which stories need to be reviewed and rated most urgently.
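To make that triage flow concrete, here is a minimal sketch of how a signal-based flagging queue like the one described above might look. Everything in it is illustrative: the signal names, weights, and thresholds are assumptions for the sake of the example, not details Facebook has published.

```python
from dataclasses import dataclass

# Hypothetical disbelief markers; the real system's signals are not public.
DISBELIEF_MARKERS = ("fake", "hoax", "not true", "debunked")

@dataclass
class Post:
    text: str
    comments: list[str]
    page_misinfo_strikes: int  # prior false ratings against the sharing page

def misinfo_score(post: Post) -> float:
    """Combine weak signals into one score (a crude stand-in for the ML model)."""
    if post.comments:
        flagged = sum(
            any(marker in c.lower() for marker in DISBELIEF_MARKERS)
            for c in post.comments
        )
        disbelief = flagged / len(post.comments)
    else:
        disbelief = 0.0
    # A page's bad track record saturates quickly: a few strikes is enough.
    history = min(post.page_misinfo_strikes / 3, 1.0)
    return 0.6 * disbelief + 0.4 * history

def triage_queue(posts: list[Post], threshold: float = 0.5) -> list[Post]:
    """Return posts to hand to community reviewers, most suspect first."""
    flagged = [p for p in posts if misinfo_score(p) >= threshold]
    return sorted(flagged, key=misinfo_score, reverse=True)

# Example: a post whose comments express disbelief, shared by a page with strikes.
queue = triage_queue([
    Post("Famous actor dies at 54", ["this is fake", "RIP", "total hoax"], 2),
    Post("Local bakery opens new branch", ["congrats!"], 0),
])
print([p.text for p in queue])  # only the suspect post is queued
```

In the real pipeline, the queued posts would go to the community reviewers, whose verdicts then tell Facebook's fact-checkers which stories to review and rate first.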
The problems: It makes sense to try to expedite the fact-checking process, but Facebook will be relying on outsourced, low-paid, part-time amateurs rather than hiring expert reviewers. Somewhat bafflingly, it has promised to pick a pool of people who are representative of Facebook users in the US, rather than of US residents in general, as if objective truth were a question of demographic balance.
The bigger picture: Facebook’s existing fact-checking program is deeply flawed. It’s fully outsourced to third parties (newspapers, think tanks, and other organizations accredited to do this sort of work) and riddled with contradictions. It exists in only some of the countries where Facebook operates, and in many of those it relies on just one organization, meaning the company has no fact-checking program at all if that group pulls out (as happened in the Netherlands last month).
No bias here: Facebook thinks that letting third parties fact-check on its behalf allows it to wash its hands of any claims of bias, but the company still picks and chooses the policies those parties follow. For example, Facebook exempts political ads from fact-checking, which is itself an editorial judgment. Facebook won’t admit that, though, because doing so would open it to claims that it is a publisher, and thus liable for content posted on the platform. The new policy is unlikely to satisfy those who say the company is doing too little to stop the spread of misinformation.