Facebook is turning to its users to help out its third-party fact-checking partners.
Product manager Henry Silverman said in a Newsroom post that the new pilot program will help fact-checkers identify false content more efficiently by having a representative group of users determine whether they find a claim to be corroborated or contradicted.
The social network teamed up with global public opinion and data company YouGov to ensure that its pool of community reviewers represents the community of Facebook users as a whole.
The pilot program is running in the U.S., and Silverman said YouGov determined that the selection process produced a pool of community reviewers that successfully represented Facebook’s U.S. community, factoring in diverse viewpoints on topics such as political ideology.
YouGov also found that community reviewers’ judgments about whether claims were corroborated were consistent with what most people using Facebook would conclude.
Community reviewers are not Facebook employees: They are hired as contractors via one of the company’s partners.
They also do not make final decisions on content: Their findings are shared with the third-party fact-checking partners as additional context for their official reviews.
Silverman explained, “For example, if there is a post claiming that a celebrity has died and community reviewers don’t find any other sources reporting that news—or see a report that the same celebrity is performing later that day—they can flag that the claim isn’t corroborated. Fact-checkers will then see this information as they review and rate the post.”
He also provided an overview of how the process works.
Facebook’s machine learning model identifies potential misinformation via a variety of signals, including comments expressing disbelief or content being shared by pages that have spread misinformation in the past.
Flagged content is sent to a diverse group of community reviewers, who are asked to identify the main claim in the post and research whether other sources support or refute it.
The social network’s third-party fact-checkers then see community reviewers’ collective assessment to help determine which content to review and rate.
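For readers who think in code, the three-step flow described above can be sketched as a small pipeline. This is purely illustrative: all names, signals and thresholds below are hypothetical stand-ins, not Facebook’s actual implementation.

```python
from dataclasses import dataclass

@dataclass
class Post:
    text: str
    disbelief_comments: int = 0          # e.g. comments expressing disbelief
    shared_by_repeat_offender: bool = False  # shared by a page with a history of misinformation

def flag_potential_misinformation(post: Post) -> bool:
    """Step 1: stand-in for the machine learning model's signals."""
    return post.disbelief_comments >= 5 or post.shared_by_repeat_offender

def community_assessment(votes: list) -> str:
    """Step 2: aggregate reviewer findings into a collective assessment."""
    corroborated = votes.count("corroborated")
    return "corroborated" if corroborated > len(votes) / 2 else "not corroborated"

def route_to_fact_checkers(post: Post, votes: list) -> dict:
    """Step 3: attach the collective assessment as context for the official review."""
    return {"post": post.text, "community_assessment": community_assessment(votes)}

# Example: the celebrity-death claim from Silverman's example.
post = Post("Celebrity X has died", disbelief_comments=7)
if flag_potential_misinformation(post):
    context = route_to_fact_checkers(post, ["not corroborated"] * 4 + ["corroborated"])
    print(context["community_assessment"])  # not corroborated
```

The key design point the sketch captures is that community reviewers never decide an outcome: their aggregated finding only travels alongside the post as extra context for the fact-checkers.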
Silverman said the pilot will continue in the U.S. over the coming months, with Facebook evaluating its progress through its own research, input from academics and feedback from its third-party fact-checking partners.
He wrote, “We believe that by combining the expertise of third-party fact-checkers with a group of community-based reviewers, we can evaluate misinformation faster and make even more progress reducing its prevalence on Facebook.”