Facebook Releases White Paper on the Next Steps in Moderating Online Content

Don’t look for new ideas: Many of the key steps are already in place at the social network


Facebook released a white paper Monday on ideas for regulating online content in the future, but most of the concepts it contains are policies or initiatives that are already in place or in the works at the social network.

Vice president of content policy Monika Bickert introduced the white paper in a Newsroom post Monday, writing, “Over the past decade, the internet has improved economies, reunited families, raised money for charity and helped bring about political change. However, the internet has also made it easier to share harmful content like hate speech and terrorist propaganda. Governments, academics and others are debating how to hold internet platforms accountable, particularly in their efforts to keep people safe and protect fundamental rights like freedom of expression.”

Bickert said the white paper poses four questions that are key in the debate over how to regulate online content.

How can content regulation best achieve the goal of reducing harmful speech while preserving free expression? Facebook suggested systems including user-friendly channels for reporting content and external oversight of policies or enforcement decisions.

Any Facebook user can already report any piece of content and explain why they are reporting it.

And the social network is well into the process of establishing a global independent advisory board for content.

Bickert also suggested “periodic public reporting of enforcement data,” enabling governments and individuals to get a clear picture of social platforms’ efforts.

Facebook already publishes twice-yearly Community Standards Enforcement Reports.

How can regulations enhance the accountability of internet platforms? The social network suggested requirements for companies such as publishing their content standards, consulting with stakeholders prior to significant changes and creating appeal channels for decisions on removing or not removing content.

None of these would place any additional burden on Facebook, which already publishes its community standards and indicates when they are updated, already consults with a Safety Advisory Board and is putting the new global independent advisory board into place to handle appeals.

Should regulation require internet companies to meet certain performance targets? Bickert suggested incentivizing companies to meet specific targets such as keeping the prevalence of violating content below an agreed-upon threshold.

This, too, is already in place at the social network: Facebook said in its most recent Community Standards Enforcement Report that its rate of proactively removing hate speech was up to 80%, while its rate of detecting and removing content associated with al-Qaida, ISIS (Islamic State) and their affiliates remains above 99%, and for all terrorist organizations, that figure is 98.5%.
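For context, "prevalence" in these reports means the share of content views that contain violating material, not the share of posts. The following sketch is purely illustrative, with assumed sampling numbers and an assumed threshold value rather than anything drawn from the white paper; it simply shows how a sampled prevalence estimate might be compared against an agreed-upon ceiling:

```python
# A minimal, hypothetical sketch -- not Facebook's methodology. It assumes a
# platform samples content views and labels each sampled view as violating
# (1) or benign (0), then compares estimated prevalence to a target ceiling.
import random

def estimate_prevalence(labeled_views):
    """Return the share of sampled content views that were violating."""
    if not labeled_views:
        return 0.0
    return sum(labeled_views) / len(labeled_views)

# Hypothetical sample: each view violates with a 0.12% chance.
random.seed(42)
sample = [1 if random.random() < 0.0012 else 0 for _ in range(100_000)]

threshold = 0.0015  # hypothetical agreed-upon ceiling (0.15% of views)
prevalence = estimate_prevalence(sample)

print(f"Estimated prevalence: {prevalence:.4%}")
print("Meets target" if prevalence <= threshold else "Misses target")
```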

Should regulation define which “harmful content” should be prohibited on the internet? Bickert wrote in the white paper that governments should create rules to address the complexity stemming from the differences between laws restricting speech and internet content moderation, and that those rules should recognize user preferences, the variations among internet services and the ability to enforce at scale, allowing for flexibility across language, trends and context.

Bickert discussed four primary challenges in the white paper, introducing them as follows: “Private internet platforms are facing increasing questions about how accountable and responsible they are for the decisions they make. They hear from users, who want the companies to reduce abuse but not infringe upon freedom of expression. They hear from governments, who want companies to remove not only illegal content, but also legal content that may contribute to harm, but make sure that they are not biased in their adoption or application of rules. Faced with concerns about bullying or child sexual exploitation or terrorist recruitment, for instance, governments may find themselves doubting the effectiveness of a company’s efforts to combat such abuse or, worse, being unable to satisfactorily determine what those efforts are.”

Legal environments and speech norms vary. Bickert pointed out the cross-border nature of social platforms and the fact that most maintain one set of global policies rather than country-specific policies that would interfere with the experiences they provide.

Technology and speech are dynamic. Facebook highlighted the differences in platforms in terms of the types of content shared through them, as well as the audience that content is shared with (one-on-one or town square/broadcasting, ephemeral or permanent). Bickert wrote, “Just as people may say things in the privacy of a family dinner conversation that they would not say at a town hall meeting, online communities cultivate their own norms that are enforced both formally and informally. All are constantly changing to compete and succeed.”

Enforcement will always be imperfect. The company said this is true due to the dynamic nature and scale of online speech, as well as the limits of enforcement technology and the different expectations people have, adding that even in a hypothetical world where enforcement efforts could perfectly track language trends and identify likely policy violations, companies would still struggle to apply policies because they lack necessary context.

Companies are intermediaries, not speakers. Facebook said liability laws covering publishers for illegal speech are unsuitable for the internet and social platforms, and it would be “impractical and harmful” to require each post to be approved before it goes live. Bickert wrote, “The ability of individuals to communicate without journalistic intervention has been central to the internet’s growth and positive impact. Imposing traditional publishing liability would seriously curtail that impact by limiting the ability of individuals to speak. Companies seeking assurance that their users’ speech is lawful would need to review and approve each post before allowing it to appear on the site. Companies that could afford to offer services under such a model would be incentivized to restrict service to a limited set of users and err on the side of removing any speech close to legal lines.”

Finally, Bickert and Facebook shared potential guidelines for future regulation.

They wrote that ensuring accountability in content moderation systems and procedures will be the best way to create the incentives for companies to responsibly balance values like safety, privacy and freedom of expression.

Any national regulatory approach to addressing harmful content should respect the internet’s global scale and the value of cross-border communications, and regulators should consider the impacts of their decisions on freedom of expression.

Facebook also said regulators should develop an understanding of the capabilities and limitations of technology in content moderation and give internet companies the flexibility to innovate, as an approach that works for one platform may be less effective or counterproductive on another.

And the social network said regulators must take into account the severity and prevalence of harmful content, as well as its status according to applicable laws and the efforts that are already under way by platforms to address it.

Bickert concluded, “We hope that this paper contributes to the ongoing conversations among policymakers and other stakeholders. These are challenging issues, and sustainable solutions will require collaboration. As we learn more from interacting with experts and refining our approach to content moderation, our thinking about the best ways to regulate will continue to evolve. We welcome feedback on the issues discussed in this paper.”


David Cohen is editor of Adweek's Social Pro Daily. david.cohen@adweek.com
Publish date: February 18, 2020