The Psychological Toll of Moderating Social Media Content

A Wired article explores the difficult work of social media content moderators.


Wired's Adrian Chen takes a look at what goes on behind the scenes of social media: legions of content moderators who look at child porn and beheadings so social media users don't have to. According to the report, the number of content moderators is estimated to be over 100,000 — "twice the total head count of Google and nearly 14 times that of Facebook."

Much of the time, content moderation is outsourced to the Philippines, though tech firms do hire content moderators in the U.S. Chen talked to one former YouTube moderator, who told him: “Everybody hits the wall, generally between three and five months… You just think, ‘Holy shit, what am I spending my day doing? This is awful.’”

Here’s an excerpt of the piece:

Even with the best counseling, staring into the heart of human darkness exacts a toll. Workers quit because they feel desensitized by the hours of pornography they watch each day and no longer want to be intimate with their spouses. Others report a supercharged sex drive. “How would you feel watching pornography for eight hours a day, every day?” Denise [a psychologist] says. “How long can you take that?”

Constant exposure to videos like this has turned some [content moderators] intensely paranoid. Every day they see proof of the infinite variety of human depravity. They begin to suspect the worst of people they meet in real life, wondering what secrets their hard drives might hold. Two of Maria’s female coworkers have become so suspicious that they no longer leave their children with babysitters. They sometimes miss work because they can’t find someone they trust to take care of their kids.

Head over to Wired to read the full story.

Publish date: October 24, 2014