Facebook data scientists recently conducted a study testing the impact of biasing the news feed toward negative or positive posts. They hoped to find out whether a user’s mood could be influenced by weighting different content. But the aim of the study is no longer the story; the story is that Facebook performed experiments on its user base without its users’ knowledge.
Since the study became common knowledge among Facebook users, Facebook staff have been apologizing profusely. “It was poorly communicated,” said Facebook COO Sheryl Sandberg. “And for that communication we apologize. We never meant to upset you.” But Facebook has done a lot of apologizing in the past, and that doesn’t mean the company will do the right thing in the future.
The study itself tracked mood changes in more than 600,000 users, some of whom may have been minors at the time, so the issue becomes one of consent. Users under the age of 18 lack the capacity to consent to such a study, and most of the users experimented on had no idea that any changes were taking place.
While some would argue that implicit consent for this kind of research is given when agreeing to the Terms of Service, Facebook may have altered those terms after the fact. Forbes points out that the clause “For internal operations, including troubleshooting, data analysis, testing, research and service improvement” (note the word “research”) was added four months after the study concluded.
Some commentators have tied the research scientist behind the project, Jeffrey T. Hancock, to funding from the Department of Defense, noting that he has conducted research in the past on how civil unrest starts and propagates on social networks.
Possible connections to the DOD aside, the issue remains that Facebook’s data scientists weren’t just observing users’ reactions; they were intentionally manipulating user moods. Occasional observation of behavior patterns happens on most social sites, but many believe Facebook has crossed a line in more ways than one.
Manipulating users who are unaware that they are being studied sounds pretty unethical. Changing the ToS to retroactively protect your company sounds even worse.
Image credit: zeevveez