Facebook Performed a Psychology Experiment on Thousands of Users Without Telling Them

A Facebook error message is seen in this illustration photo of a computer screen in Singapore June 19, 2014. Thomas White/Reuters

If you're on Facebook, there's a roughly 0.04 percent chance the social media behemoth used you for a psychology experiment in early 2012, though you'd have had no way of knowing at the time and would only be finding out about it this week.

That's what happened when researchers used nearly 700,000 Facebook users as guinea pigs for a study on "emotional contagion." In brief, the study separated its users into two groups. One was subjected to a newsfeed of primarily positive posts; the other was flooded with emotionally negative items.

The results "suggest that the emotions expressed by friends, via online social networks, influence our own moods, constituting, to our knowledge, the first experimental evidence for massive-scale emotional contagion via social networks," the researchers write in a paper now published in the Proceedings of the National Academy of Science. In other words, the study confirmed what heavy Facebook users have long known to be true: what your friends post on Facebook can have a tangible impact on your own emotional state.

Alas, Facebook hasn't been able to manipulate the reactions of users who have learned about the study's existence: people are pissed. Over the weekend, observers blasted the experiment as scandalous and disturbing, and according to The Guardian, one British MP has gone so far as to call for a parliamentary investigation into how it was carried out.

Adam Kramer, a data scientist at Facebook and a co-author of the study, took to his own Facebook page to offer a defense.

"The goal of all of our research at Facebook is to learn how to provide a better service," Kramer writes. "Having written and designed this experiment myself, I can tell you that our goal was never to upset anyone." Still, he goes on to concede that those aims weren't made particularly clear and that the research conclusions "may not have justified all of this anxiety." That's probably an understatement.

Most disturbing, though, is the fact that this sort of data manipulation is almost certainly within the bounds of Facebook's Terms of Use. As The Verge points out, every Facebook user consents to their information being used for "internal operations, including troubleshooting, data analysis, testing, research and service improvement."

But Facebook doesn't need to round up another half million study participants to confirm that virtually no one actually reads the Terms of Use.

Update, 10:50 am: Turns out Facebook only added that research clause to its data use policy in May 2012, Forbes points out. Moreover, the study doesn't seem to have been filtered by age, so minors (users under 18) were used as guinea pigs in the study as well.