The blogosphere is abuzz about Facebook’s antisocial experiment, in which almost 700,000 people had their feeds secretly altered to gauge the effect on their emotions.
Although this particular “experiment” lasted only a week back in 2012, Facebook can’t claim it was a small sample.
In a highfalutin-sounding study published in PNAS titled “Experimental evidence of massive-scale emotional contagion through social networks,” lead author and Facebook data scientist Adam Kramer called it a “massive (N = 689,003) experiment on Facebook.”
This study recently came to attention via a blog post on animalnewyork.com that complained “Facebook is using us as lab rats.” The story then caught fire after it was picked up by major publications such as The Wall Street Journal.
More specifically, the study says the following:
“Which content is shown or omitted in the News Feed is determined via a ranking algorithm that Facebook continually develops and tests in the interest of showing viewers the content they will find most relevant and engaging. One such test is reported in this study: A test of whether posts with emotional content are more engaging.”
In other words, Facebook routinely and systematically manipulates the News Feed to suit its own purposes (read: “more relevant and engaging”), so this was really no big deal!
According to The Atlantic, the study even creeped out its editor, Susan Fiske, a psychology professor at Princeton University, who wondered in an interview:
“Who knows what other research they’re doing.”
The study further explains what the researchers did in this instance:
“The experiment manipulated the extent to which people (N = 689,003) were exposed to emotional expressions in their News Feed. This tested whether exposure to emotions led people to change their own posting behaviors, in particular whether exposure to emotional content led people to post content that was consistent with the exposure—thereby testing whether exposure to verbal affective expressions leads to similar verbal expressions, a form of emotional contagion.”
With all the outrage, Adam Kramer published a public response on Facebook:
“I can understand why some people have concerns about it, and my coauthors and I are very sorry for the way the paper described the research and any anxiety it caused. In hindsight, the research benefits of the paper may not have justified all of this anxiety.”
In other words, we would have been better off had we kept this study secret rather than sharing it with the world!
I doubt that Facebook’s emotional manipulation actually scarred any of the unwitting participants, but Facebook’s stance and its so-called “internal review practices” may seem two-faced, particularly in light of Mark Zuckerberg’s open letter to President Obama earlier this year about the sanctity of the internet:
“The US government should be the champion for the internet, not a threat. They need to be much more transparent about what they’re doing, or otherwise people will believe the worst.”
Well, touché, Mr. Zuckerberg, touché.
At least the NSA has a good reason to be secretive and the nobler goal of protecting Americans from terrorists. What this little “research” project has taught us is that we need to protect ourselves from the likes of Facebook.