The controversial study was conducted back in 2012 and used Facebook to see if moods can be transferred without face-to-face interaction.
For one week (11–18 January 2012), Facebook manipulated the news feeds of nearly 700,000 users, reducing either negative or positive posts, and then monitored how users responded, according to a published study called “Experimental evidence of massive-scale emotional contagion through social networks.”
Some users were shown content predominantly made up of happy and positive words, while others were shown content analyzed as sadder than average. By the end of the week, these manipulated users were more likely to post especially positive or negative words themselves.
The study, conducted by researchers at Facebook, Cornell University, and the University of California, San Francisco, appeared in the 17 June edition of the Proceedings of the National Academy of Sciences.
According to the research, when users were exposed to fewer positive posts, their own posts were more negative; when users were exposed to fewer negative posts, their own posts were more positive. The study also found that when users were not exposed to positively or negatively charged posts, their own posts became less expressive.
The researchers said posts were filtered by the amount of positive or negative words used.
“Posts were determined to be positive or negative if they contained at least one positive or negative word…,” according to the study’s abstract.
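The classification rule quoted above (a post counts as positive or negative if it contains at least one matching word) can be sketched in a few lines. This is an illustrative sketch only: the word lists below are placeholders, not the dictionaries the researchers actually used (the paper relied on the LIWC word lists), and `classify_post` is a hypothetical helper name.

```python
# Illustrative keyword-matching sentiment rule, per the study's abstract.
# The word sets are stand-ins; the real study used LIWC dictionaries.
POSITIVE_WORDS = {"happy", "great", "love", "wonderful"}
NEGATIVE_WORDS = {"sad", "awful", "hate", "terrible"}

def classify_post(text: str) -> str:
    """Label a post by whether it contains at least one positive
    or negative word; 'mixed' if both, 'neutral' if neither."""
    words = set(text.lower().split())
    has_pos = bool(words & POSITIVE_WORDS)
    has_neg = bool(words & NEGATIVE_WORDS)
    if has_pos and has_neg:
        return "mixed"
    if has_pos:
        return "positive"
    if has_neg:
        return "negative"
    return "neutral"

print(classify_post("what a wonderful day"))  # positive
print(classify_post("i hate this weather"))   # negative
```

Note how crude the rule is: a single matching word determines the label, with no handling of negation ("not happy" would still count as positive), which is one reason the method drew methodological criticism.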
Overall, the researchers said the study's significance is that it suggests emotions can be transferred without face-to-face interaction and without any nonverbal cues.
"Emotional states can be transferred to others via emotional contagion, leading people to experience the same emotions without their awareness," the study's authors wrote.
"These results indicate that emotions expressed by others on Facebook influence our own emotions, constituting experimental evidence for massive-scale contagion via social networks."
Despite the backlash over the manipulation of users, which appears to be legal and to comply with Facebook's terms of service, Facebook defended the research in a statement, writing: "This research was conducted for a single week in 2012 and none of the data used was associated with a specific person's Facebook account."
"We do research to improve our services and to make the content people see on Facebook as relevant and engaging as possible.
"A big part of this is understanding how people respond to different types of content, whether it's positive or negative in tone, news from friends, or information from pages they follow.
"We carefully consider what research we do and have a strong internal review process."
In the paper, the researchers said the study "was consistent with Facebook's Data Use Policy, to which all users agree prior to creating an account on Facebook".
Some Facebook users have taken to social media to vent their anger, and some have even called on others to leave Facebook altogether after what they see as a breach of trust.
Get off Facebook. Get your family off Facebook. If you work there, quit. They're fucking awful.
— Erin Kissane (@kissane) June 28, 2014
If you're a libertarian who thinks govt power is the only kind that can be abused & the market will solve everything --> #FacebookExperiment
— Anonymous (@YourAnonNews) June 29, 2014
In the wake of both the Snowden stuff and the Cuba twitter stuff, the Facebook "transmission of anger" experiment is terrifying.
— Clay Johnson (@cjoh) June 28, 2014
One of the researchers was also concerned that exposure to friends’ negativity might lead people to avoid visiting Facebook.
"Regarding methodology, our research sought to investigate the above claim by very minimally deprioritizing a small percentage of content in News Feed (based on whether there was an emotional word in the post) for a group of people (about 0.04% of users, or 1 in 2500) for a short period (one week, in early 2012)," researcher Adam Kramer wrote on his Facebook page.
"Nobody's posts were 'hidden,' they just didn't show up on some loads of Feed. Those posts were always visible on friends' timelines, and could have shown up on subsequent News Feed loads. And we found the exact opposite of what was then the conventional wisdom: Seeing a certain kind of emotion (positive) encourages it rather than suppresses it.
"And at the end of the day, the actual impact on people in the experiment was the minimal amount to statistically detect it -- the result was that people produced an average of one fewer emotional word, per thousand words, over the following week.
"The goal of all of our research at Facebook is to learn how to provide a better service. Having written and designed this experiment myself, I can tell you that our goal was never to upset anyone. I can understand why some people have concerns about it, and my coauthors and I are very sorry for the way the paper described the research and any anxiety it caused. In hindsight, the research benefits of the paper may not have justified all of this anxiety.
"While we’ve always considered what research we do carefully, we (not just me, several other researchers at Facebook) have been working on improving our internal review practices.
"The experiment in question was run in early 2012, and we have come a long way since then. Those review practices will also incorporate what we’ve learned from the reaction to this paper."