The 2012 study, run by researchers from Cornell University and the University of California in conjunction with Facebook, manipulated almost 700,000 users' news feeds in a bid to see whether negative or positive content could be used to alter the mood of the user. According to the paper, now published in the journal Proceedings of the National Academy of Sciences, researchers found that: “emotions expressed by friends, via online social networks, influence our own moods, constituting, to our knowledge, the first experimental evidence for massive-scale emotional contagion via social networks, and providing support for previously contested claims that emotions spread via contagion through a network.”
More simply put, Facebook were checking to see whether lots of happy posts from your friends make you feel more or less happy. To test this out, our friends at the social network deliberately spiked news feeds with only negative or only positive posts. They then measured how this affected user responses to the artificial news feeds. It seems that the more positive the posts in your news feed, the happier you will feel and post, and vice versa.
As far as human psychology is concerned, this is nothing new. It has long been known that people's moods are directly affected by their environment. This is, however, the first time anyone has shown that Facebook news feeds can alter our moods. Realistically, that finding is no surprise either: our moods and emotions are a product of our environment, and for many of us Facebook is part of our emotional environment.
When the story broke, the ensuing outrage was just as predictable. The study did not have the usual ethics approval because Facebook is not an academic institution. Unethical! The users were not informed of their participation. Immoral! Facebook manipulating people's emotions: deceitful, evil, creepy. Oh, the humanity!
Meanwhile Facebook remained silent (as usual), leaving it to one of the research team, Adam Kramer, to defend the study as “consistent with Facebook’s Data Use Policy, to which all users agree prior to creating an account on Facebook.” Kramer also said, “The experiment in question was run in early 2012, and we have come a long way since then.” Thanks, Adam; that could reasonably be interpreted as an admission that Facebook is still experimenting with news feeds right now. If they have learnt anything from this study, they won’t publish the results next time.
However, the most interesting result of this study is, as always, the typical outpouring of online anger directed at Facebook whenever the public feels it has overstepped the mark. Those of us who sign up to Facebook choose to give our personal information and share our content freely in return for the service. It should come as no surprise to any of us that the data we provide and the content we see on Facebook is processed and manipulated. We see what Facebook wants us to see in our news feeds, in exactly the same way Google searches only reveal what Google wants us to find. This is standard practice across the entire online environment. So what's the beef?
Perhaps the next study Facebook participates in should explore just how quickly online anger and indignation spread when users feel they have been exploited. It happens often enough to be of widespread interest to the public, and outraged users seem to like it when their opinions are recorded, so no ethics approval will be required. But best of all, they could get a follow-up paper showing that when Facebook controversy promotes “massive-scale emotional contagion via social networks”, social media users can turn very nasty, very quickly.