Facebook Manipulated 689,003 Users' Emotions For Science

On Facebook, you may be a guinea pig and not know it.

Facebook is the best human research lab ever. There's no need to get experiment participants to sign pesky consent forms, because they've already agreed to the site's data use policy. The site has a team of data scientists who are constantly coming up with new ways to turn users into guinea pigs. When the team releases papers about what it's learned from us, we often learn surprising things about Facebook - such as the fact that it can keep track of the status updates we never actually post. Facebook has played around with manipulating people before - getting roughly 60,000 people to rock the vote in the 2010 congressional elections who otherwise wouldn't have - but a recent study shows Facebook playing a whole new level of mind games with its guinea pig users. As first noted by Animal New York, Facebook's data scientists manipulated the News Feeds of 689,003 users, filling them with either positive posts, negative posts, or posts devoid of sentiment, in order to see how it affected their moods.
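To make the design concrete, here is a toy sketch of what sentiment-based feed filtering looks like. This is not Facebook's actual code; the word lists, the `Post` class, and the `filter_feed` function are invented for illustration (the real study relied on an automated word-counting dictionary and ran inside Facebook's feed-ranking system). The idea is simply: label each post by crude word matching, then randomly suppress a fraction of posts with the targeted sentiment.

```python
import random
from dataclasses import dataclass

# Crude illustrative word lists -- stand-ins for a real sentiment dictionary.
POSITIVE_WORDS = {"happy", "great", "love", "awesome", "wonderful"}
NEGATIVE_WORDS = {"sad", "terrible", "hate", "awful", "depressed"}


@dataclass
class Post:
    author: str
    text: str


def sentiment(post: Post) -> str:
    """Label a post positive, negative, or neutral by simple word matching."""
    words = set(post.text.lower().split())
    if words & POSITIVE_WORDS:
        return "positive"
    if words & NEGATIVE_WORDS:
        return "negative"
    return "neutral"


def filter_feed(posts, reduce_label: str, omit_probability: float = 0.5, seed: int = 42):
    """Return a feed with posts of the targeted sentiment randomly omitted,
    mimicking a 'reduced positive/negative expressions' condition."""
    rng = random.Random(seed)
    kept = []
    for post in posts:
        if sentiment(post) == reduce_label and rng.random() < omit_probability:
            continue  # suppress this post for the experimental condition
        kept.append(post)
    return kept


if __name__ == "__main__":
    feed = [
        Post("alice", "Feeling awesome about the new job, so happy!"),
        Post("bob", "Terrible week. My dog died and I feel awful."),
        Post("carol", "Grocery shopping, then laundry."),
    ]
    # Show whose posts survive when negative posts are reduced.
    print([p.author for p in filter_feed(feed, reduce_label="negative")])
```

In the hypothetical sketch above, a control condition would simply omit random posts regardless of sentiment, which is roughly how the published study describes its comparison groups.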


They found that emotions were contagious. 'When positive expressions were reduced, people produced fewer positive posts and more negative posts; when negative expressions were reduced, the opposite pattern occurred,' the Facebook research team writes in its paper, published in the Proceedings of the National Academy of Sciences (PNAS). 'These results indicate that emotions expressed by others on Facebook influence our own emotions, constituting experimental evidence for massive-scale contagion via social networks.'


The experiment ran for a week - January 11-18, 2012 - during which the hundreds of thousands of Facebook users unknowingly participating may have felt either happier or more depressed than usual, as they saw either more of their friends posting '15 Photos That Restore Our Faith In Humanity' articles or more of their friends' despondent status updates about lost jobs, dead dogs and falling off the New Year's Resolution bandwagon. '*Probably* nobody was driven to suicide,' tweeted one professor linking to the study, adding a '#jokingnotjoking' hashtag.


The researchers - who may not have been thinking about the optics of a 'Facebook emotionally manipulates users' study - jauntily note that the findings undermine the claim that looking at our friends' good lives on Facebook makes us feel depressed. 'The fact that people were more emotionally positive in response to positive emotion updates from their friends stands in contrast to theories that suggest viewing positive posts by friends on Facebook may somehow affect us negatively,' they write.


They also note that when they took all of the emotional posts out of a person's News Feed, that person became 'less expressive,' i.e., wrote fewer status updates. So prepare to have Facebook curate your feed with the most emotional of your friends' posts if it feels you're not posting often enough.


So is it okay for Facebook to play mind games with us for science? It's a cool finding, but manipulating users' emotional states without their knowledge to get there puts Facebook's big toe across that creepy line. When universities run studies like this on people, they have to get approval from an ethics board first. At Facebook, not so much. As a 2012 profile of the Facebook data team noted, 'Unlike academic social scientists, Facebook's employees have a short path from an idea to an experiment on hundreds of millions of people.'

