Facebook lands in hot water after experiment with users’ emotions

Facebook and controversy are never far apart, and the network is now at the centre of another storm in a social teacup over an experiment that manipulated the feeds of close to 700,000 members for a week back in 2012.

Basically, the social network undertook a study in conjunction with two American universities to find "evidence of massive-scale emotional contagion through social networks".

What that means is the researchers manipulated news feeds to adjust the amount of "emotive" content on them, and then measured whether exposure to more negative or positive posts caused contagion – in other words, whether the reader went on to post more negatively or positively themselves.

As you can imagine, the fact that some folks had their feeds tinkered with, and became social guinea pigs without their consent, has led to quite an outpouring of criticism. The BBC reports that Facebook, however, was careful to clarify that there was "no unnecessary collection of people's data", and that "none of the data used was associated with a specific person's Facebook account".

But the fact that the social network can simply decide to run such an experiment without feeling the need to seek any form of consent has, of course, left many folks feeling uncomfortable.

The Beeb spotted that Kate Crawford, a principal researcher at Microsoft Research who deals with the politics and ethics of data, tweeted: "Let's call the Facebook experiment what it is: a symptom of a much wider failure to think about ethics, power and consent on platforms."

On the reaction of the study's authors to the controversy which has since exploded, she noted: "Researchers regret 'the way the paper described the research and any anxiety it caused,' not lack of consent."

As many folks on Twitter have pointed out, though, this sort of tinkering with feeds is what Facebook does all the time anyway in its quest for monetisation – it just doesn't normally publish a report about it.