Was the Facebook experiment right to "mess with people's minds?"

The revelations about Facebook's 2012 experiment on users' newsfeeds have seen the social network fighting off a torrent of criticism. Users were upset to learn that the content of their newsfeeds may have been manipulated as researchers tried to determine what effect exposure to positive and negative newsfeed content had on users' subsequent output. Now an official complaint has been lodged against the social network by the Electronic Privacy Information Center. Epic filed a complaint with the US Federal Trade Commission, alleging that "the company purposefully messed with people's minds".

There are several lines of attack in the complaint, but the main thrust is that Facebook neither obtained permission from the 700,000 affected users, nor informed them about what was happening. Epic also complains that Facebook failed to warn users that their data would be shared with researchers at Cornell University and the University of California. The complaint points out that "at the time of the experiment, Facebook was subject to a consent order with the Federal Trade Commission which required the company to obtain users' affirmative express consent prior to sharing user information with third parties".

Read more: 10 years on: Is Facebook now a dying social network?

Despite the level of interest the experiment has garnered, the complaint does not make many demands - just that Facebook should be forced to make publicly available the algorithm used to generate newsfeeds, and also "provide such other relief as the Commission finds necessary and appropriate". The suggestion is that Facebook and the research teams not only failed to follow ethical protocols, but also ignored Facebook's own Data Use Policy. The policy that was in place at the time of the experiment stated that "We use the information we receive about you in connection with the services and features we provide to you and other users like your friends, the advertisers that purchase ads on the site, and the developers that build the games, applications, and websites you use". It goes on to say that no data will be shared without first obtaining permission.

Epic notes that Facebook changed its policies after the experiment was conducted in January 2012, to allow for user data to be used for "data analysis, testing, research and service improvement" - fairly broad terms which could cover the experiment, or variants of it in the future. Facebook and the researchers ended up analysing over three million posts, although the fact that computers rather than human analysts were used was seen as keeping the research in line with Facebook's policies.

Read more: Positive or negative Facebook posts have a domino effect on users

Epic's complaint makes reference to the many, many vocal opponents to Facebook's actions, citing numerous forum comments and Twitter posts. Some highlights include a tweet from @kissane_sxsw who said "Get off Facebook. Get your family off Facebook. If you work there, quit. They're f*cking awful," and @RalphPici who tweeted "It's sad in a pathetic way that @facebook doesn't get that its [sic] about trust and that they betrayed user trust #FacebookExperiment."

Facebook is yet to respond to Epic's complaint, but COO Sheryl Sandberg offered a sideways apology for the study on Thursday: "We never meant to upset you. It was poorly communicated," she said. "And for that communication we apologize."

What outcome would be acceptable to you? Can Facebook do or say anything to put right what it has done wrong?

Image Credit: Sarawut Aiemsinsuk / Shutterstock