Facebook under fire over emotion experiment
Social network slammed for failing to get users’ consent
Facebook has sparked an ethics debate after it was found to have conducted a study on 689,000 users without asking for their permission.
The experiment was designed to discover what would happen if Facebook hid certain “emotional words” from a user’s homepage; chiefly, the researchers wanted to know whether this would alter user behaviour in terms of what they posted and ‘liked’.
Conducted with assistance from the University of California and New York’s Cornell University, the study ran over the course of one week in 2012. It found that users’ emotions were influenced by the content they viewed, an effect known as ‘emotional contagion’.
However, the paper has been condemned in some quarters because users were not told of the study and had no opportunity to opt out if they so desired.
Legal experts have claimed this breaches ethical guidelines on ‘informed consent’ and could potentially have had a negative effect on participants.
Jim Sheridan MP, who sits on the government’s media select committee, branded the results “extraordinarily powerful stuff” and pledged to investigate whether legislation could be brought in to prevent people being “thought-controlled”.
In January, Facebook was also accused of monitoring users’ private messaging data by two of its users, who are attempting to secure $10,000 for all those affected, or $100 per user for each day that their privacy was allegedly violated.
Facebook researchers defend work
In response to the furore surrounding the project, report co-author Adam Kramer defended the research, saying it was intended to see whether viewing positive content from their friends made people feel sad or left out, and whether exposure to negative content would make them less likely to use Facebook.
Mr Kramer added that the research was conducted “because we care about the emotional impact of Facebook and the people that use our product”.
He went on to apologise for any “anxiety” that had been caused, and admitted that the negative reaction to the study meant that the end probably did not justify the means.
Facebook also addressed the response to the study, pointing out that none of the data used was associated with any particular account.
Rob Bromilow, operations director at theEword, said: “It’s no secret how much people value their privacy online, so it’s somewhat surprising that Facebook and its researchers weren’t prepared for a reaction like this. It seems that the study’s findings have been lost amid the controversy, and the opportunity to discuss them has probably been lost.”