June 30, 2014
Facebook has sparked anger with the news that it manipulated the content that hundreds of thousands of users viewed in an attempt to alter their moods and see what kinds of posts would result, CNN reported.
For one week in early 2012, Facebook changed the news feeds of nearly 690,000 users, showing some people a higher number of positive posts, and showing others more negative posts.
This month, the journal Proceedings of the National Academy of Sciences published the results of the experiment, conducted by researchers from Cornell, the University of California, San Francisco and Facebook. Users who saw more negative content were slightly more likely to produce negative posts, while those viewing more positive content responded with more upbeat posts. The findings have major implications given the scale of the social network, the researchers said.
Facebook’s terms of service give the company legal permission to conduct this kind of research, but ethical questions remain. Many users say that it was a dangerous social experiment.
As CNN reported, there is no indication that Facebook asked those involved whether they wanted to participate in the study, or that users knew Facebook was using them as guinea pigs.
Facebook frequently modifies the mix of news, personal stories and advertisements that users see. And Susan Fiske, the Princeton professor who edited the research, said, “I understand why people have concerns.” — Greg Beaubien