Facebook tinkered with users’ feeds for a massive psychology experiment
Facebook has recently been at the center of a controversy over its manipulation of users’ “walls,” the aggregated stream of friends’ and family’s links, musings, and photos. In a study published in the Proceedings of the National Academy of Sciences of the United States of America (PNAS), Facebook revealed that it had manipulated roughly 700,000 users’ walls, showing them posts containing more positive or more negative words and then monitoring whether those users subsequently posted more positive or negative words themselves.
Amid the controversy surrounding the study, there has been relatively little discussion of the data presented in the paper itself. One striking point is that the effect sizes of the manipulations are extremely small. The authors counter that, given Facebook’s scale, even small effects can be substantive in aggregate. What remains unclear, however, is whether effects this weak have any real consequence for an individual user. While there are legitimate concerns about the study, such as the failure to debrief participants about the manipulation after the fact (see the American Psychological Association’s guidelines on ethical practices for studies of this nature), we should remember just how slightly Facebook’s manipulations likely affected their unwitting participants.
Edited by SITN Waves Editor Adam Brown. Special thanks to Kyle Dillon, a graduate student in the Harvard Psychology Department, for his detailed insight.
For the full article published in PNAS, go to:
Experimental evidence of massive-scale emotional contagion through social networks
For another discussion in the media, check out Forbes’ treatment of the study.
One thought on “Facebook’s Manipulation Studies – A Critical Look”
So if Facebook doesn’t inform offenders of their transgressions against its publication rules, how will they know that they have offended, bearing in mind some of the trivia, or “molehills,” that certain people like to make mountains out of? If Facebook is not informing people who violate its rules, and is simply banning them without their knowledge, that is yet another of the myriad shortcomings Facebook has been accused of lately.