People were predictably angry to find out that Facebook ran an experiment subtly manipulating the emotional content in the news feeds of selected users. The idea of the 2012 experiment was to determine how users' reactions would change if they saw more happy posts or more negative ones, prompted by the apparent concern that those who saw too many positive posts might feel bad about their own barren social lives. The researchers found the opposite was true: users shown more positive posts went on to write more positive posts themselves.
There are legitimate questions about the ethics of Facebook’s actions. Scientists aren’t supposed to experiment on people without getting permission. “If you are exposing people to something that causes changes in psychological status, that’s experimentation,” James Grimmelmann, a professor of technology and law at the University of Maryland, told Slate.
Adam Kramer, a data scientist at Facebook, apologized for the anxiety caused by the research. “And at the end of the day, the actual impact on people in the experiment was the minimal amount to statistically detect it—the result was that people produced an average of one fewer emotional word, per thousand words, over the following week,” he wrote in a post on Facebook on Sunday.
The kerfuffle exists only because Facebook published a paper about the research in a scientific journal. By scientific standards, however, pretty much all of social media is an unethical experiment. After all, Facebook doesn’t claim to show its users an unfiltered version of what their friends are posting. It uses some mysterious cocktail of factors to decide what mix of content is most likely to keep people on the site longer. The goal, always, is to increase the number of ads seen by users. Just about every decision made by Facebook is based on what will make money for Facebook. This has been good business but wouldn’t pass muster as science.
The main difference between Facebook’s business model and social-science research is that the emotional manipulation is a side effect rather than the primary goal. Take, for example, its struggle over the news feed late last year. In that case, Facebook found its algorithms were too friendly to the popular but inane news being posted by sites like Upworthy and BuzzFeed. Many people within the company wanted to adjust its filters to reward more serious news. Given that aspiring viral content tends to be positive and the news is usually a downer, Facebook would essentially be running the same experiment by putting its finger on the scale.
The same could be said for any personalization in social-media feeds and search results. These are simply not neutral services. Realizing this is a basic part of web literacy. But maybe Facebook could turn its emotional filters into a feature. It has all it needs to make a toggle button, letting people choose whether they want the grumpy version of the site or the sappy one.
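To make the idea concrete, here is a rough sketch of what such a toggle might look like, purely as an illustration. Nothing here reflects how Facebook actually ranks posts; the `Post` type, the crude keyword-based `scoreSentiment` scorer, and the `filterFeed` function are all hypothetical stand-ins.

```typescript
// Illustrative sketch of a user-controlled sentiment toggle for a feed.
// All names here (FeedMood, Post, scoreSentiment, filterFeed) are hypothetical.

type FeedMood = "grumpy" | "sappy" | "unfiltered";

interface Post {
  id: string;
  text: string;
}

// Placeholder sentiment scorer: returns a value in [-1, 1],
// negative for gloomy text, positive for cheerful text.
function scoreSentiment(text: string): number {
  const positive = ["great", "happy", "love", "wonderful"];
  const negative = ["awful", "sad", "hate", "terrible"];
  const words = text.toLowerCase().split(/\W+/);
  let score = 0;
  for (const w of words) {
    if (positive.includes(w)) score += 1;
    if (negative.includes(w)) score -= 1;
  }
  return Math.max(-1, Math.min(1, (score / Math.max(words.length, 1)) * 10));
}

// Keep only the posts that match the user's chosen mood.
function filterFeed(posts: Post[], mood: FeedMood): Post[] {
  if (mood === "unfiltered") return posts;
  return posts.filter((p) => {
    const s = scoreSentiment(p.text);
    return mood === "sappy" ? s >= 0 : s <= 0;
  });
}

// Example: a user flips the toggle to "grumpy".
const feed: Post[] = [
  { id: "1", text: "What a wonderful day, I love brunch" },
  { id: "2", text: "Traffic was awful and the news is terrible" },
];
console.log(filterFeed(feed, "grumpy").map((p) => p.id)); // ["2"]
```

The point of the sketch is simply that the filtering already happens; the only novelty would be handing the dial to the user instead of the algorithm.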