If you have a Facebook account, or use a similar service, did you read the terms of service before you hit the “Sign up” button? That’s what I figured; me neither. As a result, you, your friends, or possibly even one of your parents could have been included in an unwanted social experiment that Facebook’s scientists and engineers conducted on roughly 700,000 of its users in 2012. It was conducted without users ever knowing it was happening: no emails were sent, no notifications appeared on the site; it was simply turned on (and later off) like a switch in Facebook’s backend. Facebook manipulated users’ news feeds in order to study “emotional contagion through social networks.” In other words, Facebook used its popularity and massive user base to experiment on how people feel and to see whether it could change a user’s mindset.
It also brings to mind a more morbid thought. Since Facebook decided to show some people a sadder version of their news feed, did someone who was depressed at the time see all of these negative posts and then commit suicide? We may never know the answer, and one would hope it is no, but the fact is that when you experiment on unwitting people, you cannot know the drastic effects it may have on a person.
In the published paper, the only mention of informed consent is this: the research “was consistent with Facebook’s Data Use Policy, to which all users agree prior to creating an account on Facebook, constituting informed consent for this research.” According to Slate, here is the relevant section of Facebook’s data use policy: “For example, in addition to helping people see and find things that you do and share, we may use the information we receive about you … for internal operations, including troubleshooting, data analysis, testing, research and service improvement.”
Coming to Facebook’s defense after the eruption of its user base was Mike Schroepfer, Facebook’s Chief Technology Officer. On October 2, 2014, Mr. Schroepfer published a post on Facebook’s newsroom that detailed the research and the changes Facebook would make regarding future experiments. In it he says, “We’re committed to doing research to make Facebook better, but we want to do it in the most responsible way.” He also says that “Facebook does research in a variety of fields, from systems infrastructure to user experience to artificial intelligence to social science. We do this work to understand what we should build and how we should build it, with the goal of improving the products and services we make available each day.”
I can completely understand that Facebook wants to use all the information it can get its hands on to make the service the best it can be, but there are ethical limits that should be a given for this type of experimentation. Clearly Facebook was not ready for the backlash it saw after publishing the paper, as Mr. Schroepfer states: “Although this subject matter was important to research, we were unprepared for the reaction the paper received when it was published and have taken to heart the comments and criticism.”
I believe Facebook now knows how badly it messed up, as he also says, “It is clear now that there are things we should have done differently. For example, we should have considered other non-experimental ways to do this research. The research would also have benefited from more extensive review by a wider and more senior group of people.”