(by Calli Schroeder, Colorado Law 3L)
Large companies experiment on their users all the time in large and small ways through “product testing.” Changing the format of a homepage or the layout of an app to see if it facilitates easier use or better engagement could constitute an “experiment.” However, what happens when the experiment is only tangentially related to the product? And how does this affect our understanding of privacy and informed consent?
Experiments on Users
When large-scale experiments on users are revealed, the public response is often outrage. The widely publicized Facebook Emotional Contagion Experiment selected nearly 700,000 Facebook users and tweaked their news feeds to be more positive or negative, testing whether this would affect the users’ moods. The revelation prompted public outrage, investigations into the legality of the experiment, and an apology from the scientific journal that had published the results.
Other experiments made public (just by Facebook) include experiments on voter turnout, a study predicting the length of users’ relationships, and a paper analyzing what factors predicted an expressed support of marriage equality on social media.
The issues raised in these experiments tend to boil down to two categories: 1) Privacy and 2) Consent.
One privacy concern is the worry that data manipulation spills over into “real life” issues that are supposed to be separate from the data we submit online. Our status updates being used to affect our real-life moods is concerning. A format change (adding an “I voted” banner) being used to increase voter turnout has potentially frightening implications about Facebook’s power and influence.
Another concern is based on the user expectation of privacy. We may all expect that our online activity will be observed, but not that it will be actively manipulated. That distinction plays into a larger debate about what we can and should expect regarding our privacy online.
The consent issue gets complicated quickly. If a Human Subject Research experiment is run by a university, it must meet the Common Rule standard of informed consent (informing the subject of the risks, the parameters of the experiment, etc.). The standard for private companies engaged in these experiments appears much lower. Facebook initially argued that its 9,045-word Terms of Service agreement should be considered informed consent. While this does not meet the high standard of informed consent under the Common Rule, organizations that are not federally funded are not bound by the Common Rule. However, there are complications here as well.
James Grimmelmann points out three problems with informed consent in the Facebook Emotional Contagion Experiment:
- The scientific journal that the study was published in does have an obligation to follow the Common Rule and enforce Common Rule standards on the studies it publishes;
- Facebook’s user agreement does not inform users that they will be used as test subjects; and
- University employees helped design and publish this experiment.
How do we address the problem of separate consent standards in Human Subject Research? Do we give up any expectation of privacy when we become users of these social media companies? And what recourse do we have if we feel our expectations of privacy have been violated?