The Laboratorium (3d ser.)

A blog by James Grimmelmann

Be regular and orderly in your life, so that
you may be violent and original in your work.

Ethical Culture Clashes in Social Media Research

I’m working on an article about the Facebook and OkCupid experiments. This is an outtake: some thoughts, which don’t quite fit in the article itself, about why the stories went viral.

One reason the Facebook and OkCupid experiments sparked such controversy is that they exposed the divergent ways that different communities think about ethics and about user experience design. Academic researchers design experiences for research participants; companies design experiences for their users; users don’t always realize how consciously their experiences are designed. Multiple communities suddenly found themselves in an unexpected dialogue with each other.

Design practices can vary along at least three relevant axes:

  • First, there is a distinction between observation, manipulation, and experimentation. Observation is passive and retrospective; manipulation is active and prospective; experimentation consists of manipulation followed by observation of the results. Observation implicates user privacy; manipulation implicates broader concerns like user autonomy; experimentation implicates both.

  • Second, some practices are designed to achieve a user-experience goal directly. Others are designed to achieve a goal indirectly by gaining useful knowledge for future use. When Google changes its algorithm to demote spam results, its goal is direct: to deliver better results to users now. When Google tests its algorithm to see whether a spam-demoting tweak works, its goal is indirect: to deliver better results to users in the future. Manipulation is direct; observation and experimentation are indirect.

  • Third, practices might have different intended beneficiaries: users, companies, and society. Directly serving users means giving them the most relevant search results, the most interesting News Feed stories, the most compatible matches, and so on. Indirectly serving them is a matter of observing and experimenting on them to improve their search results, News Feed stories, matches, and so on. Directly, companies might want to show ads, nudge users to buy expensive items, keep users from leaving their sites, and so on. Indirectly, they might want to do any of these more effectively in the future–or they might want to gather valuable data they can sell or use to promote themselves. As for society, when Facebook encourages its users to vote, or when a search engine directs users to opposing viewpoints, it acts from a sense that doing so directly furthers civic goals. When Facebook experimented on emotional contagion, it acted from a sense that doing so indirectly helped society by developing knowledge.

Different ethical arguments cut through this three-dimensional arrangement in different directions. They are not always attentive to the fact that they are taking a one-dimensional slice through a three-dimensional space. Sometimes two arguments that seem to be opposed are in fact coming at the same point from different angles.

Start with the controversy over the Facebook emotional contagion study. In the press, and for many commenters, the initial source of outrage was that Facebook was experimenting on users rather than simply delivering them the usual News Feed. On this view, the line between passive observation and active manipulation is ethically salient. This line is legally salient under the Common Rule, as well–it is one of the two routes by which “research” comes to involve “human subjects”–but for different reasons. The Common Rule kicks in when researchers interact with participants because that is the moment at which the researchers could be said to create risks for participants. For many observers of Facebook, the problem was not the incremental risk of the emotional manipulations (primarily a consequentialist concern) but the manipulation itself (primarily a concern about dignity, autonomy, and deception).

This reaction struck many from industry as bizarre, because the News Feed is already extensively tailored for individual users. A News Feed tinkered with to test emotional contagion is not different in kind from a News Feed tinkered with for any of the hundreds of other reasons Facebook tinkers with News Feeds.

Partly the disagreement over News Feed manipulation is simply a factual question about whether users appreciate the extent of the artifice that goes into social media. Some users were shocked to discover how Facebook and other websites work; some from Silicon Valley were shocked that users didn’t already know. News of the emotional contagion study laid bare this chasm of different assumptions, to the surprise of those on both sides.

But this disagreement may also have to do with different stories about the purposes of websites’ experiments. Social media companies relentlessly promote a narrative in which they exist to serve users; any manipulations of their algorithms are designed to improve the user experience. This narrative fits naturally with a baseline of manipulations designed to serve users directly; it fits less easily with experimentation designed to serve users indirectly. OkCupid set up some of its users with deliberately bad matches, a move that seems to make a mockery of its claims to help users find love.

In response, Christian Rudder argued that OkCupid’s mismatching experiment did indeed benefit users, but indirectly rather than directly, by validating the matching algorithm. Thus it helped users in general even if some particular users were mismatched. This tradeoff between direct and indirect user benefit is familiar from medical ethics: it pits a deontological commitment to specific patients against the utilitarian good of patients in general. Informed consent helps mediate the tension by respecting individuals as moral agents while asking them to accept risks on behalf of other moral agents. Rudder missed the point that the moral interests of individual users and the moral interests of users in general are not the same kinds of interests, but he did still articulate a justification for the experiment in terms of benefits to users.

Facebook’s emotional contagion experiment is a little different, because the argument for indirect user benefit is harder to make. Indeed, one reason the Facebook experiment may have gone viral is precisely that it is so hard to square with the narrative that Facebook was simply trying to improve users’ News Feeds.

First and paradoxically, the resulting academic paper’s arid description of the goals of the research divorced the study from any immediate application to improving the News Feed. It read too much like an abstract question of pure research for the indirect-user-benefit story to fully cohere. Narratively, the public benefit of the research may come at the expense of the perceived user benefit. This distinction–between indirect user interests and indirect societal interests–is also ethically salient under the Common Rule, but for very different reasons. There, on one view, it is the dividing line between local, unregulated quality improvement and generalizable, regulated research.

Second, the underlying manipulation (hiding emotional content) is not one that Facebook could make while claiming to fully respect users’ autonomy–even if it honestly thought a happier (or sadder) News Feed would be good for them. Some users were squicked out that Facebook tweaked the emotional balance of their News Feed without telling them. Even if Facebook extensively manipulates the News Feed, and even if Facebook has emotional effects, manipulating the News Feed to produce emotional effects combines the two in a more troubling way. Thus, the observation/manipulation line might be ethically salient when dealing with emotional content, even if it is not ethically salient in general.

Another reason why Facebook’s and OkCupid’s defenses may have rung false with some observers is a suspicion that the purported public benefit was really a smokescreen for corporate self-interest. Facebook has a long and fraught history of redesigning its interface, its privacy controls, and its News Feed algorithm in ways that have ambiguous advantages for users but clear advantages for Facebook. The emotional contagion experiment may therefore have been uniquely alarming precisely because it was conducted by Facebook. The idea that Facebook might push and prod at your emotions to sell you things or to keep you docile and using Facebook is scarily plausible for some users. On this view, the experiment was unsettling not because it was a radical departure from Facebook’s past practices, but because it was so uncannily consistent with them. Compare Facebook’s experiment in promoting voter turnout. While a few observers were alarmed because it suggested Facebook could swing elections, the overall response was far more muted, perhaps because the civic-good story fit the facts so cleanly.

If Facebook’s experiment raised concerns about corporate self-interest because of the perceived uses to which the knowledge it produced might be put, OkCupid’s raised them because of the lack of perceived uses. Rudder’s jocular style makes his writing interesting, but it also suggests that the underlying science may be a joke, too. Given the deliberately provocative tone of Rudder’s blog post and the imminent release of his book, some observers asked whether the entire thing was a put-on.

Research ethics has long dealt with conflicts of interest between researcher and participant. Doctors who are paid to sign up patients for trials may put their commercial interests ahead of patients’ welfare. And researchers who depend on splashy results for prestigious publications and generous grants may pick provocation over research quality. The Facebook and OkCupid versions of these conflicts are a little different. Facebook has an interest in research that refutes suggestions that using Facebook is bad for you; OkCupid has an interest in research that makes for amusing blog posts. And companies themselves have ethical reputations. Hearing that a research project comes from Uber is scarier than hearing that it comes from Tesla, even if the studies are otherwise exactly the same.

There’s no one way to think about social media, or about research. Part of the surprise of the Facebook and OkCupid experiments was the surprise of realizing that other people saw them so differently.

research ethics