Alan C. Elms, 1975
C.D. Herrera, 1997
Adam Kramer, 2014
Now that the initial heat has faded, it is a good time to place the Facebook experiment in historical perspective. In the first two quotes above, social psychologist Alan C. Elms and philosopher-ethicist C.D. Herrera represent two sides of a debate over the ethics and efficacy of relying on deception in experimental research. I highlight these two quotes because they mark moments within social psychology, a generation apart, when deception surfaced as a topic for reconsideration. Elms, one of the original research assistants involved in Stanley Milgram’s obedience research, writes as deception is being called into question. Herrera, writing with the benefit of hindsight, suggests that paradigms other than behaviorism are the way forward. The crux of this disagreement lies in the conceptualization of the research subject. Is the research subject a reflexive being with an intelligence on par with the researcher’s, or is the research subject a raw material to be deceived and manipulated by the superior intelligence of the researcher?
Unseen, but looming in the background of this disagreement, is the Industrial Psychology/Human Relations approach, which developed in the 1920s and 1930s through the work of researchers like Elton Mayo and his consociates, and through experiments such as those at the Hawthorne plant.
This debate is worth revisiting in light of the Facebook experiment and its fallout. Any understanding of the Facebook experiment, and of the kind of experimentation allowed by Big Data more generally, must include the long, intertwined history of behaviorism and experimental deception as it has been refracted through Adam Kramer’s home discipline of social psychology and, to a lesser extent, through his adopted discipline of “data science” [1].
Disciplining Deception
In a 1975 article, Alan C. Elms laments “The Crisis of Confidence in Social Psychology” in language familiar to anyone acquainted with anthropological history. One of the issues Elms notes, “Researcher’s Difficulties,” is the tendency of subjects familiar with social psychology to change their behavior when enrolled in an experiment. Another, “Outside Influences,” directly addresses the role of IRBs in policing the use of deception in experiments. For Elms, the less those subject to an experiment know about it, the more valid the findings and the easier the experiment is to conduct. On top of the threat posed by public knowledge of researchers’ activities, Elms was also concerned about the growing tendency of IRBs to judge social psychology ethics through medical or “antivivisectionist” glasses (i.e., treating doing nothing as the default response to competing ethical interests). He believed IRB review could render the social experiment ineffective. He ends his paper with a call for the social psychologist to become his own therapist and to move past this crisis of “self-confidence” by proceeding with work in social psychology as it had traditionally been undertaken, free of excessive soul-searching over the ethics of deception. At one point he floats an idea that continues to resurface every so often: that stopping research to reset the ethics of experimentation would carry a higher ethical risk than continuing on in the traditional manner.
But Herrera (1997) takes a dim view of the use and history of deception in social psychology. For Herrera, reliance on deception holds the discipline back and prevents the development of new approaches. Further, he argues, the benefit/harm balancing act often invoked in defense of deception elides underlying problems with the behaviorist paradigm. Herrera invokes the political arrangements demanded by behaviorism’s strict separation of subject and object as an example:
Enforcing the behavioristic separation of roles required a political, as well as scientific, reorientation. The behavioristic model calls for a revised view of human subjects and the power the psychologist could exert against them. Subjects became objects only as externally observable phenomena.
The debate between Elms and Herrera takes place within the discipline (in both descriptive and moral terms) of social psychology, and their arguments address the past, present, and potential futures of that discipline. Importantly, their arguments are situated in the context of an ongoing tradition in which the process of research is attended to as closely as the product of research. Outside of disciplinary conversations, the subtle interplay between process and product in research is often jettisoned in the name of efficiency.
Deception in Undisciplined Venues

"Cow female black white" by Keith Weller/USDA - www.ars.usda.gov: Image Number K5176-3.
Daniel Bell, in his 1947 article “The Study of Man: Adjusting Men to Machines,” criticizes researchers within the Industrial Psychology/Human Relations approach for uncritically accepting the charge of human engineering handed to them by their industrial employers. In Bell’s view this led to two grave errors: First, it encouraged turning a blind eye to the fate of those subject to their experiments. Second, it allowed researchers to ignore the conceptual challenges that inevitably accompany all research. The result of this research, Bell argued, was to cede the possibility that one aim of social science “may be to explore alternative (and better, i.e., more human) modes of human combinations, not merely to make more effective those that already exist.” Bell summed up his objections by noting that the Human Relations approach studied how to fit people to the machine rather than how to fit the machine to people, concluding that the Human Relations approach was “not a science of man, but a cow-sociology.”
In her wonderful book “World as Laboratory: Experiments with Mice, Mazes, and Men,” Rebecca Lemov floats a thought experiment: What if the true founder of the social sciences in America were not any of the usual suspects, but rather Beardsley Ruml? Ruml, Lemov explains, regarded the social sciences as a combination of experimental science and social engineering, and he leveraged his talent for communicating (communing might be the better word) with industry leaders in a position to fund them. From his position at the Laura Spelman Rockefeller Memorial Fund, Ruml poured money into the behavioral social sciences through both individual grants to scholars and large block grants administered, in large part, through the Social Science Research Council (SSRC). In this way, Ruml was instrumental in nurturing the nascent assemblage of industrial interest in efficient production with academic interest in human behavior that culminated in the transformation of the Relay Assembly Test Room at the Hawthorne plant into a Skinner box.
While Ruml may not be a widely known figure, he remains an important starting point for any comprehensive understanding of the newly emerging cohort of data scientists like Adam Kramer and their benefactors at Facebook, Google, Apple, Palantir, etc. Like their predecessors in Industrial Psychology/Human Relations, data scientists accept a kind of naive empiricism that comes at a cost both to those inside their Skinner box and to the disciplines they draw from. Herrera identified the crux of this problem when he described the political relations at the heart of the behaviorist paradigm. Are those subjected to experimental research simply “externally observable phenomena” to be kept calm (free of anxiety that might affect the experiment) and manipulated at the whim of the researcher, or are they human beings to whom participation in research must be justified and from whom consent must be obtained? Adam Kramer’s response to critics of his experiment speaks volumes:
Having written and designed this experiment myself, I can tell you that our goal was never to upset anyone. I can understand why some people have concerns about it, and my coauthors and I are very sorry for the way the paper described the research and any anxiety it caused.
The era of Big Data, truly, is a new golden age of “cow sociology.”
Epilogue
While this post was under review, OKCupid announced that they happily experiment on people. There have been many reactions, pro and con, over the last week, but I would like to call your attention to an interview Christian Rudder gave. In particular, consider his answer to the question quoted below. In his answer to the previous question, Rudder had described the experiments run by OKCupid as “just part of the scientific method.”
AG: Was there any consideration given to an opt-in procedure where people could, beforehand, be part of it and then just having a control group?
CR: No, there wasn’t. Once people know that they’re being studied along a particular axis, inevitably they’re gonna act differently. Just the same way that people on reality TV don’t act like themselves. Like I was in some psych experiments when I was in college, just ’cause they give you twenty bucks to go to the department and you, y’know, you sign a form. But that is informed consent — which users can’t see but I’m putting in quotes — and you uh, y’know you sit down and you hit a button when some word blinks on the screen or a dot appears and you like move a lever or whatever, and you have no idea what they’re measuring you for. Y’know they don’t tell you anything, they could just be measuring whether you’re obeying their instructions or how you greeted the person of another race at the very beginning of the whole thing and the experiment is just a sham. So like, you’re not really informed.
Clearly, Rudder misses the point of the informed consent process. More importantly, he invokes the power, and ubiquity, of the scientific method when it suits him, while denying any responsibility for wielding that power when negative effects are mentioned. Where Kramer at least has the good sense to quote disciplinary boilerplate about causing “anxiety” and to fall back on his academic training, Rudder, under hard questioning, is left grasping at straws.
Rudder’s clueless responses to the interviewer’s basic questions about the history and ethics of social experimentation illustrate the most concerning aspect of data science. Kramer, we can say, should have known better, but Rudder is “not really informed” about what he is doing. Of course, the rub here is that he doesn’t have to know what he is doing. As the interview makes clear, Rudder delights in his experimental power, so long as it is not accompanied by responsibility for unintended consequences.
[1] There are no hard and fast criteria for what counts as a data scientist, but in general they involve mastery of a programming language, particularly a specialized language for statistics, and a background in a quantitative social science. Here I mean the sort of data science practiced by Adam Kramer and others working with a similar charge to engineer human relationships.↩
References
Bell, Daniel
1947 The Study of Man: Adjusting Men to Machines. Commentary, January: 79.
Elms, Alan C.
1975 The Crisis of Confidence in Social Psychology. American Psychologist 30(10): 967.
Herrera, C. D.
1997 A Historical Interpretation of Deceptive Experiments in American Psychology. History of the Human Sciences 10(1): 23–36.
Lemov, Rebecca
2006 World as Laboratory: Experiments with Mice, Mazes, and Men. Macmillan.
While the historical context here is interesting, I think it is a stretch to say that contemporary data science falls prey to the same problems as Skinner’s behaviorism. Data science, dependent as it is on computer programming, falls more naturally within the cognitivist research paradigm.
Cognitive psychology has a long history of experiments involving deception for exactly the reasons Rudder describes: if you are trying to get a scientific understanding of the process of cognition, you need to provide experimental conditions that don’t perturb your results. Building a dating site that assists with the cognitively demanding work of dating is just such an experiment.
There is much more to psychology since Skinner than “cow sociology,” and I think framing the contemporary debate in this way is rhetorically powerful but scientifically unhelpful.