Editor’s note: In this final post for February’s ‘Openness Edition’, Rachelle Annechino takes us on a journey to the homes of her research participants. She asks some really important questions about the wild “foreign languages” (legalese, medical-ese) that supposedly produce “informed consent,” traces the genesis of our understanding and practice of informed consent, and challenges us to think about how we might redesign it in our own projects.
Today I’m interviewing a couple of people who participate in a free program offered through a local hospital. The program mainly serves older adults who are dealing with a range of health issues, like diabetes, cancer, and arthritis. Many of the participants belong to groups that are affected by health disparities (or “preventable differences in the burden of disease, injury, violence, or opportunities to achieve optimal health that are experienced by socially disadvantaged populations,” as defined by the US CDC [1]).
After hanging out at the hospital for a bit to check out the program, I go to the home of a woman in her 60s who couldn’t come to the hospital today. We talk about the study, its risks and benefits. It’s a small exploratory study, some semi-structured interviews; the hospital IRB gave it an expedited review.
The benefits, I explain, are that this might help improve the program or keep the program going. There aren’t really any direct benefits to you though. We wish we had something to give you to thank you for participating. Basically what we’ll do is just sit here and talk. A risk is that some of the questions could be uncomfortable, but we can skip anything you want. If it’s okay with you, I will record the interview. We won’t put your name on the recording or use your name in reports on the interviews.
We have this standard consent form that the hospital uses, I say. It’s kind of long. We can go over what’s in it together, and please feel free to take as much time as you want to look it over…
Et cetera. As I’m saying this stuff, I’m cautiously drawing out the consent form.
Which is eight pages long.
It’s written in some sort of academic/medical dialect of English. There are three separate places for the participant to sign.
The first page is about being an experimental research subject, and is full of language about “drugs,” “devices” and “medical treatments,” even though we’re not doing a clinical experiment, or anything involving drugs, devices or medical treatments. We’re interviewing people about a class. Nevertheless, everyone is supposed to sign this vaguely terrifying “human experimental research subjects” page separately.
Then there are the seven other pages. If I start to describe them, I’m pretty sure I’ll never stop ranting, so let’s just keep it at: there are SEVEN MORE PAGES. And don’t forget the two more places to sign!
Mmkay. Not gonna rant. But just a few things:
- This hospital wants more research that addresses health disparities, including research that looks at people’s experiences in context. Good for them; they want more than just lip service when it comes to figuring out health inequities.
- IRBs aim to prevent disparities in research practices, because like the Belmont Report [2] says, the burdens and benefits of research should be distributed justly, and this IRB takes that mission seriously. Good for them; they want to protect people who have historically been treated really (really, really) unjustly [3, 4, 5].
- Like health disparities, research disparities “are directly related to the historical and current unequal distribution of social, political, economic, and environmental resources.”
So great, the IRB wants to support research that’s trying to make things better for people who bear the brunt of unequally distributed resources. Even better (?), it’s the IRB’s mission to protect people like the participants in this study.
And yet. Who is being protected by this consent form?
It’s not so well designed for, say, people who aren’t literate in medicalese, or for people who don’t trust institutions (because really!). Not so well designed for lots of issues that can occur more frequently among marginalized groups.
Weirdly though, the form hasn’t actually been that bad to work with. Other consent forms I’ve used have been better designed from my point of view, but this one still informs people of the procedures that are in place and of the study’s risks and benefits. The thing is, informed consent for these interviews hasn’t really been about forms anyway. It has been about relationships between people, and trust.
One participant says that people come into this community all the time asking questions, and they never give anything back. She says she agreed to do this interview because the person who leads the program is giving something back. Her willingness to share information has little to do with me or my ridiculous consent form, but it is about a kind of openness or mutual exchange.
When I think about the ethics of information sharing, I’m not sure that the other forms and standard operating procedures I’ve encountered have been so different from this one. A lot of ethical practices in qualitative public health research — and perhaps in research with people more generally — are informed by bioethics, even when the language of clinical trials isn’t so obvious, and even when the research isn’t conducted in a traditional academic or institutional setting.
Bioethics tends to prioritize individual privacy and the individual’s right to be informed about what will happen to them (risks and benefits of experimental treatments). “Subjects” should only be identified in this model when there is an overwhelming interest in preventing the mass spread of disease. In contrast, journalistic and historical ethics privilege an audience’s right to be informed about what individuals — especially powerful individuals — are doing. Anthropological ethics, meanwhile, cautions against revealing too much or too little about identity. Some secrets don’t want to be shared; some stories become theft when they are shared without credit.
Doing qualitative or mixed methods research can highlight ways that standard human subjects “protections” aren’t always as straightforward as they may seem. When qualitative data needs to be de-identified, for example, researchers may sometimes go beyond removing pre-identified bits of information like names and addresses. Since qualitative data is less structured than quantitative data, the range of information that may be revealed is less specifiable ahead of time. Thick data is full of little details and bits of context that make up the whole, and sometimes the right combination of a few details — details other than, say, a name or address — can make someone highly identifiable. In this way, qualitative researchers are intimately familiar with the problem of data re-identification that is sometimes ascribed to “big data” [8].
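The re-identification problem can be made concrete with a tiny sketch. Here is a minimal, hypothetical illustration in Python — all records and field names are invented — showing how a single “quasi-identifier” leaves a person ambiguous among several records, while combining just two seemingly innocuous details can single them out even after names have been removed:

```python
# Invented, de-identified interview records: names and addresses are gone,
# but a few contextual details ("quasi-identifiers") remain.
deidentified_interviews = [
    {"id": "P01", "age_range": "60s", "neighborhood": "Westside", "condition": "diabetes"},
    {"id": "P02", "age_range": "60s", "neighborhood": "Westside", "condition": "arthritis"},
    {"id": "P03", "age_range": "70s", "neighborhood": "Eastside", "condition": "diabetes"},
]

def candidates(records, **known_details):
    """Return the IDs of all records consistent with the details an outsider knows."""
    return [
        r["id"]
        for r in records
        if all(r.get(k) == v for k, v in known_details.items())
    ]

# One detail alone is ambiguous...
print(candidates(deidentified_interviews, age_range="60s"))
# ['P01', 'P02']

# ...but two combined details uniquely identify someone, no name required.
print(candidates(deidentified_interviews, age_range="60s", condition="arthritis"))
# ['P02']
```

The point isn’t the code, of course; it’s that “remove the names” is a much weaker guarantee than it sounds once thick, detailed data is involved.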
Questions of identification are complicated by the ways people construct their own identities. In some contexts, people don’t want their stories to be de-linked from their names; they don’t want privacy… not exactly. In other contexts, being able to detach an identity from a story can free people to share information or parts of themselves that they would be afraid to reveal otherwise. Concealing information, in those situations, can paradoxically produce more openness.
New technologies add extra layers to longstanding tensions between openness and privacy. While many of the ideas about openness, privacy, and the spaces in between that play out in social networks are familiar, the tools we have for negotiating exchanges between people are relatively new. Researchers who are exploring questions about consent and information sharing in relation to technology may also need to re-examine their ethical practices.
Suppose you designed an app to give research participants control over how their data was shared — a sort of Facebook for ethical research, for example. Would that be practical in the contexts you’re working in? What would the privacy settings look like?
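As one way of thinking through that question, here is a minimal sketch of what per-participant “privacy settings” might look like in such a hypothetical consent app. Every field name here is an invented assumption, not a real system; the interesting design choice is that the settings can express wanting credit (a real name attached) just as easily as wanting anonymity, and that consent stays revocable:

```python
from dataclasses import dataclass, field

@dataclass
class SharingSettings:
    """Hypothetical per-participant controls over how their data may be used."""
    audio_recording: bool = False    # may the interview be recorded?
    quotes_in_reports: bool = False  # may anonymized quotes appear in publications?
    use_real_name: bool = False      # some participants want credit, not privacy
    revocable: bool = True           # consent can be withdrawn later

@dataclass
class Participant:
    pseudonym: str
    settings: SharingSettings = field(default_factory=SharingSettings)

    def allows(self, action: str) -> bool:
        """Check a proposed use of the data against this participant's settings.
        Unknown actions default to False: nothing is shared without explicit consent."""
        return getattr(self.settings, action, False)

p = Participant("P07", SharingSettings(audio_recording=True, quotes_in_reports=True))
print(p.allows("quotes_in_reports"))  # True
print(p.allows("use_real_name"))      # False
```

Defaulting everything to “no” mirrors the opt-in spirit of informed consent; whether participants in a given community would actually want, trust, or use such an interface is exactly the kind of question fieldwork would have to answer.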
How would you redesign information sharing and informed consent in research? Can you Instagram your field notes?
[1] CDC (2013). Health Disparities – Adolescent and School Health.
[2] National Institutes of Health (1979). The Belmont Report: Ethical Principles and Guidelines for the Protection of Human Subjects of Research. Bethesda, MD.
[3] Blue, E. (2009). The Strange Career of Leo Stanley: Remaking Manhood and Medicine at San Quentin State Penitentiary, 1913–1951. Pacific Historical Review, 78(2), 210–241. doi:10.1525/phr.2009.78.2.210
[4] CDC (2013). Tuskegee Study – Timeline.
[5] Veatch, R. M. (1997). Medical Ethics. Jones and Bartlett Publishers, Inc.
[6] Transcript – February 15, 1995. Eleventh Meeting of the Advisory Committee on Human Radiation Experiments, Washington, DC.
[7] Fairchild, A. L., & Johns, D. M. (2012). Beyond Bioethics: Reckoning With the Public Health Paradigm. American Journal of Public Health, 102(8), 1447–1450. doi:10.2105/AJPH.2012.300661
[8] Re-identification (2013). Electronic Privacy Information Center.
[9] Wang, T. (2012, August 2). Writing Live Fieldnotes: Towards a More Open Ethnography. Ethnography Matters.
Check out other posts from the Openness Edition: Jenna Burrell’s ‘#GoOpenAccess for the Ethnography Matters Community’, Sarah Kendzior’s ‘On Legitimacy, Place and the Anthropology of the Internet’, Juliano Spyer’s ‘YouTube “video tags” as an open survey tool’ and An Xiao Mina et al.’s ‘Designing for Stories: Working with Homeless Youth in Boyle Heights’.