Tag Archives: research methods

Trace ethnography: a retrospective


Stuart Geiger (@staeiou) continues our edition of 'The Person in the (Big) Data' with a reflection on his practice of 'trace ethnography', which focuses on the trace-making techniques that render users' activities and intentions legible to each other. Importantly, Stuart argues, we as researchers need to see these traces in the context of our active socialization within the community in question, rather than passively reading traces through lurking.

When I was an M.A. student back in 2009, I was trying to explain various things about how Wikipedia worked to my then-advisor David Ribes. I had been ethnographically studying the cultures of collaboration in the encyclopedia project, and I had gotten to the point where I could look through the metadata documenting changes to Wikipedia and know quite a bit about the context of whatever activity was taking place. I was able to do this because Wikipedians do this: they leave publicly accessible trace data in particular ways, in order to make their actions and intentions visible to other Wikipedians. However, this was practically illegible to David, who had not done this kind of participant-observation in Wikipedia and had therefore not gained this kind of socio-technical competency. 

For example, if I added "{{db-a7}}" to the top of an article, a big red notice would be automatically added to the page, saying that the page has been nominated for "speedy deletion." Tagging the article in this way would also put it into various information flows where Wikipedia administrators would review it. If any of Wikipedia's administrators agreed that the article met speedy deletion criteria A7, then they would be empowered to unilaterally delete it without further discussion. If I was not the article's creator, I could remove the {{db-a7}} trace from the article to take it out of the speedy deletion process, which means the person who nominated it for deletion would have to go through the standard deletion process. However, if I was the article's creator, it would not be proper for me to remove that tag — and if I did, others would find out and put it back. If someone added the "{{db-a7}}" trace to an article I created, I could add "{{hangon}}" below it in order to inhibit this process a bit — although a hangon is just a request, it does not prevent an administrator from deleting the article.
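Because these templates are plain text inside an article's wikitext, the traces Wikipedians leave for each other are also machine-readable to a researcher. As a minimal illustrative sketch (the template names {{db-a7}} and {{hangon}} are the real ones discussed above, but the function, regexes, and sample text are invented for this example, not part of any Wikipedia tooling), one could classify an article's speedy-deletion state like this:

```python
import re

# Hypothetical sketch: detect the speedy-deletion trace templates described
# in the text. Template syntax allows optional whitespace, e.g. "{{ db-a7 }}".
SPEEDY_TAG = re.compile(r"\{\{\s*db-a7\s*\}\}", re.IGNORECASE)
HANGON_TAG = re.compile(r"\{\{\s*hangon\s*\}\}", re.IGNORECASE)

def deletion_trace_state(wikitext: str) -> str:
    """Classify an article's speedy-deletion state from its trace tags."""
    nominated = bool(SPEEDY_TAG.search(wikitext))
    contested = bool(HANGON_TAG.search(wikitext))
    if nominated and contested:
        return "nominated-but-contested"  # creator has asked admins to hold off
    if nominated:
        return "nominated"                # awaiting an administrator's review
    return "untagged"

sample = "{{db-a7}}\n{{hangon}}\n'''Some article''' about a non-notable topic."
print(deletion_trace_state(sample))  # -> nominated-but-contested
```

The point is not the code itself but what it demonstrates: the same traces that make Wikipedians' actions legible to each other also make them legible, at scale, to anyone who has learned the conventions.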


Wikipedians at an in-person edit-a-thon (the Women's History Month edit-a-thon in 2012). However, most of the time, Wikipedians don't get to do their work sitting right next to each other, which is why they rely extensively on trace data to coordinate and render their activities accountable to each other. Photo by Matthew Roth, CC-BY-SA 3.0

I knew all of this both because Wikipedians told me and because this was something I experienced again and again as a participant observer. Wikipedians had documented this documentary practice in many different places on Wikipedia's meta pages. I had first-hand experience with these trace data, first on the receiving end with one of my own articles, and later as someone who nominated others' articles for deletion. When I was learning how to participate in the project as a Wikipedian (which I now consider myself to be), I started to use these kinds of trace data practices and conventions to signify my own actions and intentions to others. This made things far easier for me as a Wikipedian, in the same way that learning my university's arcane budgeting and human resource codes helps me navigate that bureaucracy far more easily.

The Person in the (Big) Data


This edition of EM is jam-packed with methods for doing people-centred digital research and is edited by EM co-founder Heather Ford (@hfordsa), newly-appointed Fellow in Digital Methods at the University of Leeds and thus super excited to explore her role as an ethnographer who (also) does digital methods.

Today we launch the next edition of Ethnography Matters entitled: 'Methods for uncovering the Person in the (Big) Data'. The aim of the edition is to document some of the innovative methods that are being used to explore online communities, cultures and politics in ways that connect people to the data created about/by them. By 'method', we mean both the things that researchers do (interviews, memo-ing, member checking, participant observation) and the principles that underpin what many of us do (serving communities, enabling people-centred research, advocating for change). In this introductory post, I outline the current debate around the risks of data-centric research methods and introduce two principles of people-centric research methods that are common to the methods that we'll be showcasing in the coming weeks.

As researchers involved in studying life in an environment suffused by data, we are all (to at least some extent) asking and answering questions about how we employ digital methods in our research practice. The increasing reliance on natively digital methods is part of what David Berry calls the “computational turn” in the social sciences, and what industry researchers recognize as moves towards Big Data and the rise of Data Science.


'Digital Methods' by Richard Rogers (2013)

First, a word on digital methods. In his groundbreaking work on digital methods, Richard Rogers argued for a move towards natively digital methods. In doing so, Rogers distinguishes between methods that have been digitized (e.g. online surveys) vs. those that are “born digital” (e.g. recommender systems), arguing that the Internet should not only be seen as an object for studying online communities but as a source for studying modern life that is now suffused by data. “Digital methods,” writes Rogers, “strives to follow the evolving methods of the medium” by the researcher becoming a “native” speaker of online vocabulary and practices.

The risks of going natively digital

There are, however, risks associated with going native. As ethnographers, we recognize the critical role we play in bridging different communities and maintaining reflexivity about our research practice at all times, and this is what makes ethnographers such valuable partners in data studies. 'Going native' is, in other words, an apt metaphor for both the benefits and risks of digital methods: the risk lies not in using digital methods but in focusing too narrowly on data traces.

Having surveyed some of the debates about data-centric methodology, I've categorized the risks according to three core themes: (1) accuracy and completeness, (2) access and control, and (3) ethical issues.

Ethnomatters’ ‘Openness Edition’


Below is a full list of the posts for our first edition of a monthly collection. Thank you so much to our amazing guest contributors and to contributing editors who helped out!


‘Open window’ by Sharon Hall Shipp. CC-BY-NC on Flickr

Editorial by Heather Ford, 7 February, 2013

The ethics of openness: How informed is “informed consent”? by Rachelle Annechino, 1 March, 2013

#GoOpenAccess for the Ethnography Matters Community by Jenna Burrell, 27 January, 2013

Designing for Stories: Working with Homeless Youth in Boyle Heights by Jeff Hall, Elizabeth Gin and An Xiao Mina, 27 February, 2013

On Legitimacy, Place and the Anthropology of the Internet by Sarah Kendzior, 13 February, 2013

YouTube “video tags” as an open survey tool by Juliano Spyer, 21 February, 2013