Tag Archives: quantitative

A case study on inclusive design: ethnography and energy use

Dr. Dan Lockton (@danlockton) is a senior associate at the Helen Hamlyn Centre for Design, at the Royal College of Art in London. Originally a design engineer, he became interested in including people better in design research while working on mobility products. For his PhD at Brunel University, he developed the Design with Intent toolkit, a multidisciplinary collection of design patterns around human behaviour which Tricia blogged about in 2011. Since then, he has worked on a number of domestic and workplace energy-related behaviour change projects, including CarbonCulture and currently SusLab, a large pan-European project. There is a ‘SusLab at the RCA’ blog; this article is based on the paper Dan presented at EPIC 2013.

Editor’s note: Energy usage and conservation can be a seemingly mundane part of an individual’s daily life on one hand, but a politically, ecologically, and economically critical issue on the other. Despite its importance, there is a startling lack of insight into what guides and influences behaviors surrounding energy.

With conventional quantitative analyses of properties and income explaining less than 40% of the variation in households’ consumption, Dr Dan Lockton (@danlockton) and Flora Bowden set out to unpack some of the behavioral nuances and contextual insights around energy use within the daily lives of British households, from the perspective of design researchers. Their interviews had them meeting everyone from “quantified self” enthusiasts to low-income residents of public housing, and involving them in the design process. What they discovered bears significant implications for design that seeks to influence behaviors around energy. For example, where policy makers and utility companies see households as “using energy”, household members see their own behavior as solving problems and making their homes more comfortable, such as by running a bath to unwind after a trying day, or preparing a meal for their family.

Read on to see what else Dan and Flora learned in their ethnographic research, and how understanding “folk models” of energy – what energy “looks like” – may hold the key to curtailing energy usage.

For more posts from this EPIC edition curated by contributing editor Tricia Wang (who gave the opening keynote talk at EPIC this year), follow this link.

Gas prepayment card

A householder in Bethnal Green, East London, shows us her gas prepayment card.

It’s rare a day goes by without some exhortation to ‘reduce our energy use’: it’s a major societal and geo-political challenge, encompassing security, social issues and economics as well as environmental considerations. There is a vast array of projects and initiatives, from government, industry and academia all aiming to tackle different aspects of the problem, both technological and behavioural.

However, many approaches, including the UK’s smart metering rollout, largely treat ‘energy demand’ as something fungible—homogeneous even—to be addressed primarily through giving householders pricing-based feedback, with an assumption that they will somehow automatically reduce how much energy they use, in response to seeing the price. There is much less emphasis on understanding why people use energy in the first place—what are they actually doing?

On Digital Ethnography: mapping as a mode of data discovery (2 of 4)

Editor’s Note: Can ethnographers use software programs? Last month’s guest contributor, Wendy Hsu @WendyFHsu, says YES! In Part 1 of On Digital Ethnography, What do computers have to do with ethnography?, Wendy introduced her process of using computer programming to collect quantitative data in her ethnographic research. She received a lot of great comments and suggestions from readers.

Part 2 of Wendy’s Digital Ethnography series focuses on data processing and interpretation. In fascinating detail, Wendy discusses mapping as a mode of discovery. We learn how a customized spatial “algorithm that balances point density and readability” can reveal patterns in the physical spread of musicians’ fans and friends globally. Geo-location data clarified her qualitative data. We are already in great anticipation for Part 3!

Check out past posts from guest bloggers


Figure 0: The Hsu-nami’s Myspace friend distribution in Asia

In my last post, I introduced the idea of using webscraping for the purpose of acquiring relevant ethnographic data. In this second post, I will concentrate on the next step of the ethnographic process: data processing and interpreting. Remember The Hsu-nami, the band that I talked about in the last post? The image above is a screenshot of their Myspace friend distribution, a map that I created for analyzing the geography of their community. This post is about the value of creating such maps.
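Wendy’s actual density-and-readability algorithm isn’t reproduced in this excerpt, but the core idea — aggregating scraped friend coordinates into grid cells so that dense clusters render as one scaled marker rather than a pile of overlapping dots — can be sketched in a few lines. This is a minimal illustration using hypothetical friend locations, not her implementation:

```python
from collections import Counter

def bin_points(points, cell_deg=5.0):
    """Aggregate (lat, lon) points into grid cells of cell_deg degrees.

    Dense clusters collapse into one cell whose count can drive a
    marker's size on the map, keeping the plot readable.
    """
    cells = Counter()
    for lat, lon in points:
        cell = (round(lat / cell_deg) * cell_deg,
                round(lon / cell_deg) * cell_deg)
        cells[cell] += 1
    return cells

# Hypothetical friend locations scraped from profile pages
friends = [(40.7, -74.0), (40.6, -73.9),   # New York area
           (25.0, 121.5), (25.1, 121.6),   # Taipei area
           (35.7, 139.7)]                  # Tokyo
for (lat, lon), count in sorted(bin_points(friends).items()):
    print(lat, lon, count)
# → 25.0 120.0 2
# → 35.0 140.0 1
# → 40.0 -75.0 2
```

Each cell’s count would then be mapped to a marker radius when plotting; widening `cell_deg` trades spatial precision for readability, which is the balance the post describes.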

Qualitative research is not research at all?

Image of building with torn sign reading "Rant"

Rant this way ~ Photo by Nesster, CC BY-SA

Heather pointed out these comments by Bob Garfield from a recent broadcast of On the Media (“Sentiment Analysis Reveals How the World is Feeling”):

I’ve been arguing for years that qualitative research, focus groups and the like, are not research at all. They don’t generate data. It’s statistically insignificant, easily manipulated, and from my perspective just as likely to be exactly wrong as exactly right.

Garfield then adds:

But it seems to me that what you’re dealing with is something that deals with all of my objections, because you’ve got the world’s largest focus group.

Sigh. This is wrong on so many levels, and anyone who is interested in ethnography already knows why, but just to touch on some of the problems:

  • Qualitative research can generate data. The tweets used in Johan Bollen‘s [1] sentiment analysis (the subject of this OTM episode), interview transcripts, field notes, photos, audio recordings, video recordings: all data. Some research within the qualitative tradition also generates numeric data [2] by, for example, calculating measures of intercoder reliability, or in the analysis of card sorting tasks.
  • There is a lot more to statistical testing than statistical significance (and some controversy among statisticians about overuse of significance testing). There is also more to quantitative analysis than statistical testing. Bayesian inference, for example, could be thought of as quantitative analysis that is not necessarily statistical testing.
  • Similarly, qualitative research cannot be reduced to “focus groups and the like”. The purposes, strengths and weaknesses of focus groups are very different from those of other qualitative methods such as [participant-]observation and one-on-one interviews [3].
  • Using statistical testing as a marker for what is or is not research omits work that has formed the backbone of the sciences such as classical experimentation, disconfirmation by example, comparative methods for creating typologies and analyzing artifacts, etc.
  • “Easily manipulated”? Yup, research findings in general can be manipulated. Statistical testing is really easy to manipulate.
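To make the intercoder-reliability point above concrete: qualitative coding routinely yields numeric measures. Here is a minimal sketch of Cohen’s kappa, one common such measure, computed over hypothetical labels assigned by two coders to the same ten interview excerpts:

```python
from collections import Counter

def cohens_kappa(coder_a, coder_b):
    """Cohen's kappa: agreement between two coders, corrected for chance."""
    assert len(coder_a) == len(coder_b)
    n = len(coder_a)
    # Observed proportion of items the coders labeled identically
    observed = sum(a == b for a, b in zip(coder_a, coder_b)) / n
    # Expected chance agreement from each coder's label frequencies
    counts_a, counts_b = Counter(coder_a), Counter(coder_b)
    expected = sum(counts_a[c] * counts_b.get(c, 0) for c in counts_a) / (n * n)
    return (observed - expected) / (1 - expected)

# Hypothetical codes ("pos"/"neg"/"neu") from two independent coders
a = ["pos", "pos", "neg", "neg", "pos", "neu", "neg", "pos", "neu", "neg"]
b = ["pos", "neg", "neg", "neg", "pos", "neu", "neg", "pos", "pos", "neg"]
print(round(cohens_kappa(a, b), 3))  # → 0.677
```

A kappa near 1 indicates agreement well above chance; values below roughly 0.6 would usually prompt coders to refine their codebook — exactly the kind of numeric, checkable output Garfield claims qualitative research cannot produce.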

Garfield’s statement also suggests either ignorance or dismissal of mixed methods research, which, I would argue, is increasingly becoming a gold standard for research in some fields, such as public health.

There’s a hint at why mixed methods have become so important in public health research in Garfield’s comment about “the world’s largest focus group.” Bollen’s use of a large collection of corpora is well-suited to his purposes, but other purposes can require different or additional kinds of work.

Let’s say I do a giant public health survey. If a minority in my sample doesn’t interpret a word or phrase the way the majority does, if some questions make no sense at all from their perspective, or if the people writing the survey have no idea what minority members’ concerns or experiences even are, much less how they’re relevant to health, then the survey results will be meaningless for that social group.

There is no such thing as a survey that is not culturally informed. Without ethnographic work and awareness, surveys, public health information and campaigns, etc., will likely be culturally informed by those who are most powerful and/or in the majority. Qualitative research is indispensable for addressing structural health inequities affecting the less powerful. Should ethnographic work focused on these inequities be patted on the head and assured that it’s nice, but it’s not-really-research? Fortunately, the NIH does not think so.

Sometimes I wonder if people miss how widespread and useful qualitative work is because it can be invisible (see Tricia‘s related post about the ‘Invisibility of Ethnography‘). A couple recent episodes of On the Media may clarify the kind of research that Garfield is dismissing here, while at the same time (perhaps unknowingly?) depending on it.

On Nov. 4th, Garfield spoke with social media researcher danah boyd about “Parents helping kids lie online.” The paper [4] behind this interview presents quantitative summaries of survey data — “real” research, perhaps, to Garfield. But hmm, how and why was this survey designed?
