
Cheering up the chatbot


The speech-to-text tool on my phone is convinced that “ethnography” = “not greasy.” (At least “not greasy” tends to be a positive thing?) Generally STT and voice commands work great on it, though. You have to talk to it the right way: enunciate; leave dramatic pauses between each word; don’t feed it too many words at once. The popular speech recognition application Dragon NaturallySpeaking emphasizes that users train the system to recognize their voices, but there’s always an element of the system training its users how to talk.

For entertainment purposes, it’s best to avoid the careful pauses and smush things together, producing text message gems like “Send me the faxable baby.”  It’s the mismatches between human intention and machine representation that can make using natural language interaction tools like STT, chatbots and speech prediction both frustrating and hilarious. When it’s bad, it’s really really good.

I’ve been playing with the game Cheer up the Chatbot the last couple days (from RRRR, “Where the games play you”).

Chatbot has an unusual way of interacting with people, as so many chatbots do.

Screen explaining Chatbot’s mental disorders

Understandably, Chatbot is sad.


Poor chatbot


The goal is to get Chatbot to smile.

Open-ended questions make robots happy


The game is a mix of bot and human-to-human chat, where you switch between talking to the game’s bot and to different players who are presented as the “Chatbot” speaker to each other.  When you hit a moment where there are enough players with different agendas online — including some who don’t know how the game works, some presenting as Chatbot, and some presenting as people — it can get weird.


The ethnography of robots


Heather Ford spoke with Stuart Geiger, PhD student at the UC Berkeley School of Information, about his emerging ideas about the ethnography of robots. “Not the ethnography of robotics (e.g. examining the humans who design, build, program, and otherwise interact with robots, which I and others have been doing),” wrote Geiger, “but the ways in which bots themselves relate to the world.” Geiger believes that constructing and relating an emic account of the non-human should be the ultimate challenge for ethnography, but that he’s “getting an absurd amount of pushback from it.” He explains why in this fascinating account of what it means to study the culture of robots.

Stuart Geiger speaking about bots on Wikipedia at the CPoV conference by Institute of Network Cultures on Flickr

HF: So, what’s new, almost-Professor Geiger?

SG: I just got back from the 4S conference — the annual meeting of the Society for Social Studies of Science — which is pretty much the longstanding home not just for science studies but also for Science and Technology Studies. I was in a really interesting session featuring some really cool qualitative studies of robots, including two ethnographies of robotics. One of the presenters, Zara Mirmalek, was looking at the interactions between humans and robots within a modified framework from intercultural communication and workplace studies.

I really enjoyed how she was examining robots as co-workers from different cultures, but it seems like most people in the room didn’t fully get it, thinking it was some kind of stretched metaphor. People kept giving her the same feedback that I’ve been given — isn’t there an easier way you can study the phenomena that interest you without attributing culture to robots themselves? But I saw where she was going and asked her about doing ethnographic studies of robot culture itself, instead of the culture of people who interact with robots — and it seemed like half the room gave a polite chuckle. Zara, however, told me that she loved the idea, and we had a great chat afterwards about this.