Intentional stance as observable order: when blind people adapt to an AI (VUI-SPEAK: There is Nothing Conversational about "Conversational User Interfaces"): 2. Atypical Interaction

Research output: Contribution to conference › Conference abstract for conference › Research › peer-review

Blind people are increasingly using systems like Google Home to control music, TV, etc., because the voice (rather than the visual) affords the kind of interaction most suitable for people with visual impairment. EMCA studies have revealed that interactions with these systems cannot be understood as a sole activity but must be understood within the embedded context of everyday life, and that these conversational systems are not really conversational (Porcheron et al., 2018). However, such studies have not looked into the multimodal, spatial and embodied nature of interactions in and around the device, and have not dealt with atypical populations such as blind people.
Through ethnomethodological conversation analysis and video ethnography (Heath et al., 2010), this paper shows how problems with making the device understand commands are dealt with through repairing actions that are designed to adjust to the expected logic of the device. This research thus contributes to the discussions on the intentional stance (Dennett, 1989) by respecifying it as a socially accountable accomplishment.
Original language: English
Publication date: 17 Nov 2021
Publication status: Published - 17 Nov 2021
Event: NORDISCO: 6th Interdisciplinary Conference on Discourse and Interaction
Duration: 17 Nov 2021 – 19 Nov 2021


