Intentional stance as observable order: when blind people adapt to an AI (VUI-SPEAK: There is Nothing Conversational about “Conversational User Interfaces”): 2. Atypical Interaction

Research output: Contribution to conference › Conference abstract for conference › Research › peer-review

Standard

Intentional stance as observable order: when blind people adapt to an AI (VUI-SPEAK: There is Nothing Conversational about “Conversational User Interfaces”): 2. Atypical Interaction. / Due, Brian Lystgaard.

2021. Abstract from NORDISCO.


Harvard

Due, BL 2021, 'Intentional stance as observable order: when blind people adapt to an AI (VUI-SPEAK: There is Nothing Conversational about “Conversational User Interfaces”): 2. Atypical Interaction', NORDISCO, 17/11/2021 - 19/11/2021.

APA

Due, B. L. (2021). Intentional stance as observable order: when blind people adapt to an AI (VUI-SPEAK: There is Nothing Conversational about “Conversational User Interfaces”): 2. Atypical Interaction. Abstract from NORDISCO.

Vancouver

Due BL. Intentional stance as observable order: when blind people adapt to an AI (VUI-SPEAK: There is Nothing Conversational about “Conversational User Interfaces”): 2. Atypical Interaction. 2021. Abstract from NORDISCO.

Author

Due, Brian Lystgaard. / Intentional stance as observable order: when blind people adapt to an AI (VUI-SPEAK: There is Nothing Conversational about “Conversational User Interfaces”): 2. Atypical Interaction. Abstract from NORDISCO.

Bibtex

@conference{0cb8e05c60b846aba1734c870559fcc6,
title = "Intentional stance as observable order: when blind people adapt to an AI (VUI-SPEAK: There is Nothing Conversational about “Conversational User Interfaces”): 2. Atypical Interaction",
abstract = "Intentional stance as observable order: when blind people adapt to an AI. 2. Atypical Interaction. Blind people are increasingly using systems like Google Home to control music, TV, etc., because the voice (instead of the visual) affords the kind of interaction most suitable for people with visual impairment. EMCA studies have revealed that interactions with these systems cannot be understood as a sole activity but must be understood within the embedded context of everyday life, and that these conversational systems are not really being conversational (Porcheron et al., 2018). However, such studies have not looked into the multimodal, spatial and embodied nature of interactions in and around the device, and have not dealt with atypical populations such as blind people. Through ethnomethodological conversation analysis and video ethnography (Heath et al., 2010), this paper shows how problems with making the device understand commands are dealt with through repairing actions that are designed to adjust to the expected logic of the device. This research thus contributes to the discussions on the intentional stance (Dennett, 1989) by respecifying it as a socially accountable accomplishment. Dennett, D. C. (1989). The Intentional Stance (Reprint edition). A Bradford Book. Heath, C., Hindmarsh, J., & Luff, P. (2010). Video in Qualitative Research. SAGE Publications Ltd. Porcheron, M., Fischer, J. E., Reeves, S., & Sharples, S. (2018). Voice Interfaces in Everyday Life. CHI {\textquoteright}18: Proceedings of the 2018 CHI Conference on Human Factors in Computing Systems, Paper no. 640, 1–12.",
author = "Due, {Brian Lystgaard}",
year = "2021",
month = nov,
day = "17",
language = "English",
note = "Conference date: 17-11-2021 Through 19-11-2021",
url = "https://www.nordisco2021.se/program-2/",

}

RIS

TY - ABST

T1 - Intentional stance as observable order: when blind people adapt to an AI (VUI-SPEAK: There is Nothing Conversational about “Conversational User Interfaces”)

AU - Due, Brian Lystgaard

PY - 2021/11/17

Y1 - 2021/11/17

N2 - Intentional stance as observable order: when blind people adapt to an AI. 2. Atypical Interaction. Blind people are increasingly using systems like Google Home to control music, TV, etc., because the voice (instead of the visual) affords the kind of interaction most suitable for people with visual impairment. EMCA studies have revealed that interactions with these systems cannot be understood as a sole activity but must be understood within the embedded context of everyday life, and that these conversational systems are not really being conversational (Porcheron et al., 2018). However, such studies have not looked into the multimodal, spatial and embodied nature of interactions in and around the device, and have not dealt with atypical populations such as blind people. Through ethnomethodological conversation analysis and video ethnography (Heath et al., 2010), this paper shows how problems with making the device understand commands are dealt with through repairing actions that are designed to adjust to the expected logic of the device. This research thus contributes to the discussions on the intentional stance (Dennett, 1989) by respecifying it as a socially accountable accomplishment. Dennett, D. C. (1989). The Intentional Stance (Reprint edition). A Bradford Book. Heath, C., Hindmarsh, J., & Luff, P. (2010). Video in Qualitative Research. SAGE Publications Ltd. Porcheron, M., Fischer, J. E., Reeves, S., & Sharples, S. (2018). Voice Interfaces in Everyday Life. CHI ’18: Proceedings of the 2018 CHI Conference on Human Factors in Computing Systems, Paper no. 640, 1–12.

AB - Intentional stance as observable order: when blind people adapt to an AI. 2. Atypical Interaction. Blind people are increasingly using systems like Google Home to control music, TV, etc., because the voice (instead of the visual) affords the kind of interaction most suitable for people with visual impairment. EMCA studies have revealed that interactions with these systems cannot be understood as a sole activity but must be understood within the embedded context of everyday life, and that these conversational systems are not really being conversational (Porcheron et al., 2018). However, such studies have not looked into the multimodal, spatial and embodied nature of interactions in and around the device, and have not dealt with atypical populations such as blind people. Through ethnomethodological conversation analysis and video ethnography (Heath et al., 2010), this paper shows how problems with making the device understand commands are dealt with through repairing actions that are designed to adjust to the expected logic of the device. This research thus contributes to the discussions on the intentional stance (Dennett, 1989) by respecifying it as a socially accountable accomplishment. Dennett, D. C. (1989). The Intentional Stance (Reprint edition). A Bradford Book. Heath, C., Hindmarsh, J., & Luff, P. (2010). Video in Qualitative Research. SAGE Publications Ltd. Porcheron, M., Fischer, J. E., Reeves, S., & Sharples, S. (2018). Voice Interfaces in Everyday Life. CHI ’18: Proceedings of the 2018 CHI Conference on Human Factors in Computing Systems, Paper no. 640, 1–12.

UR - https://www.nordisco2021.se/wp-content/uploads/sites/77/2021/11/Book-of-Abstracts-Nordisco-2021.pdf

M3 - Conference abstract for conference

Y2 - 17 November 2021 through 19 November 2021

ER -

ID: 284901415