Achieving joint perception of an object from multisensory resources: Visually impaired person’s tactile explorations in the context of instructor’s verbal descriptions.

Research output: Contribution to conference › Conference abstract for conference › Research › peer-review

Atypical interactional sequences may arise when visually impaired people (VIP) interact with sighted people. In this paper we explore a particular type of instructional sequence that is ubiquitous when VIP interact with ICT consultants about new technological aids while familiarizing themselves with them. The paper examines members' orientation towards a Google Home speaker and a pair of Envision smart glasses. The VIP's basic questions are: How does it work? What is its material form? What are its functionalities? Familiarizing with the device involves, as we will show in this presentation, instructional sequences in which the consultant produces verbal descriptions and the VIP responds with embodied explorations. Based on EMCA and video recordings (Heath et al., 2010; Mondada, 2019), the paper shows how participants co-construct an observable understanding of the object's material and functional features through the joint achievement of perception from different sensory resources. We show how participants monitor each other and shift between different sequential organizations: the instructor producing verbal descriptions of a specific feature and the VIP producing tactile explorations of the technology in response, or vice versa. The sequences differ, yet are alike with regard to the organization of adjacency pairs: there is a conditional relevance (Schegloff, 1968) between the consultant's verbal descriptions and the VIP's tactile explorations. We thus show a profound social order in which the participants jointly achieve perception of the object (cf. Due, 2021). We discuss how the intertwined nature of the sensory resources and the creative building on each other's distributed perception is vital for accomplishing the activity and thus for establishing the possibility of social inclusion in mundane activities of daily living.

Keywords: AI technologies; visually impaired person; instructional sequences; verbal descriptions; embodied explorations; perception

Due, B. L. (2021). Distributed Perception: Co-Operation between Sense-Able, Actionable, and Accountable Semiotic Agents. Symbolic Interaction, 44(1), 134–162. https://doi.org/10.1002/symb.538
Heath, C., Hindmarsh, J., & Luff, P. (2010). Video in Qualitative Research. SAGE Publications Ltd.
Mondada, L. (2019). Contemporary issues in conversation analysis: Embodiment and materiality, multimodality and multisensoriality in social interaction. Journal of Pragmatics, 145, 47–62. https://doi.org/10.1016/j.pragma.2019.01.016
Schegloff, E. A. (1968). Sequencing in Conversational Openings. American Anthropologist, 70(6), 1075–1095.

Original language: English
Publication date: 2022
Publication status: Published - 2022
Event: Atypical Interaction Conference 2022, Newcastle University, Newcastle, United Kingdom
Duration: 27 Jun 2022 – 29 Jun 2022
https://conferences.ncl.ac.uk/aic2022/

