Smartphone tooling: Achieving perception by positioning a smartphone for object scanning

Publication: Contribution to book/anthology/report › Contribution to book/anthology › Research › peer-reviewed

People have been using tools for thousands of years. These practices of “tooling” have been described as having a “mechanical effect” on an object (e.g., chopping wood). In this chapter we propose that tooling may also have an “informational effect”. To make this argument we explore how visually impaired people (VIPs) carry out physical shopping in grocery stores using their smartphones and the SeeingAI application (app). Using a smartphone for scanning means using it as a tool, hence the chapter title “smartphone tooling”. The data consist of a collection of cases in which a VIP uses the smartphone and app to scan products, and the app then provides audible information. The chapter is based on video ethnographic methodology and ethnomethodological multimodal conversation analysis. The chapter contributes to studies of tools and object-centred sequences by showing how VIPs achieve perception of relevant object information in and through a practice we suggest calling “positioning for object scanning”. This practice is configured by three distinct actions: (1) aligning, (2) adjusting and (3) inspecting. Studying the practices of VIPs enables us to establish new understandings of how spatial relations between body, object and technology are accomplished in situ, without visual perception. This research contributes to EM/CA studies of perception as practical action, visual impairment and object-centred sequences.

Original language: English
Title: People, Technology, and Social Organization: Interactionist Studies of Everyday Life
Place of publication: Abingdon, Oxon
Publisher: Routledge
Publication date: 1 Jan. 2023
Pages: 250-273
ISBN (Print): 9781032230689
ISBN (Electronic): 9781000967074
DOI
Status: Published - 1 Jan. 2023

Bibliographic note

Publisher Copyright:
© 2024 selection and editorial matter, Dirk vom Lehn, Will Gibson and Natalia Ruiz-Junco; individual chapters, the contributors.
