Robodog. Exploring the Spot robot as a “guide dog” for visually impaired people: Vestibular and proprioceptive sensations as sociality

Research output: Contribution to conference › Conference abstract for conference › Research

The argument in this paper is that the practice of coordinating walking together is based on balance as a semiotic resource. It shows how agency, perception and navigation are distributed and how the dog harness establishes a "mediated haptic sociality". Balance is a resource (vestibular and proprioceptive sensation). Spatial organization and walking have been studied through territories, proxemics, F-formations and focused encounters (Hall, 1966; Kendon, 1976, 1990; Scheflen, 1976; Mondada, 2009); walking and mobility (Lee & Watson, 1993; Liberman, 2019; Psathas, 1976; Ryave & Schenkein, 1974; McIlvenny, Broth & Haddington, 2009); and spatial arrangements with robots, though not legged robots (Alač et al., 2011; Gehle et al., 2017; Yamazaki et al., 2019; Yousuf et al., 2019; Due, 2021; Boudouraki et al., 2021). It has been stated that "the robot's embodiment becomes the resources for projection" (Yamazaki et al., 2019). But what if you are blind? Haptics becomes the resource.

In this paper I study assistive robotics: four-legged robots designed for responsive collaboration. A cobot, or collaborative robot, is a robot intended for direct human–robot interaction within a shared space, or where humans and robots are in close proximity (International Federation of Robotics, IFR). Specifically, I study Spot, a walking, legged robot.

We know that people use sensory resources for perception as practical action: multisensoriality, the employment of several sensory resources for action construction as observable action (Mondada, 2019); distributed perception, multimodal perception-related practical actions provided by other agents (Due, 2021); and touch in interaction, as intercorporeality (Meyer, Streeck & Jordan, 2017) and haptic sociality (M. H. Goodwin, 2017). Bodily sensation and the sense of movement and balance are observable phenomena. The vestibular system is a sensory system in the inner ear used for coordinating movement with balance; it is closely related to eye movement and visual perception.
Proprioception (kinesthesia) is a sensory system in the muscles, tendons and joints: the sense of self-movement and of the relative position of neighbouring parts of the body.
The analysis shows how the harness mediates a haptic sociality between the robot and the participant: pace and direction are observable in the position of the arms (stretched/bent). The robot is pulling (unlike the dog, which is also steered), with little reciprocity and cooperation. The instructor is part of the triadic assemblage (VIP + robodog + operator) by controlling and talking: action and agency are distributed across the phenomenal field, and accountability is distributed across the assemblage. The short white cane is used for additional tactile feedback as part of the emerging multimodal and multisensorial gestalt: cane tactility, feet tactility, verbal instructions, mediated haptic feedback from the harness, and body adjustments from vestibular/proprioceptive sensations form one emerging gestalt. Being pulled sideways, away from the trajectory, is accounted for with a response cry by the VIP and treated with accounts and adjustments by the operator: imbalance is accountable and dispreferred. Vestibular and proprioceptive sensations are not "just" motor functions but an observable part of human sociality. Adjusting movement and balancing is observable and accountable action produced in sequential environments.

This paper contributes to:
Perception as practical and accountable action built within a distributed phenomenal field
Agency as embedded in emerging, evolving/dissolving assemblages
Assemblages as participants' merging of heterogeneous materials and semiotic elements into accountable collections
Haptic sociality as a mediated and distributed form, possible with non-human agents
Vestibular and proprioceptive sensing as observable practical action and a vital element of human sociality

Due, B. L. (2021). Interspecies intercorporeality and mediated haptic sociality: Distributing perception with a guide dog. Visual Studies, 0(0), 1–14.
Due, B. L., & Lange, S. B. (2018). Semiotic resources for navigation: A video ethnographic study of blind people’s uses of the white cane and a guide dog for navigating in urban areas. Semiotica, 2018(222), 287–312.
Goode, D. (2007). Playing with My Dog Katie: An Ethnomethodological Study of Dog-human Interaction. Purdue University Press.
Haraway, D. (2003). The Companion Species Manifesto: Dogs, People, and Significant Otherness (M. Begelke, Ed.). Prickly Paradigm Press.
Mondémé, C. (2011). Dog-human sociality as mutual orientation. IIEMCA 2011.
Mondémé, C. (2013). Formes d'interactions sociales entre hommes et chiens: Une approche praxéologique des relations interspécifiques [Doctoral dissertation, École Normale Supérieure de Lyon, Laboratoire ICAR (UMR 5191)].
Mondémé, C. (2017, May). Moving as an interspecies unit: Accounting for im-mobility. MOBSIN workshop (Telecom ParisTech).
Mondémé, C. (2020a). Touching and petting: Exploring “haptic sociality” in interspecies interaction. In A. Cekaite & L. Mondada (Eds.), Touch in Social Interaction. Routledge.
Mondémé, C. (2020b). La socialité interspécifique: Une analyse multimodale des interactions homme-chien. Lambert-Lucas.

Hall, E. T. (1966). The Hidden Dimension. Anchor.
Kendon, A. (1976). The F-Formation System: The Spatial Organization of Social Encounters. Man-Environment Systems, 6, 291–296.
Scheflen, A. E. (1976). Human Territories: How We Behave in Space-Time. Prentice-Hall.
Kendon, A. (1990). Conducting Interaction: Patterns of Behavior in Focused Encounters. Cambridge University Press.
Mondada, L. (2009). Emergent focused interactions in public places: A systematic analysis of the multimodal achievement of a common interactional space. Journal of Pragmatics, 41(10), 1977–1997.
Lee, J. R. E., & Watson, R. (1993). Interaction in public space: Final report to the plan urbain. France: Plan Urbain.
Liberman, K. (2019). A study at 30th street. Language & Communication, 65, 92–104.
McIlvenny, P., Broth, M., & Haddington, P. (2009). Communicating place, space and mobility. Journal of Pragmatics, 41(10), 1879–1886.
Psathas, G. (1976). Mobility, Orientation, and Navigation: Conceptual and Theoretical Considerations. New Outlook for the Blind, 70(9), 385–391.
Ryave, L. A., & Schenkein, J. N. (1974). Notes on the Art of Walking. In R. Turner (Ed.), Ethnomethodology (pp. 265–278). Penguin Books.

Alač, M., Movellan, J., & Tanaka, F. (2011). When a robot is social: Spatial arrangements and multimodal semiotic engagement in the practice of social robotics. Social Studies of Science, 41(6), 893–926.
Gehle, R., Pitsch, K., Dankert, T., & Wrede, S. (2017). How to Open an Interaction Between Robot and Museum Visitor?: Strategies to Establish a Focused Encounter in HRI. Proceedings of the 2017 ACM/IEEE International Conference on Human-Robot Interaction, 187–195.
Yamazaki, A., Yamazaki, K., Arano, Y., Saito, Y., Iiyama, E., Fukuda, H., Kobayashi, Y., & Kuno, Y. (2019). Interacting with Wheelchair Mounted Navigator Robot.
Yousuf, M. A., Kobayashi, Y., Kuno, Y., Yamazaki, K., & Yamazaki, A. (2019). Social interaction with visitors: Mobile guide robots capable of offering a museum tour. IEEJ Transactions on Electrical and Electronic Engineering, 14(12), 1823–1835.
Due, B. L. (2021). RoboDoc: Semiotic resources for achieving face-to-screenface formation with a telepresence robot. Semiotica, 238, 253–278.
Boudouraki, A., Fischer, J. E., Reeves, S., & Rintel, S. (2021). “I can’t get round”: Recruiting Assistance in Mobile Robotic Telepresence. Proceedings of the ACM on Human-Computer Interaction, 4(CSCW3), 248:1-248:21.
Original language: English
Publication date: 2021
Publication status: Published – 2021
Event: 6th Copenhagen Multimodality Day 2021: AI in interaction – UCPH, Copenhagen, Denmark
Duration: 1 Oct 2021 → …

