Measuring interaction proxemics with wearable light tags
The proxemics of social interactions (e.g., body distance, relative orientation) influences many aspects of our everyday life: from patients’ reactions to interaction with physicians, successes in job interviews, to effective teamwork. Traditionally, interaction proxemics has been studied via questio...
| Main Authors: | Montanari, Alessandro; Tian, Zhao; Francu, Elena; Lucas, Benjamin; Jones, Brian; Zhou, Xia; Mascolo, Cecilia |
|---|---|
| Format: | Article |
| Language: | English |
| Published: | ACM 2018 |
| Subjects: | Human-centered computing→Ubiquitous and mobile computing systems and tools; Computer systems organization→Embedded systems; Face-to-face interactions; non-verbal behaviors; light sensing |
| Online Access: | https://eprints.nottingham.ac.uk/50762/ |
| _version_ | 1848798334283677696 |
|---|---|
| author | Montanari, Alessandro Tian, Zhao Francu, Elena Lucas, Benjamin Jones, Brian Zhou, Xia Mascolo, Cecilia |
| author_sort | Montanari, Alessandro |
| building | Nottingham Research Data Repository |
| collection | Online Access |
| description | The proxemics of social interactions (e.g., body distance, relative orientation) influences many aspects of our everyday life: from patients’ reactions to interaction with physicians, successes in job interviews, to effective teamwork. Traditionally, interaction proxemics has been studied via questionnaires and participant observations, imposing high burden on users, low scalability and precision, and often biases. In this paper we present Protractor, a novel wearable technology for measuring interaction proxemics as part of non-verbal behavior cues with fine granularity. Protractor employs near-infrared light to monitor both the distance and relative body orientation of interacting users. We leverage the characteristics of near-infrared light (i.e., line-of-sight propagation) to accurately and reliably identify interactions; a pair of collocated photodiodes aid the inference of relative interaction angle and distance. We achieve robustness against temporary blockage of the light channel (e.g., by the user’s hand or clothes) by designing sensor fusion algorithms that exploit inertial sensors to obviate the absence of light tracking results. We fabricated Protractor tags and conducted real-world experiments. Results show its accuracy in tracking body distances and relative angles. The framework achieves less than 6° error 95% of the time for measuring relative body orientation and 2.3-cm – 4.9-cm mean error in estimating interaction distance. We deployed Protractor tags to track users’ non-verbal behaviors when conducting collaborative group tasks. Results with 64 participants show that distance and angle data from Protractor tags can help assess an individual’s task role with 84.9% accuracy, and identify the task timeline with 93.2% accuracy. |
| first_indexed | 2025-11-14T20:18:07Z |
| format | Article |
| id | nottingham-50762 |
| institution | University of Nottingham Malaysia Campus |
| institution_category | Local University |
| language | English |
| last_indexed | 2025-11-14T20:18:07Z |
| publishDate | 2018 |
| publisher | ACM |
| recordtype | eprints |
| repository_type | Digital Repository |
| spelling | nottingham-507622018-03-28T12:24:50Z https://eprints.nottingham.ac.uk/50762/ Measuring interaction proxemics with wearable light tags Montanari, Alessandro Tian, Zhao Francu, Elena Lucas, Benjamin Jones, Brian Zhou, Xia Mascolo, Cecilia The proxemics of social interactions (e.g., body distance, relative orientation) influences many aspects of our everyday life: from patients’ reactions to interaction with physicians, successes in job interviews, to effective teamwork. Traditionally, interaction proxemics has been studied via questionnaires and participant observations, imposing high burden on users, low scalability and precision, and often biases. In this paper we present Protractor, a novel wearable technology for measuring interaction proxemics as part of non-verbal behavior cues with fine granularity. Protractor employs near-infrared light to monitor both the distance and relative body orientation of interacting users. We leverage the characteristics of near-infrared light (i.e., line-of-sight propagation) to accurately and reliably identify interactions; a pair of collocated photodiodes aid the inference of relative interaction angle and distance. We achieve robustness against temporary blockage of the light channel (e.g., by the user’s hand or clothes) by designing sensor fusion algorithms that exploit inertial sensors to obviate the absence of light tracking results. We fabricated Protractor tags and conducted real-world experiments. Results show its accuracy in tracking body distances and relative angles. The framework achieves less than 6° error 95% of the time for measuring relative body orientation and 2.3-cm – 4.9-cm mean error in estimating interaction distance. We deployed Protractor tags to track users’ non-verbal behaviors when conducting collaborative group tasks. Results with 64 participants show that distance and angle data from Protractor tags can help assess an individual’s task role with 84.9% accuracy, and identify the task timeline with 93.2% accuracy. ACM 2018-03-31 Article PeerReviewed application/pdf en https://eprints.nottingham.ac.uk/50762/1/14387.pdf Montanari, Alessandro, Tian, Zhao, Francu, Elena, Lucas, Benjamin, Jones, Brian, Zhou, Xia and Mascolo, Cecilia (2018) Measuring interaction proxemics with wearable light tags. Proceedings of the ACM on Interactive, Mobile, Wearable and Ubiquitous Technologies, 2 (1). pp. 1-30. ISSN 2474-9567 Human-centered computing→Ubiquitous and mobile computing systems and tools; Computer systems organization→Embedded systems; Face-to-face interactions; non-verbal behaviors; light sensing https://dl.acm.org/citation.cfm?doid=3200905.3191757 doi:10.1145/3191757 |
| title | Measuring interaction proxemics with wearable light tags |
| topic | Human-centered computing→Ubiquitous and mobile computing systems and tools; Computer systems organization→Embedded systems; Face-to-face interactions; non-verbal behaviors; light sensing |
| url | https://eprints.nottingham.ac.uk/50762/ |
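The abstract above notes that a pair of collocated photodiodes aids the inference of relative interaction angle and distance, with inertial sensors covering temporary blockage of the light channel. The sketch below is only an illustrative toy of that idea, not the method in the paper: it assumes an idealised Lambertian angular response for each photodiode, a known calibration constant `P`, a hypothetical ±15° mounting offset, and made-up function names.

```python
# Illustrative sketch only -- NOT the Protractor implementation described in the paper.
# Assumptions: each photodiode has a Lambertian (cosine) angular response, the two
# photodiodes are mounted at +/-ANGULAR_OFFSET around the tag's forward axis, and the
# emitter power / receiver gains are lumped into a single known constant P.
import math

ANGULAR_OFFSET = math.radians(15.0)  # assumed mounting offset of each photodiode
P = 1.0                              # assumed, pre-calibrated emitter/receiver constant

def estimate_angle_and_distance(i1, i2):
    """Toy inference of relative angle (rad) and distance (m) from two collocated
    photodiode readings, assuming I_k = (P / d^2) * cos(theta -/+ ANGULAR_OFFSET)."""
    if i1 <= 0 or i2 <= 0:
        return None  # light channel blocked or peer outside the field of view
    r = i1 / i2
    # Solving cos(theta - off) / cos(theta + off) = r for theta gives:
    theta = math.atan(((r - 1.0) / (r + 1.0)) / math.tan(ANGULAR_OFFSET))
    # Summed intensity: i1 + i2 = (P / d^2) * 2 * cos(theta) * cos(ANGULAR_OFFSET)
    d = math.sqrt(2.0 * P * math.cos(theta) * math.cos(ANGULAR_OFFSET) / (i1 + i2))
    return theta, d

def fuse_with_gyro(light_estimate, last_theta, gyro_z, dt):
    """Toy stand-in for the sensor-fusion fallback: when the light channel is blocked,
    propagate the last known angle with the yaw-rate gyroscope reading."""
    if light_estimate is not None:
        return light_estimate[0]
    return last_theta + gyro_z * dt

if __name__ == "__main__":
    # Simulate a peer tag 1.2 m away at a 20-degree relative angle.
    true_theta, true_d = math.radians(20.0), 1.2
    i1 = P / true_d**2 * math.cos(true_theta - ANGULAR_OFFSET)
    i2 = P / true_d**2 * math.cos(true_theta + ANGULAR_OFFSET)
    theta, d = estimate_angle_and_distance(i1, i2)
    print(math.degrees(theta), d)  # recovers roughly 20 degrees and 1.2 m
```

Under these simplified assumptions the ratio of the two readings fixes the angle and the summed intensity fixes the distance; the paper's actual channel model, calibration, and fusion algorithms differ and should be consulted via the DOI above.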