Abstract
We describe a novel dynamic method for collaborative virtual environments designed for mobile devices and evaluated in a mobile context. Participants interacted remotely in pairs, through touch while walking, under three different feedback conditions: 1) visual, 2) audio-tactile, and 3) spatial audio-tactile. Results showed that the visual baseline system provided higher shared awareness, greater efficiency, and a strong learning effect. However, although very challenging, the eyes-free systems, particularly the spatial audio one, still allowed participants to build joint awareness in remote collaborative environments. These results help us better understand the potential of different feedback mechanisms in the design of future mobile collaborative environments.
Original language | English |
---|---|
Title of host publication | MobileHCI '11 - Proceedings of the 13th International Conference on Human-Computer Interaction with Mobile Devices and Services |
Publisher | ACM Press |
Pages | 499-502 |
Number of pages | 4 |
ISBN (Print) | 9781450305419 |
DOIs | |
Publication status | Published - 27 Oct 2011 |
Event | 13th International Conference on Human-Computer Interaction with Mobile Devices and Services, Mobile HCI 2011 - Stockholm, Sweden. Duration: 30 Aug 2011 → 2 Sept 2011 |
Conference
Conference | 13th International Conference on Human-Computer Interaction with Mobile Devices and Services, Mobile HCI 2011 |
---|---|
Country/Territory | Sweden |
City | Stockholm |
Period | 30/08/11 → 2/09/11 |
Keywords
- collaborative virtual environments
- mobile shared interaction
- social presence
- spatial audio
- tactile