Natural Locomotion Interfaces – With a Little Bit of Magic!
DOI: https://doi.org/10.5753/jis.2011.588

Abstract
The mission of the Immersive Media Group (IMG) is to develop virtual locomotion user interfaces that allow humans to experience arbitrary 3D environments by means of the natural walking metaphor. Traveling through immersive virtual environments (IVEs) by real walking is an important means of increasing the naturalness of virtual reality (VR)-based interaction. However, the size of the virtual world often differs from the size of the tracked lab space, so a straightforward implementation of omni-directional, unlimited walking is not possible. Redirected walking is one concept to address this issue: the user is inconspicuously guided along a physical path that may differ from the path she perceives in the virtual world. For example, intentionally rotating the virtual camera to one side causes the user to unknowingly compensate by walking along a circular arc in the opposite direction.
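
To make the compensation effect concrete, the following minimal Python sketch (all names and the gain value are our own assumptions for exposition, not code from the paper) injects a small yaw offset into the virtual camera in proportion to the distance walked; a user holding a straight virtual course then physically walks along a circular arc in the opposite direction:

    import math

    # Illustrative per-frame redirected-walking update (hypothetical names/values).
    CURVATURE_GAIN = math.radians(3.0)  # injected rotation per meter walked (assumed)

    def redirect_camera_yaw(virtual_yaw, physical_step, physical_yaw_delta):
        """Map one frame of tracked head motion to the virtual camera yaw.

        virtual_yaw        -- current virtual camera yaw (radians)
        physical_step      -- (dx, dy) head translation this frame (meters)
        physical_yaw_delta -- head rotation this frame (radians)
        """
        walked = math.hypot(physical_step[0], physical_step[1])
        # The user's own rotation passes through unchanged; the injected
        # term is kept small, so the user unknowingly compensates by
        # curving in the opposite physical direction.
        return virtual_yaw + physical_yaw_delta + CURVATURE_GAIN * walked
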
In the scope of the LOCUI project, which is funded by the German Research Foundation, we analyze how gains applied to locomotor speed, turns, and curvature can gradually alter the physical trajectory with respect to the path perceived in the virtual world, without the user observing any discrepancy. Users can thus be guided away from collisions with physical obstacles (e.g., lab walls), or they can be guided to arbitrary locations in the physical space. For example, if the user approaches a virtual object, she can be guided to a real proxy prop that is registered to and aligned with its virtual counterpart. Hence, the user can interact with a virtual object by touching the corresponding real-world proxy prop, which provides haptic feedback. Based on the results of psychophysical experiments, we plan to develop such a user interface; it then becomes possible to intuitively interact with any virtual object by touching registered real-world props.
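
The three kinds of gains mentioned above are commonly expressed as ratios between virtual and physical motion. The Python sketch below illustrates that bookkeeping under stated assumptions (the class, function, and numeric values are hypothetical, not thresholds measured in the LOCUI experiments):

    from dataclasses import dataclass

    @dataclass
    class RedirectionGains:
        translation: float = 1.0  # virtual distance / physical distance (speed gain)
        rotation: float = 1.0     # virtual turn angle / physical turn angle
        curvature: float = 0.0    # extra virtual rotation per meter walked (rad/m)

    def apply(gains: RedirectionGains, phys_dist: float, phys_yaw: float):
        """Scale one frame of physical motion into virtual motion.

        Returns (virtual_distance, virtual_yaw_delta). Keeping each gain
        within its perceptual detection threshold lets the physical
        trajectory drift away from the virtual one unnoticed.
        """
        virt_dist = gains.translation * phys_dist
        virt_yaw = gains.rotation * phys_yaw + gains.curvature * phys_dist
        return virt_dist, virt_yaw

    # Example: steer the user onto a gentle physical arc while the virtual
    # path stays straight (assumed gain values, 2 cm step per frame).
    gains = RedirectionGains(translation=1.1, rotation=1.0, curvature=0.05)
    print(apply(gains, phys_dist=0.02, phys_yaw=0.0))
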