Getting Around in Cyber-Space: The Perception of Space in Virtual Reality†
With the broader availability of low-cost VR devices, the diversity of applications as well as of user groups is growing. At the same time, AR and VR applications are leaving the (professional) lab environment and entering domains where fewer assumptions can be made about the technical setup. In this project we address the particular problem that space restrictions in the real environment risk completely breaking the illusion of virtual reality if the dimensions of the virtual environment do not match. More concretely, for the perception of space in virtual environments it is essential to be able to move and walk around freely. If the real environment (e.g. the trackable area in an office) does not provide enough space, the illusion breaks down as soon as the user hits a (real) obstacle.
Our methodological approach to overcoming this problem exploits the fact that a user wearing a head-mounted display can only see the virtual environment but senses the real environment with their body. Mismatches between real body movements and visual responses on the display are unnoticeable if they are small and the user's attention is reduced (e.g. because of head or eye movement). Hence our goal is to track the user and to rate the current level of distraction. Then, if reduced attention is detected, we can slightly shift the virtual environment relative to the real one in order to reduce the probability of hitting an obstacle. In the first phase, we will restrict ourselves to passive measures such as head and eye tracking. In the second phase we will explore active methods, such as adding obstacles to the virtual environment or letting virtual characters cross the user's path to create distraction.
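The attention-gated shifting described above can be sketched as follows. This is a minimal one-dimensional illustration under stated assumptions, not the project's implementation: the distraction heuristic, the threshold, and the per-step shift limit are all hypothetical placeholders for whatever the head- and eye-tracking pipeline would actually provide.

```python
import math
from dataclasses import dataclass


@dataclass
class RedirectionController:
    """Attention-gated redirection sketch: shift the virtual origin only
    while estimated distraction is high, and keep each shift small so it
    stays below the perceptual threshold. All tuning values are
    hypothetical."""
    distraction_threshold: float = 0.7   # hypothetical gating threshold
    max_shift_per_step: float = 0.005    # metres per update, kept small
    virtual_offset: float = 0.0          # current offset of virtual vs. real space

    def estimate_distraction(self, head_speed: float, eye_speed: float) -> float:
        # Toy heuristic: fast head/eye movement -> reduced visual attention.
        # Maps combined speed to a score in [0, 1).
        return 1.0 - math.exp(-(head_speed + eye_speed))

    def update(self, head_speed: float, eye_speed: float,
               desired_total_shift: float) -> float:
        """Advance one tracking frame; return the new virtual offset."""
        if self.estimate_distraction(head_speed, eye_speed) < self.distraction_threshold:
            return self.virtual_offset  # user attentive: do not shift
        # Move a small, clamped step toward the total shift that would
        # steer the user away from the real obstacle.
        remaining = desired_total_shift - self.virtual_offset
        step = max(-self.max_shift_per_step,
                   min(self.max_shift_per_step, remaining))
        self.virtual_offset += step
        return self.virtual_offset
```

For example, with the user standing still (no head or eye movement) the offset stays frozen, while a fast head turn lets the controller accumulate a few millimetres of shift per frame toward the desired correction.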