Reducing Simulator Sickness with Perceptual Camera Control

Ping Hu1     Qi Sun2     Piotr Didyk3     Li-Yi Wei2     Arie Kaufman1

1Stony Brook University    2Adobe Research    3Università della Svizzera italiana


Virtual reality provides an immersive environment but can induce cybersickness due to the discrepancy between visual and vestibular cues. To avoid this problem, the movement of the virtual camera needs to match the motion of the user in the real world. Unfortunately, this is usually difficult due to the mismatch between the size of the virtual environments and the space available to the users in the physical domain. The resulting constraints on the camera movement significantly hamper the adoption of virtual-reality headsets in many scenarios and make the design of the virtual environments very challenging. In this work, we study how the characteristics of the virtual camera movement (e.g., translational acceleration and rotational velocity) and the composition of the virtual environment (e.g., scene depth) contribute to perceived discomfort. Based on the results from our user experiments, we devise a computational model for predicting the magnitude of the discomfort for a given scene and camera trajectory. We further apply our model to a new path planning method that optimizes the input motion trajectory to reduce perceptual sickness. We evaluate the effectiveness of our method in improving perceptual comfort in a series of user studies targeting different applications. The results indicate that our method can reduce the perceived discomfort while maintaining the fidelity of the original navigation, and that it performs better than simpler alternatives.
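To make the pipeline described above concrete, the sketch below illustrates one plausible shape of such a system: a per-trajectory discomfort score driven by translational acceleration, rotational velocity, and inverse scene depth, followed by a trajectory optimizer. This is a hypothetical simplification for illustration only; the function names, the cost form, and the use of Laplacian smoothing in place of the paper's actual model-based optimization are all assumptions, not the authors' method.

```python
import numpy as np

def discomfort_score(positions, yaw, depth, dt=1 / 90,
                     w_acc=1.0, w_rot=1.0):
    """Hypothetical discomfort metric (NOT the paper's fitted model).

    Sums per-frame translational acceleration and yaw velocity,
    weighted by inverse scene depth -- the assumption being that
    nearby geometry produces stronger vection and thus more discomfort.
    """
    vel = np.diff(positions, axis=0) / dt            # frame-to-frame velocity
    acc = np.linalg.norm(np.diff(vel, axis=0), axis=1) / dt  # accel magnitude
    rot = np.abs(np.diff(yaw)) / dt                  # yaw speed per frame
    inv_depth = 1.0 / np.maximum(depth[2:], 1e-3)    # align with acc frames
    return float(np.sum((w_acc * acc + w_rot * rot[1:]) * inv_depth))

def smooth_trajectory(positions, iters=50, alpha=0.3):
    """Stand-in optimizer: Laplacian smoothing of the camera path.

    Pulls each interior sample toward the midpoint of its neighbors,
    damping acceleration spikes while keeping the endpoints fixed,
    which preserves where the navigation starts and ends.
    """
    p = positions.copy()
    for _ in range(iters):
        p[1:-1] += alpha * (0.5 * (p[:-2] + p[2:]) - p[1:-1])
    return p
```

A real system would minimize the fitted perceptual model directly (e.g., with a constrained optimizer) rather than smoothing geometrically, but the structure is the same: score a candidate path, then adjust it to lower the score while staying close to the original navigation intent.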



We would like to thank Xiaojun Bi and Suwen Zhu for discussions on the experiment analysis, and the anonymous reviewers for their valuable suggestions. This project is partially supported by National Science Foundation grants NRT1633299 and CNS1650499, and a gift from Adobe. This project has also received funding from the European Research Council (ERC) under the European Union’s Horizon 2020 research and innovation program (grant agreement N° 804226 – PERDY).