Virtual Reality (VR) technology lets users train for high-stakes situations in the safety of a virtual environment (VE). Yet user movement through such an environment can cause postural instability and motion sickness. These issues are often attributed to how the brain processes visual self-motion information in VEs. Low-contrast conditions, such as those caused by dense fog, are known to affect observers' self-motion perception, but it is not clear how posture, motion sickness, and navigation performance are affected by this kind of degradation of the visual environment. Ongoing work using VR focuses on three aspects of this problem. First, we verify in VR the effects of low contrast on visual speed estimates. Second, we test how contrast reduction affects postural control, motion sickness, and performance during a VR navigation task. Third, we examine whether it is useful to augment low-contrast conditions with high-contrast visual aids in the environment.
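The kind of contrast degradation referred to here can be pictured with the standard exponential attenuation of apparent contrast with viewing distance through a uniform fog. The sketch below is only illustrative; the extinction coefficient and the example values are assumptions, not parameters from this work.

```python
# Illustrative only: apparent contrast of a target seen through fog,
# using exponential attenuation of contrast with viewing distance.
# The extinction coefficient and distances here are assumed values.
import numpy as np

def apparent_contrast(intrinsic_contrast, distance_m, extinction_per_m=0.1):
    """Contrast remaining after attenuation through a uniform fog layer."""
    return intrinsic_contrast * np.exp(-extinction_per_m * distance_m)

# A high-contrast target (0.8) viewed 20 m away in moderate fog
print(apparent_contrast(0.8, 20.0))   # ~0.11
```

Denser fog (a larger extinction coefficient) or greater viewing distance pushes apparent contrast toward zero, which is the regime the navigation experiments described above probe.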
With the increasing adoption of mixed reality technology, it is crucial to identify and avoid displays that cause noxious effects in users, such as loss of balance or motion sickness. Towards this end, we examined the effects of sinusoidal modulations of viewpoint on standing posture. These modulations varied the position of the user's viewpoint in a virtual environment (VE) over time along either the left-right or the forwards-backwards direction, each with a chosen amplitude and temporal frequency. We measured the resulting change in posture at the frequency of visual stimulation, the so-called steady-state visually evoked posture response (SSVEPR), and used a signal-to-noise ratio (SNR) method to assess SSVEPR strength. These posture responses are well described by sigmoid functions of viewpoint-modulation amplitude, allowing one to estimate the lowest stimulus amplitude that generates a just-detectable posture response. Results suggest that the visuo-postural control system's sensitivity to viewpoint modulation increases with stimulus frequency. Results also suggest that there is a speed threshold for viewpoint movement that must be met or exceeded for a posture response to be produced. The results are similar for left-right and forwards-backwards modulations, and for conditions in which users either moved through the VE or remained stationary while the viewpoint was modulated. These results shed light on which features of visual self-motion stimuli drive postural responses.
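The abstract does not include the analysis code, but the SNR-at-the-stimulation-frequency measure and the sigmoid fit over modulation amplitude it describes could be sketched roughly as follows. The neighbor-bin noise estimate, the logistic parameterization, and the example amplitude and SNR values are illustrative assumptions rather than the authors' method or data.

```python
# Minimal sketch (not the authors' code) of an SSVEPR analysis:
# spectral SNR at the stimulation frequency, plus a sigmoid fit
# of SNR versus viewpoint-modulation amplitude.
import numpy as np
from scipy.optimize import curve_fit

def ssvepr_snr(sway, fs, stim_freq, n_neighbors=10):
    """SNR at the stimulation frequency: power in the stimulus bin
    divided by the mean power of nearby non-stimulus bins
    (the neighbor-bin noise estimate is an assumption)."""
    sway = np.asarray(sway, dtype=float)
    power = np.abs(np.fft.rfft(sway - sway.mean())) ** 2
    freqs = np.fft.rfftfreq(sway.size, d=1.0 / fs)
    k = int(np.argmin(np.abs(freqs - stim_freq)))   # stimulus-frequency bin
    lo, hi = max(k - n_neighbors, 1), min(k + n_neighbors + 1, power.size)
    noise = np.concatenate([power[lo:k], power[k + 1:hi]])
    return power[k] / noise.mean()

def sigmoid(amp, snr_max, amp50, slope):
    """Logistic function of modulation amplitude (illustrative form)."""
    return snr_max / (1.0 + np.exp(-(amp - amp50) / slope))

# Hypothetical example: fit SNR as a function of modulation amplitude.
amplitudes = np.array([0.5, 1.0, 2.0, 4.0, 8.0])   # e.g. cm (hypothetical)
snrs = np.array([1.0, 1.2, 1.9, 3.4, 4.1])          # hypothetical SNR values
params, _ = curve_fit(sigmoid, amplitudes, snrs, p0=[4.0, 2.0, 1.0])
snr_max, amp50, slope = params
```

Reading off the amplitude at which the fitted curve first exceeds a chosen SNR criterion gives one way to define the just-detectable posture response threshold mentioned above; the criterion itself would depend on the noise statistics of the recording.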