Measuring “reality” in virtual reality

You are on a Zoom call when the audio suddenly falls behind the video. Your partner’s lips move out of sync with their words, like a badly dubbed film, a minor inconvenience. In scientific experiments that use virtual reality (VR), however, this kind of lag is a serious problem.

For example, if you are a researcher using VR to reproduce the cognitive benefits of exercise for bedridden patients, lag in the visual and auditory stimuli can sabotage your data and results.

Thus, as the use of VR in human behavior studies grows, so does the need for visual and auditory stimuli to be presented with millisecond accuracy and precision.

A Tohoku University research team has measured the accuracy and precision of visual and auditory stimuli presented in modern VR head-mounted displays (HMDs) using the Python programming language.

Details of their research were published in the journal Behavior Research Methods on August 3, 2021.

“Most standard methods in laboratory studies are not optimized for virtual reality environments,” said Ryo Tachibana, co-author of the paper. “Instead of specialized software that allows for greater experimental control, most VR studies use Unity or Unreal Engine, which are 3D game engines.”

Establishing more appropriate VR environments, in which researchers have the flexibility to control and adjust stimuli according to their experiments, would yield more reliable results.

Tachibana and Kazumichi Matsumiya took advantage of the latest Python tools for virtual reality experiments, together with a dedicated device for measuring stimulus synchronization known as the Black Box ToolKit.
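For illustration, here is a minimal sketch of the software side of such a measurement, assuming PsychoPy as the Python toolkit and a simple flat window rather than the specific HMD setup used in the study. The script flashes a white square and logs the software-reported onset time at each flip; those timestamps can then be compared against the physical onsets recorded by a photodiode device such as the Black Box ToolKit.

```python
# Hedged sketch: log software-side visual onsets with PsychoPy (assumed toolkit)
# for later comparison with hardware-measured onsets (e.g., a photodiode taped
# to the display, as with the Black Box ToolKit).
from psychopy import core, visual

clock = core.Clock()
win = visual.Window(size=(800, 600), fullscr=False, units="pix", color="black")
square = visual.Rect(win, width=200, height=200, fillColor="white")

software_onsets = []  # timestamps the software reports for each stimulus onset

for trial in range(20):
    for _ in range(30):          # blank inter-trial interval (~0.5 s at 60 Hz)
        win.flip()
    square.draw()                # draw the probe for the upcoming frame
    win.callOnFlip(lambda: software_onsets.append(clock.getTime()))
    win.flip()                   # the callback runs right after this flip

win.close()
# Subtracting each software onset from the matching photodiode onset gives the
# per-trial display latency; its mean is the lag and its spread is the jitter.
```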

Experimental setup for evaluating the accuracy and precision of the VR HMDs. © Ryo Tachibana

They recorded an 18-millisecond (ms) lag for visual stimuli in the modern VR HMDs. For auditory stimuli, they observed delays of between 40 and 60 ms, depending on the HMD. Jitter, the standard deviation of the delay, was about 1 ms for visual stimuli and 4 ms for auditory stimuli. All results were consistent across Python 2 and Python 3 environments.
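To make the relationship between lag and jitter concrete (an illustration with made-up numbers, not the authors’ analysis code): the lag is the mean difference between hardware-measured and software-reported onsets, and the jitter is the standard deviation of that difference.

```python
import statistics

# Hypothetical onset times in seconds: what the software reported vs. what an
# external device (photodiode or microphone) actually measured at the display.
software_onsets = [1.000, 2.000, 3.000, 4.000, 5.000]
hardware_onsets = [1.018, 2.019, 3.017, 4.018, 5.018]

delays = [hw - sw for sw, hw in zip(software_onsets, hardware_onsets)]

lag_ms = statistics.mean(delays) * 1000      # mean delay, here roughly 18 ms
jitter_ms = statistics.stdev(delays) * 1000  # standard deviation of the delay

print(f"lag: {lag_ms:.1f} ms, jitter: {jitter_ms:.2f} ms")
```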

Tachibana notes that, to date, there has been virtually no empirical data assessing the accuracy and precision of VR environments, even though their adoption in behavioral research has proliferated.

“We believe our study benefits researchers and developers who apply VR technology, as well as studies on rehabilitation tools that require high temporal accuracy to record biological data,” Tachibana added.

Publication details:

Title: Accuracy and precision of visual and auditory stimulus presentation in virtual reality in Python 2 and 3 environments for human behavior research

Authors: Ryo Tachibana and Kazumichi Matsumiya

Journal: Behavior Research Methods

DOI: 10.3758/s13428-021-01663-w
