Exploring and Modeling Directional Effects on Steering Behavior in Virtual Reality

Detailed Bibliography
Title: Exploring and Modeling Directional Effects on Steering Behavior in Virtual Reality
Authors: Yushi Wei, Kemu Xu, Yue Li, Lingyun Yu, Hai-Ning Liang
Source: IEEE Transactions on Visualization and Computer Graphics, vol. 30, pp. 7107-7117
Publisher Information: Institute of Electrical and Electronics Engineers (IEEE), 2024.
Publication Year: 2024
Subjects: Predictive models, Mathematical models, Barehand interaction, Muscles, Solid modeling, Steering law, Performance evaluation, Three-dimensional displays, Computational modeling, Head-mounted display, Virtual reality, Human performance modeling
Description: Steering is a fundamental task in interactive Virtual Reality (VR) systems. Prior work has shown that movement direction can significantly influence user behavior in steering tasks, and that different interactive environments, such as tablets and PCs, can lead to different behavioral patterns. However, the directional effect in VR remains unexplored. Given the widespread use of steering tasks in virtual environments (VEs), including menu adjustment and object manipulation, this work seeks to understand and model the directional effect with a focus on barehand interaction, which is typical in VEs. This paper presents the results of two studies. The first study collected behavioral data in four categories: movement time, average movement speed, success rate, and number of re-entries. Based on these results, we examined the effect of movement direction and built the SθModel. We then empirically evaluated the model on the data collected in the first study. The proposed model achieved the best performance across all metrics (r² > 0.95), with more than a 15% improvement in prediction accuracy over the original Steering Law. We further validated the SθModel in a second study in which the device and steering direction were changed. Consistent with the earlier assessment, the model continued to perform best at predicting both movement time and speed. Finally, based on these results, we formulated design recommendations for steering tasks in VEs to enhance user experience and interaction efficiency. (See the note after this record for the classic Steering Law that the SθModel extends.)
Document Type: Article
ISSN: 2160-9306, 1077-2626
DOI: 10.1109/tvcg.2024.3456166
DOI: 10.1109/tvcg.2024.3456147
Access URL: https://pubmed.ncbi.nlm.nih.gov/39255122
Rights: IEEE Copyright
Accession Number: edsair.doi.dedup.....fdcf58b88bc7c803f68c6f456f5e57d0
Database: OpenAIRE
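
Note on the Steering Law: the SθModel described in the abstract extends the classic Steering Law of Accot and Zhai. The abstract does not give the SθModel's functional form, so the third equation below is only an illustrative sketch of how a direction-dependent term might enter; the first two are the standard straight-tunnel and integral forms of the Steering Law.

\[
MT = a + b\,\frac{A}{W} \qquad \text{(straight tunnel of length } A \text{ and width } W\text{)}
\]
\[
MT = a + b \int_{C} \frac{\mathrm{d}s}{W(s)} \qquad \text{(general path } C \text{ with varying width } W(s)\text{)}
\]
\[
MT(\theta) = a + b(\theta)\,\frac{A}{W} \qquad \text{(hypothetical directional variant; not necessarily the paper's S}\theta\text{Model)}
\]

Here MT is movement time and a, b are empirically fitted constants. In the hypothetical variant, the slope b(θ) is allowed to vary with movement direction θ, which is one simple way a directional effect of the kind reported in the abstract (r² > 0.95, more than a 15% accuracy gain over the original Steering Law) could be captured.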