Abstract
Virtual Reality (VR) allows users to perform natural movements such as hand movements, head turns, and natural walking in virtual environments. While such movements enable seamless natural interaction, they require a large tracking space, particularly in the case of walking. To optimize use of the available physical space, prediction models for upcoming behavior are helpful. In this study, we examined whether a user's eye movements, tracked by current VR hardware, can improve such predictions. Eighteen participants walked through a virtual environment while performing different tasks, including walking in curved paths, avoiding or approaching objects, and conducting a search. The recorded position, orientation, and eye-tracking features from 2.5 s segments of the data were used to train an LSTM model to predict the user's position 2.5 s into the future. We found that future positions could be predicted with an average error of 65 cm. The benefit of eye movement data depended on the task and environment: in particular, situations with changes in walking speed benefited from the inclusion of eye data. We conclude that a model utilizing eye tracking data can improve VR applications in which path predictions are helpful.
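To make the prediction setup concrete, the following is a minimal sketch of the kind of model the abstract describes: an LSTM that maps a 2.5 s window of position, orientation, and eye-tracking features to the user's position 2.5 s later. Only the 2.5 s window and the prediction target come from the abstract; the framework (PyTorch), sample rate, feature layout, and network size are assumptions for illustration, not the paper's actual configuration.

```python
# Hypothetical sketch, not the paper's implementation. The 2.5 s input
# window and the "position 2.5 s ahead" target follow the abstract;
# sample rate, feature count, and hidden size are assumed values.
import torch
import torch.nn as nn

SAMPLE_RATE_HZ = 30                    # assumed tracking rate
WINDOW = int(2.5 * SAMPLE_RATE_HZ)     # 2.5 s segment -> 75 frames
N_FEATURES = 10                        # e.g. 3D position, head orientation, gaze direction

class LocomotionLSTM(nn.Module):
    def __init__(self, n_features=N_FEATURES, hidden=64):
        super().__init__()
        self.lstm = nn.LSTM(n_features, hidden, batch_first=True)
        self.head = nn.Linear(hidden, 2)   # predicted (x, z) floor position

    def forward(self, x):
        # x: (batch, WINDOW, n_features)
        _, (h_n, _) = self.lstm(x)
        return self.head(h_n[-1])          # position 2.5 s into the future

model = LocomotionLSTM()
segment = torch.randn(8, WINDOW, N_FEATURES)   # a batch of 2.5 s segments
future_xy = model(segment)                      # shape: (8, 2)
```

Training such a model against the recorded ground-truth positions (e.g. with a Euclidean or MSE loss) would yield the kind of average positional error the abstract reports; whether eye features help can be tested by ablating the gaze columns from the input.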
Title record
- Title: Eye Tracking-based LSTM for Locomotion Prediction in VR
- Author:
- Published:
- Note: Funding organisation: Deutsche Forschungsgemeinschaft / Project number: 274361309; Funding organisation: European Commission / Project number: 951910
- Language: English
- Bibliographic reference: 2022 IEEE Conference on Virtual Reality and 3D User Interfaces (VR), 2022, 493-503
- Document type: Article in a journal
- Keywords (EN):
- URN:
- DOI:
Access restriction
- The document is freely available