Researchers from NVIDIA, Adobe, and Stony Brook University developed a system that allows VR users to explore large virtual worlds within small physical spaces while avoiding walls, furniture, and other players.
The research, first presented at the GPU Technology Conference (GTC) in San Jose earlier this year and featured in an NVIDIA blog, works by tracking rapid eye movements called saccades.
The authors summarize the main contributions as "an end-to-end redirected walking system based on saccadic suppression, effective for consumer room-scale VR; and a real-time path planning algorithm that automatically avoids static and dynamic obstacles by responding to individuals' eye movements: our optimization links user behavior and physical changes, considers possibilities of the near future through real-time sampling, and finds the best numerical solution for online camera manipulation."
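The core idea, rotating the virtual camera slightly only while a saccade briefly suppresses the user's visual perception, can be sketched in a few lines. This is a minimal illustration, not the authors' implementation: the function names, the angular-velocity threshold, and the per-frame rotation cap are hypothetical placeholder values, and a real system would also run the path-planning optimization the paper describes.

```python
import math

# Illustrative thresholds, not values from the paper.
SACCADE_VEL_THRESHOLD = 180.0   # deg/s; gaze speeds above this suggest a saccade
MAX_REDIRECT_PER_FRAME = 0.14   # deg; small enough to stay imperceptible

def angular_velocity(prev_gaze, curr_gaze, dt):
    """Angular speed (deg/s) between two unit gaze-direction vectors."""
    dot = max(-1.0, min(1.0, sum(a * b for a, b in zip(prev_gaze, curr_gaze))))
    return math.degrees(math.acos(dot)) / dt

def redirect_step(prev_gaze, curr_gaze, dt, target_rotation_deg):
    """Yaw offset (deg) to inject this frame.

    Rotation is applied only while a saccade is detected, so the user
    does not consciously perceive the camera manipulation.
    """
    if angular_velocity(prev_gaze, curr_gaze, dt) > SACCADE_VEL_THRESHOLD:
        magnitude = min(abs(target_rotation_deg), MAX_REDIRECT_PER_FRAME)
        return math.copysign(magnitude, target_rotation_deg)
    return 0.0  # eyes are fixating; any rotation would be noticeable
```

Accumulated over many saccades, these small offsets steer the user's physical path away from obstacles while the virtual path appears straight.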
Since GTC, the team has published a new video that describes the method, and today they released a research paper with more details about the technique.
The system can be used in VR interactive applications, games, and experiences with any VR-capable GPU.
The paper will be presented at SIGGRAPH 2018 in Vancouver, British Columbia.