Publications
All But Static: Exploring Dynamic Eye Dominance for Foveated Rendering
Foveated rendering is a promising technology in virtual and augmented reality (VR/AR), leveraging eye tracking to optimize computational load by rendering high-quality imagery only in the user’s gaze direction. Recent advancements have explored using eye dominance, a user’s subconscious preference for one eye, to enhance efficiency. So far, however, methods have treated eye dominance as a fixed trait, whereas evidence from vision science and psychology suggests it is dynamic and context-dependent. This paper reviews existing research on eye dominance behaviour and its variability, highlighting its implications for VR/AR rendering. Building on these insights, we propose refinements to foveated rendering that incorporate calibration and real-time prediction mechanisms to account for eye dominance variability. By embracing the dynamic nature of eye dominance, these advancements aim to optimize computational performance while maintaining a seamless and personalized user experience in VR/AR applications.
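To make the core idea concrete, here is a minimal sketch of dominance-aware foveated rendering. It is not the paper's implementation: the blending rule, parameter names, and default values are all illustrative assumptions. A dynamic dominance estimate (e.g., updated per frame by a real-time predictor) shifts the full-quality foveal region toward the dominant eye, shrinking it for the non-dominant eye to save shading work.

```python
# Illustrative sketch only: allocates foveated-rendering quality between the
# two eyes from a dynamic eye-dominance estimate. All names, defaults, and
# the linear blending rule are assumptions, not the method proposed here.
from dataclasses import dataclass

@dataclass
class FoveationParams:
    foveal_radius_deg: float   # radius of the full-quality region around gaze
    peripheral_scale: float    # resolution scale outside the foveal region

def foveation_for_eyes(dominance: float,
                       base_radius_deg: float = 15.0,
                       min_radius_deg: float = 5.0):
    """dominance in [0, 1]: 0.0 = fully left-dominant, 1.0 = fully right-dominant.

    The dominant eye keeps a larger full-quality foveal region; the
    non-dominant eye's region shrinks toward min_radius_deg.
    """
    dominance = min(max(dominance, 0.0), 1.0)
    span = base_radius_deg - min_radius_deg
    left = FoveationParams(min_radius_deg + span * (1.0 - dominance), 0.5)
    right = FoveationParams(min_radius_deg + span * dominance, 0.5)
    return left, right

# Example: a real-time predictor (e.g., driven by horizontal gaze angle)
# would update `dominance` every frame; here a right-leaning value is hard-coded.
left, right = foveation_for_eyes(dominance=0.8)
print(left, right)
```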
Advancing Eye Dominance Testing: Comparing Traditional Methods with an In-HMD Approach for AR/VR Applications
Eye dominance, the subconscious preference for one eye, is an increasingly critical factor in optimizing user experiences in virtual and augmented reality (AR/VR) applications. However, traditional methods for determining eye dominance often produce inconsistent results, creating challenges for calibration and the development of reliable eye-dominance-based technologies. To address this, we investigated eye dominance by comparing three traditional tests with a novel in-HMD test designed specifically for immersive AR/VR settings. Our findings revealed that while the traditional methods often produced inconsistent results, the in-HMD test demonstrated greater alignment with established methods. Based on these results, we provide recommendations for testing eye dominance in future research to improve reliability. Furthermore, the in-HMD test shows strong potential as a practical and valid alternative for AR/VR applications, offering a streamlined approach to calibration and advancing the development of eye-dominance-based technologies.
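As a small, hypothetical illustration of how agreement between dominance tests might be quantified (the paper's actual analysis may differ), one could compute Cohen's kappa over per-participant outcomes from two tests; the data below are invented for the example.

```python
# Hedged sketch: chance-corrected agreement between two eye-dominance tests.
# The participant outcomes are fabricated purely for illustration.
from collections import Counter

def cohens_kappa(a, b):
    """Cohen's kappa for two equal-length lists of labels ('L'/'R')."""
    assert len(a) == len(b)
    n = len(a)
    observed = sum(x == y for x, y in zip(a, b)) / n
    pa, pb = Counter(a), Counter(b)
    expected = sum((pa[k] / n) * (pb[k] / n) for k in set(a) | set(b))
    return (observed - expected) / (1 - expected)

# Hypothetical outcomes from a traditional test and an in-HMD test.
traditional = ['R', 'R', 'L', 'R', 'L', 'R', 'R', 'L', 'R', 'R']
in_hmd      = ['R', 'R', 'L', 'R', 'R', 'R', 'R', 'L', 'R', 'R']
print(f"kappa = {cohens_kappa(traditional, in_hmd):.2f}")  # -> kappa = 0.74
```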
Dynamics of Eye Dominance Behavior in Virtual Reality
Prior research has shown that sighting eye dominance is a dynamic behavior that depends on horizontal viewing angle. Virtual reality (VR) offers high flexibility and control for studying eye movement and human behavior, yet eye dominance has received little attention within this domain. In this work, we replicate Khan and Crawford’s (2001) original study in VR to confirm their findings within this specific context. Additionally, we extend its scope to examine alignment with objects presented at greater depth in the visual field. Our results align with the previous findings and remain consistent when targets are presented at greater distances in the virtual scene. Using greater target distances opens opportunities to investigate alignment with objects at varying depths, providing greater flexibility for the design of methods that infer eye dominance from interaction in VR.
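As an illustrative sketch (not the authors' exact method; positions and names are assumed), sighting dominance in such an alignment task can be inferred geometrically: the eye whose line of sight through the cursor points more directly at the target is taken as the sighting eye.

```python
# Hedged sketch: infer the sighting-dominant eye from a cursor-target
# alignment, using a generic angular-error heuristic.
import numpy as np

def sighting_dominant_eye(left_eye, right_eye, cursor, target):
    """Return 'left' or 'right' depending on which eye's ray through the
    cursor is better aligned with the target (smaller angular error).
    Inputs are 3D positions in one (e.g., head-tracked VR) coordinate frame."""
    def angle_deg(eye):
        to_cursor = cursor - eye
        to_target = target - eye
        cos = np.dot(to_cursor, to_target) / (
            np.linalg.norm(to_cursor) * np.linalg.norm(to_target))
        return np.degrees(np.arccos(np.clip(cos, -1.0, 1.0)))
    return 'left' if angle_deg(left_eye) < angle_deg(right_eye) else 'right'

# Example: the cursor lies on the right eye's line to a target 2 m ahead,
# so the right eye is inferred as the sighting-dominant eye.
left_eye  = np.array([-0.03, 0.0, 0.0])
right_eye = np.array([ 0.03, 0.0, 0.0])
target    = np.array([ 0.03, 0.0, 2.0])
cursor    = np.array([ 0.03, 0.0, 0.5])
print(sighting_dominant_eye(left_eye, right_eye, cursor, target))  # -> 'right'
```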
It’s Not Always the Same Eye That Dominates: Effects of Viewing Angle, Handedness and Eye Movement in 3D
Understanding eye dominance, the subconscious preference for one eye, has significant implications for 3D user interfaces in VR and AR, particularly in interface design and rendering. Although HCI recognizes eye dominance, little is known about what causes it to switch from one eye to the other. To explore this, we studied eye dominance in VR, where 28 participants manually aligned a cursor with a distant target across three tasks. We manipulated the horizontal viewing angle, the hand used for alignment, and eye movement induced by target behaviour. Our results confirm the dynamic nature of eye dominance, though with fewer switches than expected and varying influences across tasks. This highlights the need for adaptive HCI techniques that account for shifts in eye dominance in system design, such as gaze-based interaction, visual design, or rendering, which can improve accuracy, usability, and user experience.
Robotic Task Complexity and Collaborative Behavior of Children with ASD
Social interactions are essential in the everyday lives of humans. People with an autism spectrum disorder (ASD) exhibit deficits in social skills, making their day-to-day encounters more difficult. This paper reports on two small-scale studies investigating whether the use of collaborative robot tasks in an educational setting stimulates the collaborative behavior of children with ASD, and whether robotic task complexity affects that behavior. A total of 24 children participated in robotic tasks of varying complexity. The sessions were videotaped and analyzed, and the children’s supervisors completed questionnaires evaluating the social behavior of participants. Results demonstrate that the children collaborated during the robot activities. The influence of robotic task complexity on collaboration skills was not significant, possibly due to the small number of participants. These results show the promise of using robots in education for children with ASD, although further research is needed to investigate the implementation of robots in special education.