Taking a binocular view of augmented reality system design

Emily Cooper

(University of California, Berkeley)



Date: December 7, 2022

Description:

Augmented reality (AR) systems aim to enhance our view of the world and make us see things that are not actually there. But building an AR system that integrates effectively with our natural visual experience is hard. AR systems often suffer from technical and visual limitations, such as small eyeboxes and narrow visual field coverage. An integral part of AR system development, therefore, is perceptual research that improves our understanding of when and why these limitations matter. I will describe the results of perceptual studies designed to provide guidance on how to optimize the limited visual field coverage supported by many AR systems. Our analysis highlights the idiosyncrasies of how our natural binocular visual field is formed, the complexities of quantifying visual field coverage for binocular AR systems, and the trade-offs that are necessary when an AR system can only augment a subregion of the visual field.

Further Information:

Emily Cooper is an Assistant Professor of Optometry and Vision Science at the University of California, Berkeley. Her lab’s research examines the mechanisms and phenomenology of human visual perception, with a particular emphasis on perception of three-dimensional (3D) space. In addition to developing insights into basic 3D vision, her lab works to apply these scientific insights to make perceptually meaningful improvements to augmented reality systems.




Created: Sunday, December 11th, 2022