Co-Optimizing Human-System Performance in XR

Qi Sun



Date: October 12, 2022


Extended Reality (XR) enables unprecedented possibilities for displaying virtual content, sensing physical surroundings, and tracking human behaviors with high fidelity. However, we have yet to create "superhumans" who outperform their physical-reality selves, or a "perfect" XR system that delivers infinite battery life or fully realistic sensation. In this talk, I will discuss some of our recent research on leveraging eye/muscular sensing and learning to model our perception, reaction, and sensation in virtual environments. Based on this knowledge, we create just-in-time visual content that jointly optimizes human performance (such as reaction speed to events) and system performance in XR.

Further Information:

Qi Sun is an assistant professor at New York University. Before joining NYU, he was a research scientist at Adobe Research. He received his PhD from Stony Brook University. His research interests lie in computer graphics, VR/AR, computational cognition, and human-computer interaction. He is a recipient of the IEEE Virtual Reality Best Dissertation Award, as well as an ACM SIGGRAPH Best Paper Award.

Created: Saturday, October 15th, 2022