Wearable Eye Tracking

Peter Milford


Date: April 29, 2015


Eye interaction technology can be applied to wearable computing systems, ranging from augmented reality to virtual reality to information display devices. Applications of eye interaction include eye tracking, user interface control, iris recognition, foveated rendering, biometric data capture, and many others. Researchers have been working on eye tracking since the 1800s, progressing from simple observation to photography, to direct-contact and electrical methods, to today's camera-based methods. I will outline eye tracking in general, with a focus on wearable eye tracking and its applications.
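To give a flavor of the camera-based methods mentioned above, here is a minimal sketch (an illustrative toy, not Eyefluence's method) of one common first step: estimating the pupil center from a grayscale eye image. The pupil is typically the darkest region, so thresholding dark pixels and taking their centroid gives a rough center. The function name, threshold value, and synthetic image are all assumptions for illustration.

```python
import numpy as np

def pupil_center(gray, threshold=50):
    """Estimate the pupil center as the centroid of dark pixels.

    Toy illustration of camera-based eye tracking: the pupil is
    usually the darkest region of an eye image, so thresholding
    and averaging the dark-pixel coordinates gives a rough center.
    The threshold of 50 is an arbitrary illustrative choice.
    """
    mask = gray < threshold          # dark pixels ~ pupil
    ys, xs = np.nonzero(mask)
    if xs.size == 0:
        return None                  # no dark region found
    return float(xs.mean()), float(ys.mean())

# Synthetic eye image: bright background with a dark disk ("pupil")
# centered at (x=90, y=60) with radius 15.
h, w = 120, 160
yy, xx = np.mgrid[0:h, 0:w]
img = np.full((h, w), 200, dtype=np.uint8)
img[(xx - 90) ** 2 + (yy - 60) ** 2 < 15 ** 2] = 20

cx, cy = pupil_center(img)
print(round(cx), round(cy))
```

Real systems add considerable machinery on top of this (infrared illumination, corneal glint detection, ellipse fitting, and per-user calibration to map pupil position to gaze direction), but the centroid idea above is a common starting point.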

Further Information:

Peter received his Ph.D. in astrophysics at the University of Queensland, Brisbane, Australia. He worked for 5 years at Stanford on a satellite-based solar observing experiment – observing a ‘large spherical object’. He left Stanford to join a startup developing a three-degree-of-freedom magnetic tracker, with applications in virtual reality head tracking. He went on to start his consulting company, working with a variety of Silicon Valley firms, mainly in the consumer electronics industry, bringing a practical physics approach to embedded sensors, imaging, calibration, factory test, algorithms, etc. Peter has been working with Eyefluence Inc. since its founding and is CTO/VP Engineering, overseeing a strong multi-disciplinary team developing wearable eye interaction technology. He now looks at ‘small spherical’ objects. Eyefluence Inc.’s goal is to transform intent into action through your eyes. Eyefluence is developing a variety of eye interaction technologies for upcoming wearable display systems, including eye tracking, iris recognition, and user interfaces for control of HMDs.

Created: Thursday, April 30th, 2015