Eye-Tracking Data Acquisition System
Rochester Institute of Technology
Department of Computer Engineering
Senior Design Project - Winter & Spring 2009-2010
Piyush Agarwal, Robert Laiacona, Zachary Harvey, Lowren Lawson
Our goal is to expand the functional use of the eye-tracking head gear used by the RIT Imaging Sciences department. The head gear consists of a pair of glasses fitted with two cameras and an IR LED. One camera points at the subject and captures eye movement; the other faces away from the subject, capturing the scene from the subject's perspective. In the existing system, post-processing software maps the eye position to a particular location in the scene. The reflection of the IR LED and the position of the subject's pupil serve as reference points that assist in tracking eye movement. After the subject's focal point in the scene is located, the two video streams are merged side by side into a single video that is compressed and stored to disk.

The intent of this project is to eliminate the need for post-processing. By tracking eye movement with a new, efficient algorithm and capturing the associated eye and scene videos in real time, the time the existing system spends in post-processing can be eliminated.
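As a rough illustration of how the pupil and IR-LED reflection can serve as reference points, the sketch below computes a pupil-glint difference vector in eye-camera coordinates and maps it to a scene-camera position with a simple per-axis calibration. The function names, the affine calibration form, and all coordinate values are assumptions for illustration, not the project's actual algorithm.

```python
# Hypothetical sketch of pupil-glint gaze mapping; names and the
# first-order calibration form are illustrative assumptions.

def gaze_vector(pupil, glint):
    """Pupil-glint difference vector. Using the difference (rather than
    the raw pupil position) makes the estimate more robust to small
    head-gear slippage, since both features shift together."""
    return (pupil[0] - glint[0], pupil[1] - glint[1])

def map_to_scene(vec, coeffs_x, coeffs_y):
    """Map an eye-camera vector (dx, dy) to scene-camera pixels using a
    per-axis first-order calibration fit obtained beforehand:
        x_scene = a0 + a1*dx + a2*dy   (likewise for y)."""
    dx, dy = vec
    x = coeffs_x[0] + coeffs_x[1] * dx + coeffs_x[2] * dy
    y = coeffs_y[0] + coeffs_y[1] * dx + coeffs_y[2] * dy
    return (x, y)

# Example with made-up calibration coefficients centered at (320, 240):
vec = gaze_vector((410.0, 305.0), (400.0, 300.0))          # (10.0, 5.0)
point = map_to_scene(vec, (320.0, 8.0, 0.0), (240.0, 0.0, 8.0))
```

In a real-time pipeline, a fit like this would be recomputed from a short calibration sequence (the subject fixates on known scene points), after which each frame's mapped point can be overlaid on the scene video as it is captured.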