
Eye Gaze Guided Interface Code (interface code, real time eye movement classification algorithms)

This distribution contains the code for an eye gaze guided photo viewing application (iGaze).

The code can be used as a test bed for creating any type of eye-gaze-guided interface on Windows, to test usability-related ideas such as widget layout, size, and visual feedback. The implemented functionality is not limited to photo browsing and can be extended to other examples of computer use where the primary or an auxiliary input is provided by eye movements.

iGaze contains implementations of two real-time eye movement classification algorithms: Velocity Threshold Identification (I-VT) and Kalman Filter Identification (I-KF). Both algorithms are implemented under the umbrella of the Real Time Eye Movement Identification (REMI) protocol. The real-time classification capabilities implemented in iGaze make it possible to test research ideas related to the accuracy, stability, and jitter of eye-gaze input.
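For orientation, I-VT is the simpler of the two algorithms: it computes the point-to-point angular velocity of the gaze signal and labels intervals above a velocity threshold as saccades, the rest as fixations. The sketch below illustrates that idea only; the function name, units, and the 30 deg/s default threshold are illustrative assumptions, not the exact parameters used in iGaze.

```python
def ivt_classify(timestamps, x_deg, y_deg, velocity_threshold=30.0):
    """Label each inter-sample interval as 'fixation' or 'saccade' (I-VT sketch).

    timestamps -- sample times in seconds
    x_deg, y_deg -- gaze position in degrees of visual angle
    velocity_threshold -- deg/s; intervals moving faster are saccades
    """
    labels = []
    for i in range(1, len(timestamps)):
        dt = timestamps[i] - timestamps[i - 1]
        if dt <= 0:
            # Guard against duplicate or out-of-order timestamps:
            # carry the previous label forward.
            labels.append(labels[-1] if labels else "fixation")
            continue
        dx = x_deg[i] - x_deg[i - 1]
        dy = y_deg[i] - y_deg[i - 1]
        # Angular speed between consecutive samples, in deg/s.
        velocity = (dx * dx + dy * dy) ** 0.5 / dt
        labels.append("saccade" if velocity > velocity_threshold else "fixation")
    return labels
```

I-KF instead runs a Kalman filter over the gaze signal and classifies samples by comparing predicted and observed eye velocity; see the EICS 2009 paper below for that algorithm and for a performance comparison of the two.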

The release of the iGaze code summarizes several years of work done at the HCI Lab at Texas State University. This release covers the work presented in the following papers:

1) Calibration Accuracy and Resulting Interface Component Placement

O. V. Komogortsev, C. Holland, and J. Camou, Adaptive eye-gaze-guided interfaces: design and performance evaluation, In Proceedings of the ACM Conference on Human Factors in Computing Systems (CHI), 2011, pp. 1255-1260. [link]

2) Real Time Eye Movement Identification (REMI) protocol to facilitate real time eye movement classification

D. H. Koh, M. Gowda, and O. V. Komogortsev, Real Time Eye Movement Identification Protocol, In Proceedings of ACM Conference on Human Factors in Computing Systems (CHI), Atlanta, GA, 2010, pp. 1-6. [.pdf]

3) Implementation and performance testing of real time eye movement classification algorithms. General recommendations on the interface component size, layout, and feedback.

D. H. Koh, S. A. M. Gowda, and O. V. Komogortsev, Input evaluation of an Eye-Gaze-Guided interface: Kalman filter vs. Velocity Threshold Eye Movement Identification, In Proceedings of ACM Symposium on Engineering Interactive Computing Systems, Pittsburgh, PA, USA, 2009, pp. 197-202. [.pdf]

We have decided to make the software available to the eye-tracking research community free of charge. If you use this software in your research, please reference one of the papers above, depending on which aspect of the implementation you use.

Two types of eye trackers were employed with this software: a) any non-wearable Tobii eye tracker (the code was tested with the x120 system), and b) the non-commercial, web-camera-based ITU Gaze Tracker, an eye tracker with an open-source implementation. The executable for GazeTracker 2.0 beta is included with the distribution for convenience and with permission from the ITU group. Other types of eye trackers can be connected to iGaze, but doing so requires modifying the code.

Download the iGaze source code here. By downloading the software, you agree to the copyright notice at the bottom of this page.

The compressed files are password protected. Please email Dr. Oleg Komogortsev for the password, indicating your university or industry affiliation and briefly describing how you plan to use the software. Please use the words "iGaze software" in the subject line.

Acknowledgment: special thanks are expressed to Mr. Corey Holland for the optimizations, bug fixes, and GUI included in the software. Currently this project is funded in part by NSF CAREER award #CNS-1250718, in part by grant #60NANB12D234 from the National Institute of Standards and Technology, and by funds from Texas State University. In the past this project was funded in part by grant #60NANB10D213 from the National Institute of Standards and Technology.

COPYRIGHT NOTICE STARTS HERE--------------------------------------------------------------

Copyright © 2014 The Texas State University
All rights reserved.

Permission is hereby granted, without written agreement and without license or royalty fees, to use, copy, modify, and distribute this software and its documentation for any purpose, provided that the copyright notice in its entirety appear in all copies of this software, and the original source of this software, Human Computer Interaction Laboratory at the Texas State University, is acknowledged in any publication that reports research using this software. The software is to be cited in the bibliography as:

    O. V. Komogortsev, C. Holland, and J. Camou, Adaptive Eye-gaze-guided Interfaces: Design and Performance Evaluation, In Proceedings of the ACM Conference on Human Factors in Computing Systems (CHI), Vancouver, BC, Canada, 2011, pp. 1-6.

or

    D. H. Koh, M. Gowda, and O. V. Komogortsev, Real Time Eye Movement Identification Protocol, In Proceedings of ACM Conference on Human Factors in Computing Systems (CHI), Atlanta, GA, 2010, pp. 1-6.

or

    D. H. Koh, S. A. M. Gowda, and O. V. Komogortsev, Input evaluation of an Eye-Gaze-Guided interface: Kalman filter vs. Velocity Threshold Eye Movement Identification, In Proceedings of ACM Symposium on Engineering Interactive Computing Systems, Pittsburgh, PA, USA, 2009, pp. 197-202.

IN NO EVENT SHALL THE TEXAS STATE UNIVERSITY BE LIABLE TO ANY PARTY FOR DIRECT, INDIRECT, SPECIAL, INCIDENTAL, OR CONSEQUENTIAL DAMAGES ARISING OUT OF THE USE OF THIS SOFTWARE AND ITS DOCUMENTATION, EVEN IF THE TEXAS STATE UNIVERSITY HAS BEEN ADVISED OF THE POSSIBILITY OF SUCH DAMAGE.

THE TEXAS STATE UNIVERSITY SPECIFICALLY DISCLAIMS ANY WARRANTIES, INCLUDING, BUT NOT LIMITED TO, THE IMPLIED WARRANTIES OF MERCHANTABILITY AND FITNESS FOR A PARTICULAR PURPOSE. THE SOFTWARE IS PROVIDED ON AN "AS IS" BASIS, AND THE TEXAS STATE UNIVERSITY HAS NO OBLIGATION TO PROVIDE MAINTENANCE, SUPPORT, UPDATES, ENHANCEMENTS, OR MODIFICATIONS.

COPYRIGHT NOTICE ENDS HERE-----------------------------------------------------------------