EyeT Guys

Team Name

EyeT Guys

Timeline

Spring 2019 – Summer 2019

Students

  • Jonathan El-Khoury
  • George Hinkel
  • Dharampreet Gill
  • Jay Denton
  • Paul Wafula

Sponsor

Marcie Gibson, the Gibson family, Chris McMurrough

Abstract

Many of the current eye-tracking products created for ALS patients do not work well for everybody. Our goal is to create one that will. Our inspiration is local UTA alumna Marcie Gibson, who was diagnosed with ALS the year she graduated college. Our primary objective is to create a product that works well for her before her current, outdated machine gives out. Once the product is refined for Marcie, we would like to bring it to anybody else who finds themselves in a similar predicament with old or unreliable eye-tracking technology.

To achieve our vision, we will design a convenient Eye Tracking Remote that replaces the old, heavy eye-tracking headset. A more reliable pupil-tracking system will be implemented on an open-source operating system so that the system's longevity does not depend on any single vendor. Eye-command features will also be added to diversify the system's functionality. Furthermore, to be compatible with an ALS patient's wheelchair, the Eye Tracking Remote will be attached to an additional screen running the EyeType software and powered by an external battery. Our main goal is to create a safety-first product that will last the user's lifetime.

Background

Imagine being trapped in your own body, unable to move a muscle or even talk; all you can do is look around. This may remind some of you of sleep paralysis, but thousands of patients with ALS and other locked-in diagnoses live this reality every day. However, technology offers these people hope of communicating and living a full life. For decades now, computers, digital cameras, and image processing techniques have enabled patients with locked-in syndrome to communicate by typing, tracking their eyes as they look at a digital keyboard.

The problem is that the market for medical eye tracking and eye typing is very small, limited to patients with various locked-in diagnoses. As a result, many of the older, smaller manufacturers of eye-typing systems have been bought out by bigger companies, leaving fewer choices for the end consumer of these medical products.

Marcie is a UTA alumna and the product owner. She has been living with ALS for years and needs an eye-tracking and eye-typing system to communicate. She has used the same system for years because it was reliable and she was skilled at using it; however, the manufacturer was bought out and support for the device was dropped, so she needs a new system as her current one decays without software or hardware support. She tried many of the newest versions of these machines but found them all less effective than the old one, which led her to ask the UTA Computer Science and Engineering department for help; the department assigned two Senior Design teams to the task. Our team, EyeTGuys, was tasked with creating the hardware eye-tracking system, consisting of remote cameras and a machine to run the image processing. Our sister team, EyeType, was tasked with creating the accessibility tool and keyboard app extension.

Project Requirements

  1. Eye Tracking
  2. Eye tracking performance: low latency
  3. Eye tracking performance: high throughput
  4. Camera
  5. Daylight usability
  6. Interchangeable long-lasting battery
  7. Portability
  8. Non-interference with ventilator

System Overview

Our system consists of two interacting modules: the EyeTGuys module, which our team is responsible for, and the EyeType module, which the EyeType team is responsible for. Overall, the system will function as a medical-grade eye-tracking and eye-typing accessibility communication device.

The EyeTGuys module will consist of six hardware and four software components. The hardware includes the microcontroller/machine that performs the computational workload and its power supply, two infrared-filtered cameras, one infrared light, one character LCD display, and a physical casing and mount structure. The software includes an eye tracker, an eye-gesture recognizer, a gaze-coordinate converter, and a state machine with a calibration routine.

The cameras will be pointed at and focused on the user's individual eyes, while the infrared light will be pointed at the face to illuminate the eye images without shining a visible light in the user's eyes. The cameras will continuously stream real-time video to the microcontroller, which performs real-time image processing in the eye-tracking program, using the eye images to estimate a "gaze vector".

If the microcontroller is in the calibrated state, it feeds the gaze-vector data into the gaze-coordinate converter, which, based on data from a previous calibration session, estimates the screen coordinate corresponding to the observed gaze vector and streams that coordinate to the other module for use in the eye-typing program. The machine also always streams the gaze-vector data to the eye-gesture recognizer, which uses patterns of relative changes in the gaze vector to enable basic control functionality, such as ordering a calibration or calling for assistance, without needing accurate calibration data. Finally, when in the uncalibrated state, or when an outside event triggers a calibration order, the machine enters a calibration routine that receives screen-coordinate data from the other module as it displays a calibration screen and correlates those coordinates with the system's observed gaze vectors.
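As a rough illustration of the eye-tracking step described above, the pipeline's first task is locating the pupil in each IR camera frame. The sketch below is a minimal, hypothetical approximation using only NumPy (a real implementation would use OpenCV contour or ellipse fitting, which the project references): under IR illumination the pupil is the darkest region of the eye image, so thresholding dark pixels and taking their centroid gives a first estimate of the pupil center. The threshold value and image sizes here are illustrative assumptions, not project specifications.

```python
import numpy as np

def pupil_center(gray, threshold=40):
    """Estimate the pupil center as the centroid of dark pixels.

    gray: 2-D uint8 grayscale eye image under IR illumination.
    Returns (x, y) in pixel coordinates, or None if no dark
    region is found. A production tracker would refine this
    with contour/ellipse fitting (e.g. OpenCV) and glint removal.
    """
    ys, xs = np.nonzero(gray < threshold)
    if len(xs) == 0:
        return None  # no pupil candidate in this frame
    return float(xs.mean()), float(ys.mean())

# Synthetic 100x100 eye image: bright background, dark "pupil" blob.
img = np.full((100, 100), 200, dtype=np.uint8)
img[40:60, 30:50] = 10  # dark pupil region
print(pupil_center(img))  # → (39.5, 49.5)
```

The per-eye pupil centers from the two cameras would then feed the gaze-vector estimate that the rest of the pipeline consumes.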

The EyeType module will be designed and constructed by the EyeType team, but in its most basic form it will include a computer or microcontroller capable of running a consumer OS, and a screen/display that acts as the display for that computer while also being the screen onto which the EyeTGuys module's returned gaze coordinates are mapped.

Results

Future Work


Project Files

Project Charter (link)

System Requirements Specification (link)

Architectural Design Specification (link)

Detailed Design Specification (link)

Poster (link)

References

OpenCV, OpenFace, GitHub
