Team Name
EyeType
Timeline
Spring 2019 – Summer 2019
Students
- Matthew Crum
- Marcus Maldonado
- Raghad Safauldeen
- Shaili Balampaki
- Tonytam Dinh
Sponsor
UTA/Gibson family
Abstract
Many of the current eye tracking products created for ALS patients do not work well for everybody. Our goal is to create one that will. Our inspiration is local UTA alum Marcie Gibson, who was diagnosed with ALS the year she graduated from college. Our primary objective is to build a product that works well for her before her current, outdated machine gives out. Once the product is refined for Marcie, we would like to bring it to anybody else in a similar predicament with old or unreliable eye tracking technology.
To achieve this vision, we will create eye tracking software that is more modern, user friendly, and reliable than the options currently on the market. Our GUI software will be innovative, easy to use, and highly customizable. It will include features that allow the user to communicate, surf the internet, use applications, control his or her environment, and calibrate the machine with simple eye gestures. A heavy focus for us is building a consumer-first product that will last the user’s lifetime.
Background
Marcie Gibson graduated from UTA in 1994 and was diagnosed with ALS soon after. Now in her 40s, she is in the worst stages of ALS and has lost all motor function except in her eyes. Despite these circumstances, Marcie is very sharp and works as a paralegal for a lawyer.
She currently communicates with a remote eye tracking system called Erica, built by Eye Response Technologies. Erica works by finding her pupils, computing a gaze vector, and mapping that vector to coordinates on a monitor. This lets her move a mouse cursor over an on-screen keyboard and select letters by dwelling on them for a set amount of time. Once she completes her message, the computer converts the text to speech. Marcie can communicate fairly quickly since she is very agile with her eyes and uses a short dwell time.
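The dwell-selection idea described above can be sketched in a few lines. This is a minimal illustration, not Erica's actual implementation; the class name, method names, and the default timing threshold are all our own assumptions:

```python
import time


class DwellSelector:
    """Selects a key once the gaze has rested on it for `dwell_time` seconds."""

    def __init__(self, dwell_time=0.8):
        self.dwell_time = dwell_time  # illustrative default; tuned per user
        self.current_key = None
        self.dwell_start = None

    def update(self, key, now=None):
        """Feed the key currently under the gaze; returns the key once selected."""
        now = time.monotonic() if now is None else now
        if key != self.current_key:
            # Gaze moved to a new key: restart the dwell timer.
            self.current_key = key
            self.dwell_start = now
            return None
        if key is not None and now - self.dwell_start >= self.dwell_time:
            self.dwell_start = now  # reset so holding the gaze can repeat the key
            return key
        return None
```

A short dwell time, like the one Marcie uses, simply means passing a smaller `dwell_time` at construction.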
The current system works well for her but is antiquated. It runs on Windows XP, crashes randomly, and doesn’t work outdoors. Another major problem is the lack of support for pupil gestures. Remote eye tracking systems need frequent re-calibration to perform properly, so if Marcie wakes up in the middle of the night she has no way to signal that she needs help until the system is re-calibrated. Pupil gestures would let her wake the system for re-calibration, or send a signal to her parents that she needs help. Her system also has no support for customization, which would be extremely helpful for tailoring the experience specifically to her, both to increase communication speed and to reduce eyestrain.
As it stands, Erica is destined to fail Marcie. Her family cannot simply upgrade her system because a Swedish company called Tobii bought out Eye Response Technologies and discontinued Erica. Marcie has tried Tobii’s newer systems, but they do not work as well for her. Other options are also limited, since little medical research money goes into remote eye tracking systems because the ALS population is relatively small.
Project Requirements
Customer Requirements include:
- Camera
- Control of Environment
- Multiple Screens
- Keyboard Format
Packaging Requirements include:
- Pre-Loaded System
Performance Requirements include:
- Accurate Mapping
- Sustainability
Maintenance & Support Requirements include:
- User Manual
Other Requirements include:
- Customization
- Glare Resistant
System Overview
We will be creating an eye tracker designed specifically for Marcie, but one that anybody can calibrate and use. Marcie’s machine could give out at any moment, so first and foremost we will focus on implementing the ability to communicate.
Our team is responsible for the software components, so we will build a communication GUI with a clean, ergonomic keyboard. We will map the pupil coordinates provided by the hardware team’s system to the GUI so that the user can communicate by dwelling his or her gaze on a key of the virtual keyboard. This lets the user type as on a normal keyboard; a dedicated button will also read the text aloud using text-to-speech software. The user will be able to access the keyboard from a normal desktop screen, use their gaze as the mouse, and navigate the computer normally, even surfing the web.
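The pupil-to-screen mapping could be sketched as a per-axis linear fit derived from a two-point calibration (gazing at the top-left and bottom-right corners of the screen). This is an illustrative simplification under assumed names; the hardware team's actual interface and calibration procedure may differ:

```python
class GazeToScreenMapper:
    """Maps raw pupil coordinates to screen pixels via a per-axis linear fit
    from a two-point calibration (top-left and bottom-right targets)."""

    def __init__(self, screen_w, screen_h):
        self.screen_w = screen_w
        self.screen_h = screen_h
        self.scale = self.offset = None

    def calibrate(self, raw_tl, raw_br):
        """raw_tl / raw_br: pupil readings while the user gazes at the corners."""
        sx = self.screen_w / (raw_br[0] - raw_tl[0])
        sy = self.screen_h / (raw_br[1] - raw_tl[1])
        self.scale = (sx, sy)
        self.offset = (-raw_tl[0] * sx, -raw_tl[1] * sy)

    def map(self, raw):
        """Convert one raw pupil reading to a (clamped) screen coordinate."""
        x = raw[0] * self.scale[0] + self.offset[0]
        y = raw[1] * self.scale[1] + self.offset[1]
        # Clamp to the visible screen so noisy readings stay on-screen.
        return (min(max(x, 0), self.screen_w), min(max(y, 0), self.screen_h))
```

A production mapper would likely use more calibration points and a full affine or polynomial fit, but the two-corner version shows the shape of the problem.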
After communication is working, we will implement a feature that lets the user start calibration or call for assistance using eye gestures, so that the user can still communicate when alone.
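One way such a wake gesture could work, sketched here purely as an assumption on our part, is to watch for a deliberate far-left, far-right, far-left gaze sweep within a short time window, a pattern unlikely to occur by accident:

```python
import time
from collections import deque


class WakeGestureDetector:
    """Detects a deliberate left -> right -> left gaze sweep to wake the
    system for re-calibration or to signal for help."""

    PATTERN = ("left", "right", "left")

    def __init__(self, window=3.0, edge=0.15):
        self.window = window  # gesture must complete within this many seconds
        self.edge = edge      # fraction of screen width that counts as an edge
        self.events = deque()

    def update(self, x_norm, now=None):
        """x_norm: horizontal gaze position in [0, 1]; returns True on detection."""
        now = time.monotonic() if now is None else now
        zone = "left" if x_norm < self.edge else "right" if x_norm > 1 - self.edge else None
        # Record only transitions into an edge zone.
        if zone and (not self.events or self.events[-1][0] != zone):
            self.events.append((zone, now))
        # Drop events older than the gesture window.
        while self.events and now - self.events[0][1] > self.window:
            self.events.popleft()
        if tuple(z for z, _ in self.events) == self.PATTERN:
            self.events.clear()
            return True
        return False
```

The window and edge thresholds are placeholders; in practice they would be tuned so the gesture is easy for Marcie but rarely triggered by ordinary reading.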
Finally, we will include a separate GUI that functions as a remote control, allowing the user to control devices in her environment, such as a TV or lights.
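The remote-control GUI could route gaze-selected buttons through a pluggable transmitter, keeping the software side independent of whatever hardware (an IR blaster, a smart-home bridge) the hardware team supplies. Everything here is an illustrative assumption, not a settled design:

```python
class EnvironmentRemote:
    """Maps gaze-selected buttons on the remote-control GUI to device
    commands. `send` stands in for the hardware team's transmitter."""

    def __init__(self, send):
        self.send = send      # callable taking (device, command)
        self.commands = {}

    def add_button(self, label, device, command):
        """Register one on-screen button and the command it issues."""
        self.commands[label] = (device, command)

    def press(self, label):
        """Called when the user dwells on a button; issues its command."""
        device, command = self.commands[label]
        self.send(device, command)
        return (device, command)
```

Separating the button map from the transmitter means new devices can be added through configuration alone, which fits the customization requirement above.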
Results
Results text and demo videos go here
Future Work
Future work text goes here
Project Files
Project Charter (link)
System Requirements Specification (link)
Architectural Design Specification (link)
References
Any references go here, properly formatted