Privacy Preserving Expression Recognition for VR

Team Name

VR EMG Team

Timeline

Fall 2022 – Spring 2023

Students

  • Michael Hopper
  • Luke Howard
  • Cody Reynolds
  • Jane Cho
  • Nicholas Parrill

Sponsor

Dr. VP Nguyen

Abstract

A VR mask add-on that reads muscle data from the face and translates it into a visible facial expression.

Background

Detecting a person’s facial expression is important for many applications, but current methods of observing and analyzing facial expressions are limited in both efficacy and privacy. Cameras are the leading method of tracking expression, but they can be thrown off by objects on the face, especially VR headsets. They also require the user to look directly into the camera at all times to operate effectively, and constant recording raises privacy concerns for users.


Our sponsor, Dr. VP Nguyen, asked us to explore other methods of sensing facial expressions that integrate well with VR headsets. The main focus of this project is advancing technology for detecting expressions, and changes in expression, which remains a difficult problem. The developed sensing technology is then used to render the corresponding emotion, demonstrating how the sensors can be applied in software. His major concerns revolve around comfort and preserving the user’s privacy. Once the project is complete, we will have created a general suite of sensors for detecting expressions that can be applied to problems broader than just displaying the face.


The sensors we are working with consist primarily of Electromyography (EMG) sensors, which detect muscle contractions via the electrical signals muscles emit. In addition, we use Inertial Measurement Units (IMUs), which capture acceleration and rotation data. We may add more sensors to this collection in the future, but we will not be using image or audio recording devices at any point, in order to protect the user’s privacy.
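To illustrate how raw EMG samples can be condensed into classifier inputs, the sketch below computes the windowed root-mean-square (RMS) amplitude of one channel, a common measure of muscle-activation intensity. This is a hypothetical sketch: the window size and single-channel layout are assumptions for illustration, not the project's actual parameters.

```python
import math

def rms_windows(samples, window_size=250):
    """Split one raw EMG channel into fixed-size windows and compute
    the root-mean-square (RMS) amplitude of each window."""
    features = []
    for start in range(0, len(samples) - window_size + 1, window_size):
        window = samples[start:start + window_size]
        rms = math.sqrt(sum(x * x for x in window) / window_size)
        features.append(rms)
    return features
```

One RMS value per channel per window would then form the feature vector handed to the classifier.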


Dr. Nguyen provided two immediate applications for the completed project: augmented/virtual reality environments, and therapy. In digital environments, virtual avatars don’t emote along with the user; reflecting the user’s expression can create more lifelike, personal experiences. In therapy, the sensors could be used as a method for monitoring dementia. A common sign of dementia’s onset is mild cognitive impairment, in which a person’s normal facial expressions change. Detecting these changes could allow physicians or researchers to treat dementia at an earlier stage, when treatment is likely to be more effective.

Project Requirements

  • Capture Three Facial Expressions – Neutral, smiling, and scowling
  • OpenBCI Cyton Board – Used for EMG data collection
  • MetaMotionRL IMU Sensors – Used for accelerometer, gyroscope, and quaternion data collection
  • No Audio/Video Recording Devices – Protects user privacy
  • Must Fit within Vive Facemask – Sensors must be built into the area where the HTC Vive contacts the face (around the eyes and nose)
  • Processing/Rendering Speed – High enough for real-time viewing and classification

System Overview

The VR + EMG Muscle to VR Transposition system processes data in a linear pipeline: data is captured as input, transformed and processed, and then displayed as output, with no deviation from this path. The sensors collect data from the user, a machine learning model classifies the raw data into an understandable expression label, and a renderer displays that interpretation on screen.
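The pipeline above can be sketched end to end in a few lines. This is a hypothetical illustration only: the centroid values are invented, and a simple nearest-centroid rule stands in for the project's trained SVM classifier; the renderer is reduced to a print statement.

```python
import math

# Hypothetical per-expression feature centroids (one RMS value per EMG
# channel); real values would come from training a classifier such as
# an SVM on collected data.
CENTROIDS = {
    "neutral":  [0.1, 0.1, 0.1],
    "smiling":  [0.8, 0.2, 0.6],
    "scowling": [0.3, 0.9, 0.7],
}

def classify(features):
    """Stand-in for the trained classifier: label the feature vector
    with the expression whose centroid is nearest (Euclidean)."""
    return min(CENTROIDS, key=lambda c: math.dist(features, CENTROIDS[c]))

def render(label):
    """Stand-in for the renderer: display the classification."""
    print(f"Avatar expression: {label}")

# One pass through the pipeline: sensor frame -> label -> display.
render(classify([0.75, 0.25, 0.55]))
```

Each stage maps directly onto a system component: the feature vector comes from the sensor layer, `classify` from the machine learning layer, and `render` from the display layer.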

Results

First HTC Vive EMG/Gyro Testing Interface

HTC Vive Facial Interface – 3D CAD Design

HTC Vive Sensor Interface – 3D CAD Design

Final HTC Vive Sensor Interface – Printed on a Formlabs printer with Elastic 50A material

First Version – Python Program Data Collection

Second Version – Python Program with SVM Classifier

Final Version – Full Stack Program

Future Work

There are no future plans at this time.

Project Files

Project Charter Charter.pdf

System Requirements Specification SRS.pdf

Architectural Design Specification ADS.pdf

Detailed Design Specification DDS.pdf

Poster DemoDay.ppt

References

  • OpenBCI Cyton Biosensing Board – https://shop.openbci.com/products/cyton-biosensing-board-8-channel
  • MbientLab MetaMotionRL – https://mbientlab.com/metamotionrl/
