NDA

Team Name

NDA

Timeline

Spring 2020 – Summer 2020

Students

  • Edgar Gonzalez
  • Robert LeBlanc
  • Luis Martinez
  • Logan Plymale
  • Meghan Tennant

Sponsor

Raytheon

Abstract

As part of an ongoing project, we are working with a piece of hardware called a “Sensorium” board that can pick up RF signal data, such as Wi-Fi and Bluetooth, as well as environmental data such as temperature, pressure, and humidity. We are using Amazon Web Services (AWS) as a cloud infrastructure to host, store, and transmit the data from the Sensorium to the Microsoft HoloLens. Machine learning will also use this data to make predictions, such as weather conditions or critical system failures. The augmented reality overlay in the HoloLens will pull the sensor data from AWS and visualize it in real time.

Background

The current business use case of this project is to monitor RF signals and create appropriate responses to the signals received. Ideally, these signals will provide information on the current status of various business systems in operation, making this an effective way to keep track of business operations. There are also additional use cases for the collected data, such as rendering RF signals in augmented reality, which could be used to analyze a broken system and find a way to fix it. The current customer for this project is Raytheon. They have sponsored this project for multiple years and want us to work on this technology to help improve the monitoring systems within the company. Existing relationships between the customer and past development teams will carry over to the current team working on this project. Another team is also assigned to this project for the next year: an electrical engineering team that is continuing development on the sensor boards, which will sniff out the RF signals and transmit the data to our AWS database, from which the HoloLens pulls data to display to the user.

Project Requirements

  1. Retrieve data collected from the Sensorium board after it is stored in the AWS Cloud database (see the sketch after this list).
  2. Build an AR overlay in Unity that will display buttons and sprites that the user can interact with inside the HoloLens.
  3. Have the sprites communicate data collected from the sensors.
  4. Use machine learning to make predictions such as weather conditions and critical system failures.
  5. Add more data interpretations.
  6. Notify the user if there is a new audio file to listen to and let the user choose a specific audio file in the HoloLens.
  7. Use Grafana to visualize data.
  8. Allow the user to specify which signal/frequency they want to listen to inside the HoloLens.
  9. Run a live test.
  10. Get the Unity project to work with the HoloLens 2 Emulator.
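
The exact AWS storage service and schema are not specified here, so the sketch below for requirement 1 assumes the readings land in a DynamoDB table (hypothetically named sensorium_readings, keyed by sensor_id and timestamp). It shows roughly how a consumer such as the HoloLens backend could pull recent readings for one board using boto3; the real table name and keys may differ.

```python
# Minimal sketch of requirement 1: pulling Sensorium readings back out of AWS.
# Assumes a DynamoDB table named "sensorium_readings" with a partition key
# "sensor_id" and a sort key "timestamp" -- the actual schema may differ.
import time

import boto3
from boto3.dynamodb.conditions import Key


def get_recent_readings(sensor_id: str, minutes: int = 5):
    """Return readings for one Sensorium board from the last `minutes` minutes."""
    table = boto3.resource("dynamodb").Table("sensorium_readings")
    since = int(time.time()) - minutes * 60

    response = table.query(
        KeyConditionExpression=Key("sensor_id").eq(sensor_id)
        & Key("timestamp").gte(since)
    )
    return response["Items"]


if __name__ == "__main__":
    for reading in get_recent_readings("sensorium-01"):
        print(reading["timestamp"], reading.get("temperature"), reading.get("humidity"))
```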

System Overview

The Sensorium project is made up of four major systems that define the architecture: the Sensoriums, AWS, AR, and Grafana. The Sensorium system generates all of the data that the other systems consume; it is responsible for collecting data such as temperature and air humidity in real time via multiple sensors. Every other system accesses the data collected by the Sensorium system through the AWS system, which is responsible for storing all of the Sensorium data and performing computation on it. Once the data has been collected and processed, it is consumed by the AR system, which interprets the collected data and displays it to the user through augmented reality in a way that makes clear what the information means and represents. The Grafana system is also a consumer of the collected data, but its purpose is to make sense of the data and visualize it with meaningful charts and graphs.
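
As a rough illustration of the producer side of this flow, the sketch below shows what a single Sensorium reading might look like and how it could be written into AWS for the AR and Grafana systems to consume. The field names, table name, and the choice of DynamoDB are assumptions for illustration only, not the project's actual payload format.

```python
# Illustrative sketch only: the real Sensorium software runs on a BeagleBone and
# its payload format is not documented here, so the schema below is assumed.
import time
from dataclasses import dataclass, asdict
from decimal import Decimal

import boto3


@dataclass
class SensoriumReading:
    sensor_id: str          # which Sensorium board produced the reading
    timestamp: int          # Unix time of the measurement
    temperature_c: Decimal  # environmental sensors
    humidity_pct: Decimal
    pressure_hpa: Decimal
    rf_band: str            # e.g. "wifi_2_4ghz" or "bluetooth"
    rf_power_dbm: Decimal


def publish_reading(reading: SensoriumReading) -> None:
    """Write one reading to the (assumed) DynamoDB table shared by AR and Grafana."""
    table = boto3.resource("dynamodb").Table("sensorium_readings")
    table.put_item(Item=asdict(reading))


if __name__ == "__main__":
    publish_reading(SensoriumReading(
        sensor_id="sensorium-01",
        timestamp=int(time.time()),
        temperature_c=Decimal("22.4"),
        humidity_pct=Decimal("41.0"),
        pressure_hpa=Decimal("1013.2"),
        rf_band="wifi_2_4ghz",
        rf_power_dbm=Decimal("-47.5"),
    ))
```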

System Overview Diagram

Results

  • All Sensorium boards appear in the HoloLens application and their data can be seen in the Heads-Up Display (HUD).
  • All sensors, besides reflectivity, are operational and correct data is being sent to the AWS database.
  • BeagleBones are booting from the SD card.
  • Docker is running on the BeagleBone.

Microsoft HoloLens Heads-Up Display Demo

Future Work

  1. Finish the work on Software Defined Radio (SDR) for RF signals.
  2. Incorporate machine learning algorithms (see the sketch after this list).
  3. Create visuals for Wi-Fi connections.
  4. Test Sensoriums in defined environments.
  5. Add more data interpretation to the HoloLens application.
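
The project files do not describe which models the machine learning work (item 2 above, also requirement 4) will use, so the following is only a toy sketch: a scikit-learn classifier trained on hypothetical temperature/humidity/pressure readings to predict a simple weather label, standing in for the planned weather and critical-system-failure predictions.

```python
# Toy sketch of future work item 2: the features, labels, data values, and model
# choice are all assumptions -- this only shows the general shape of
# "train on Sensorium readings, predict a condition".
import numpy as np
from sklearn.linear_model import LogisticRegression
from sklearn.model_selection import train_test_split

# Hypothetical historical readings: [temperature_c, humidity_pct, pressure_hpa]
X = np.array([
    [30.1, 20.0, 1022.0],
    [28.4, 25.0, 1019.5],
    [22.0, 85.0, 1002.3],
    [21.5, 90.0,  998.7],
    [25.0, 60.0, 1010.0],
    [19.8, 95.0,  995.1],
    [31.2, 18.0, 1024.8],
    [23.3, 70.0, 1005.6],
])
# Hypothetical labels: 1 = rain observed shortly after the reading, 0 = no rain.
y = np.array([0, 0, 1, 1, 0, 1, 0, 1])

X_train, X_test, y_train, y_test = train_test_split(
    X, y, test_size=0.25, random_state=0
)

model = LogisticRegression().fit(X_train, y_train)
print("held-out accuracy:", model.score(X_test, y_test))

# Predict for a fresh Sensorium reading.
print("rain likely?", bool(model.predict([[20.5, 92.0, 997.0]])[0]))
```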

Project Files

Project Charter (link)

System Requirements Specification (link)

Architectural Design Specification (link)

Detailed Design Specification (link)

Poster (link)
