Team Name
ViRal
Timeline
Fall 2018 – Spring 2019
Students
- Luke Hardin
- Dawsen Richins
- Anna Cox
- Jake Nissley
- Saurya Bhattarai
Sponsor
Raytheon
Abstract
Background
Project Requirements
- The system shall be a Virtual Reality environment with an accurate model of the University of Texas at Arlington.
- The system shall utilize only data detected by the Sensorium on the drone's flight path.
- The system shall display each wireless signal as a cluster of data points on a heatmap color scale based on relative signal strength.
- The system shall display the origin point of each signal, determined by a trilateration algorithm (see the sketch after this list).
- The system shall map each signal to its accurate height and GPS coordinates.
- The user shall be able to click on a signal to view more information about it.
- The user shall be able to maneuver around the VR environment.
- The system shall update the signal data offline based on data collected during the latest flight session of the drone.
- The system shall use Amazon Web Services (AWS) for raw data processing and storage.
- The system shall run on a Windows 10 PC that uses an NVIDIA GTX 1080 to power the simulation.
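The trilateration requirement above can be illustrated with a small worked example. The sketch below is a minimal, hypothetical least-squares trilateration helper in Python, not the team's actual algorithm; the sample points and distances are made up, and in practice the distances would first be estimated from RSSI with a path-loss model.

```python
# Minimal 2-D trilateration sketch (hypothetical helper, not the team's actual algorithm).
# Each sample is a point on the drone's flight path: (x, y, estimated_distance_to_signal).
import numpy as np

def trilaterate(samples):
    """Least-squares estimate of a signal's origin from three or more (x, y, d) samples."""
    samples = np.asarray(samples, dtype=float)
    x, y, d = samples[:, 0], samples[:, 1], samples[:, 2]
    # Subtract the last sample's circle equation from the others to linearize the system.
    A = np.column_stack((2 * (x[:-1] - x[-1]), 2 * (y[:-1] - y[-1])))
    b = (d[-1] ** 2 - d[:-1] ** 2) + (x[:-1] ** 2 - x[-1] ** 2) + (y[:-1] ** 2 - y[-1] ** 2)
    origin, *_ = np.linalg.lstsq(A, b, rcond=None)
    return origin  # [x, y] estimate of the transmitter location

# Example: three flight-path samples around a transmitter near (10, 5).
print(trilaterate([(0, 0, 11.2), (20, 0, 11.2), (10, 20, 15.0)]))
```

With more than three samples the linearized system is overdetermined, and the least-squares solution averages out noise in the distance estimates.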
System Overview
- SENSORIUM
The Sensorium module is primarily in charge of the sensors and the sensor data. The sensors on the Sensorium can detect and collect color, electromagnetic interference (EMI), gyroscope, inertial measurement unit (IMU), proximity, software-defined radio (SDR), thermal, and temperature readings from the surrounding environment. The controller component is the set of scripts on the Raspberry Pi that allows the Sensorium to process the sensor data and send it to AWS. The Sensorium posts all of the gathered and processed data to the AWS interface in JSON format, and this interface is the means of communicating with the AWS module.
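As a rough illustration of this handoff, the sketch below bundles one set of readings into JSON and POSTs it to the AWS web server. The endpoint URL, payload fields, and helper name are assumptions for illustration only; the actual scripts on the Raspberry Pi may structure the payload differently.

```python
# Sketch of the Sensorium-to-AWS handoff (hypothetical endpoint and payload fields).
import json
import time
import requests

AWS_ENDPOINT = "https://example.execute-api.us-east-1.amazonaws.com/readings"  # placeholder URL

def post_reading(gps, altitude_m, rssi_dbm, ssid):
    """Send one sensor reading to the AWS web server as JSON."""
    payload = {
        "timestamp": time.time(),
        "gps": {"lat": gps[0], "lon": gps[1]},
        "altitude_m": altitude_m,
        "signal": {"ssid": ssid, "rssi_dbm": rssi_dbm},
    }
    response = requests.post(AWS_ENDPOINT, data=json.dumps(payload),
                             headers={"Content-Type": "application/json"}, timeout=5)
    response.raise_for_status()

# Example call with illustrative values near the UTA campus.
post_reading(gps=(32.7296, -97.1131), altitude_m=30.0, rssi_dbm=-62, ssid="UTA-WiFi")
```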
- AWS DESCRIPTION
The AWS module is in charge of receiving, storing, and sending data to and from the database. It hosts a web server that the Sensorium subsystem communicates with. This web server receives the data collected and sent by the Sensorium and stores it in an AWS database. The web server also handles the heavier calculations needed to create the simulation.
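A minimal sketch of what that web server could look like is shown below, assuming a Flask application backed by a DynamoDB table named viral_signal_readings; both the framework and the table name are assumptions, since the actual AWS setup is not specified here.

```python
# Hypothetical sketch of the AWS-side web server that accepts the Sensorium's JSON POSTs.
import json
from decimal import Decimal

import boto3
from flask import Flask, jsonify, request

app = Flask(__name__)
readings_table = boto3.resource("dynamodb").Table("viral_signal_readings")  # assumed table name

@app.route("/readings", methods=["POST"])
def store_reading():
    # DynamoDB rejects Python floats, so numeric fields are parsed as Decimal.
    reading = json.loads(request.data, parse_float=Decimal)
    readings_table.put_item(Item=reading)  # raw storage for later offline processing
    return jsonify({"status": "stored"}), 201

if __name__ == "__main__":
    app.run(host="0.0.0.0", port=8080)
```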
- VR SIMULATION
The Virtual Reality Simulation is in charge of interpreting the data into an overlay and allowing the user to move around the simulation and interact with the data. The simulation subsystem consists of an interface to connect with AWS, a controller to process all of the information, user input from the Oculus headset and controllers, a display for the user to see, the models in the simulation, and an overlay that arranges all of the data into a format the user can understand.
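One piece of overlay logic that can be sketched directly from the requirements is the heatmap color scale. The helper below is a hypothetical Python illustration that maps an RSSI sample to a blue-to-red color, assuming a displayable range of -90 dBm to -30 dBm; the real simulation would implement the equivalent mapping inside the VR engine.

```python
# Sketch of the overlay's heatmap color scale (assumed RSSI range of -90 to -30 dBm).
# Weak signals map toward blue, strong signals toward red, for each displayed data point.
def rssi_to_heatmap_color(rssi_dbm, weakest=-90.0, strongest=-30.0):
    """Return an (r, g, b) tuple in [0, 1] for one signal sample."""
    t = (rssi_dbm - weakest) / (strongest - weakest)
    t = max(0.0, min(1.0, t))          # clamp to the displayable range
    return (t, 0.0, 1.0 - t)           # blue -> red as strength increases

print(rssi_to_heatmap_color(-62))      # mid-strength sample -> purplish color
```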
Results
Future Work
Future goals for ViSION may include real-time signal data updates, expanded location coverage, and support for signals other than Wi-Fi, such as Bluetooth and walkie-talkie channels.
Project Files
Project Charter (Here)
System Requirements Specification (Here)
Architectural Design Specification (Here)
Detailed Design Specification (Here)
Poster (link)