Team ViRal: Project ViSION

ViRal Logo

Team Name

ViRal

Timeline

Fall 2018 – Spring 2019

Students

  • Luke Hardin
  • Dawsen Richins
  • Anna Cox
  • Jake Nissley
  • Saurya Bhattarai

Sponsor

Raytheon

Abstract

It is very difficult to visualize the coverage range of the many wireless emitters in our daily lives, such as Wi-Fi routers; as a result, “dead zones,” or areas without coverage, are all too common. Our project, VIrtual Sensorium Interactive Overlay Navigator (ViSION), is a system that lets users see the real-world coverage range of all wireless networks present at the University of Texas at Arlington through a virtual reality environment. Users of the ViSION software will be able to explore an accurate virtual reality model of the campus that includes a visual representation of the coverage range of every wireless network on campus. This system can enable network engineers to optimize router placement for consistent coverage and to troubleshoot weak signals more effectively.

Background

Currently, there is no software available that can map the range and metadata of different signal sources into a virtual reality environment; two-dimensional maps are the only option. This leaves out a large amount of data that could be mapped and makes it much more difficult to visualize the layout of wireless signals in a given area. Raytheon, a defense contractor, is the sponsor of this project and is interested in expanding these two-dimensional maps into a three-dimensional virtual reality environment. University of Texas at Arlington students, faculty, and staff could use virtual reality goggles to see a dome of wireless signals, where the center of the dome is the signal source and the edge of the dome is the end of the signal’s range. Signal metadata such as the SSID, IP address, and connection strength would also be visible on this wireless dome. This is where our project comes in: we will use data collected by sensors placed on a drone to create the virtual environment of these signals. The drone will be capable of flying throughout the campus, collecting data, and pairing it with the height and GPS coordinates of the drone at the moment of collection. Our software will then ingest this information and map it to the virtual environment of UTA, and users will use virtual reality goggles to get a deeper view into the wireless signals that surround us. This technology is not currently available to the public, presenting a new market for Raytheon to enter and lead.

Project Requirements

  1. The system shall be a virtual reality environment with an accurate model of the University of Texas at Arlington.
  2. The system shall utilize only data detected by the Sensorium along the drone’s flight path.
  3. The system shall display each wireless signal as a cluster of data points on a heatmap color scale based on relative signal strength.
  4. The system shall display the origin point of each signal, determined by a trilateration algorithm (see the sketch after this list).
  5. The system shall map each signal to its accurate height and GPS coordinates.
  6. The user shall be able to click on a signal to learn more information about that signal.
  7. The user shall be able to maneuver around the VR environment.
  8. The system shall update the signal data offline based on data collected during the drone’s latest flight session.
  9. The system shall use Amazon Web Services (AWS) for raw data processing and storage.
  10. The system shall run on a Windows 10 PC with an NVIDIA GTX 1080 to power the simulation.
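
Requirement 4 relies on a trilateration algorithm that the list above does not spell out. The sketch below shows one common approach as a rough illustration, assuming distances are first estimated from RSSI readings with a log-distance path-loss model and the emitter position is then recovered by linearized least squares; every function name, constant, and sample reading here is an illustrative assumption, not taken from the ViSION codebase.

```python
# Minimal trilateration sketch (assumed approach, illustrative names).
# Given RSSI readings taken at known drone positions, estimate the
# origin point of the emitting signal.
import numpy as np

def rssi_to_distance(rssi_dbm, tx_power_dbm=-40.0, path_loss_exp=2.5):
    """Convert an RSSI reading (dBm) to an estimated distance (meters)
    using a log-distance path-loss model. tx_power_dbm is the assumed
    RSSI at 1 m; both constants would need calibration in practice."""
    return 10 ** ((tx_power_dbm - rssi_dbm) / (10 * path_loss_exp))

def trilaterate(positions, distances):
    """Least-squares trilateration. positions is an (n, 3) array of the
    drone's (x, y, z) coordinates for each reading; distances holds the
    estimated range to the emitter at each position (n >= 4). Subtracting
    the last sphere equation ||x - p_i||^2 = d_i^2 from the others
    linearizes the system into A x = b."""
    positions = np.asarray(positions, dtype=float)
    distances = np.asarray(distances, dtype=float)
    A = 2 * (positions[-1] - positions[:-1])
    b = (distances[:-1] ** 2 - distances[-1] ** 2
         + np.sum(positions[-1] ** 2) - np.sum(positions[:-1] ** 2, axis=1))
    estimate, *_ = np.linalg.lstsq(A, b, rcond=None)
    return estimate

# Example: four readings of one SSID at different drone positions.
drone_positions = [(0, 0, 10), (30, 0, 12), (0, 30, 12), (30, 30, 15)]
readings_dbm = [-50, -60, -62, -70]
dists = [rssi_to_distance(r) for r in readings_dbm]
print(trilaterate(drone_positions, dists))
```

In practice, the path-loss constants would need per-environment calibration, and far more than four readings per signal would be used to smooth out RSSI noise.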

System Overview

The Sensorium carries a conglomeration of sensors that pick up data relevant to wireless signals, such as signal strength. This data is passed to AWS, where it is collected and parsed by the controller. The trilateration algorithm runs in AWS to determine the origin point of each relevant signal. This data is then stored in the database and passed to our simulation. In the Unity environment, our pre-built model and overlay come together with the data from AWS to create a 3D mapping of wireless signals that the user can see and interact with through the Oculus system. The three main subsystems are:
  • SENSORIUM
    The Sensorium module is primarily in charge of the sensors and the sensor data. The multiple sensors on the Sensorium can detect and collect information on color, electromagnetic interference (EMI), orientation (via gyroscope and inertial measurement unit, IMU), proximity, radio signals (via software-defined radio, SDR), and the thermal conditions and temperature of the surrounding environment. The controller component is the set of scripts on the Raspberry Pi that allows the Sensorium to process the sensor data and send it to AWS. The Sensorium will post all of the gathered and processed data to this interface in JSON format, which will be the means of communicating with the AWS module (a hypothetical payload is sketched after this list).
  • AWS DESCRIPTION
    The AWS module is in charge of receiving, storing, and sending the data to and from the database. It will host a web server that the Sensorium subsystem will communicate with. This web server will receive the data collected and sent by the Sensorium and store it in an AWS database. The web server will also handle heavier calculations that may be necessary to create our simulation.
  • VR SIMULATION
    The Virtual Reality Simulation is in charge of interpreting the data into an overlay and allowing the user to move around the simulation and interact with the data. The simulation subsystem will consist of an interface to connect with AWS, a controller to process all of the information, user input from the Oculus headset and controllers, a display for the user to see, the models in the simulation, and an overlay that arranges all of the data into a format the user can understand.
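
The design documents linked below define the actual Sensorium-to-AWS schema; purely as an illustration of the JSON handoff described above, the sketch below posts one hypothetical reading that pairs detected signals with the drone’s GPS coordinates and altitude. The endpoint URL and every field name are assumptions for this example.

```python
# Hypothetical Sensorium -> AWS upload sketch. The endpoint URL and all
# field names are illustrative assumptions, not the project's real schema.
import time
import requests

AWS_ENDPOINT = "https://example.execute-api.us-east-1.amazonaws.com/ingest"  # placeholder

def build_reading(gps, altitude_m, signals):
    """Pair one batch of detected signals with the drone's position,
    as described in the Background section."""
    return {
        "timestamp": time.time(),
        "latitude": gps[0],
        "longitude": gps[1],
        "altitude_m": altitude_m,
        "signals": [
            # One entry per detected network: SSID and strength in dBm.
            {"ssid": ssid, "rssi_dbm": rssi} for ssid, rssi in signals
        ],
    }

reading = build_reading(
    gps=(32.7357, -97.1081),  # near the UTA campus
    altitude_m=12.5,
    signals=[("UTA-WiFi", -58), ("eduroam", -63)],
)
response = requests.post(AWS_ENDPOINT, json=reading, timeout=10)
response.raise_for_status()
```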

Results


Future Work

Future goals for ViSION may include real-time signal data updates, expanded location coverage, and support for signals other than Wi-Fi, such as Bluetooth and walkie-talkie channels.

Project Files

Project Charter (Here)

System Requirements Specification (Here)

Architectural Design Specification (Here)

Detailed Design Specification (Here)

Poster (link)

