D.L.R.S. (Drone Landing Recovery System)

Team Name

BLACK BOX

Timeline

Spring 2023 – Summer 2023

Students

  • Saed Abdulhadi
  • Jonathan Dena
  • Zelik Cosio-Altamirano
  • Sworup Bhattarai
  • Fariha Rabbi

Abstract

Black Box’s D.L.R.S. is an innovative drone recovery and landing system, engineered to activate automatically during free fall, enabling a controlled descent while simultaneously transmitting crucial information to facilitate seamless recovery by the pilot.

The primary focus and intended application context lie within defense operations of unmanned aircraft, with the potential to scale into a consumer product. The development of D.L.R.S. stems from the need to swiftly detect, regulate, and communicate real-time data about the drone’s location and activities, particularly when standard operational procedures fail.

D.L.R.S. is vital for enhancing drone safety, supporting defense operations, advancing technology, and complying with regulations. Its impact extends to disaster management and humanitarian aid, contributing to a safer, more efficient, and sustainable future for unmanned aircraft applications.

Background

Historically, aircraft accidents have proven costly and catastrophic to those involved. The 21st century has seen the introduction of UAS (unmanned aerial systems) taking to the sky, many flown by hobbyist PICs (pilots in command) and others by commercial PICs. Regardless of the user, these UAS are built to significantly lower safety standards than manned aircraft, and they may lose connection or power, or suffer an engine failure that leads to a free-fall crash.

UAS are expensive, so it is important to be able to find them after a total loss. In a military context they often carry proprietary information, making timely recovery of the UAS vital.

Project Requirements

  1. Detect that UAS (unmanned aerial system) is in free fall
  2. Deploy parachute to enter a controlled, fixed-orientation descent
  3. Trigger anti-collision lights and sound alarm
  4. Configure the camera and take images of the landing site
  5. Use computer vision to decide whether the landing site is water or dry land
  6. (If landing on water) Deploy raft for UAS flotation
  7. Confirm Landing
  8. Wirelessly transmit geographical location & deployment status information
  9. Wirelessly receive location and status information continuously
  10. Display location and deployment status information to the PIC (pilot in command)

System Overview

Below is a full-flow diagram of the architectural specification. At a high level, it shows the decision-making steps and the active onboard systems. The fall detection layer uses a barometer and an accelerometer to detect free fall below a defined altitude; this is the input that starts the chain of recovery systems. Once a free-fall condition is confirmed, the second layer engages the parachute: the source computer, a Raspberry Pi 4, is physically wired to a pyrotechnic detonation device that deploys a parachute from two mounting points along the airframe to accommodate heavier unmanned aircraft. This layer brings the drone out of an uncontrolled spin and into a stable descent. The warning layer operates simultaneously with the parachute deployment layer. The Raspberry Pi is equipped with relays that trigger warning lights along the drone to indicate that it is in free fall and sound a buzzer, warning anybody below that the aircraft is descending uncontrollably.
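As a rough illustration of the detection layer only, the Python sketch below shows how a free-fall check could be polled on the Raspberry Pi. The read_total_accel_g and read_altitude_m helpers, the thresholds, and the sample count are placeholders rather than the values used in the actual D.L.R.S. build.

```python
import time

# Minimal sketch of the fall-detection loop. The sensor helpers below are
# hypothetical stubs standing in for the real accelerometer/barometer drivers.
FREE_FALL_ACCEL_G = 0.3      # near-zero net acceleration suggests free fall (assumed)
TRIGGER_ALTITUDE_M = 120.0   # "defined altitude" from the overview (assumed value)
CONFIRM_SAMPLES = 5          # consecutive readings required before triggering

def read_total_accel_g() -> float:
    """Return the magnitude of acceleration in g (hardware-specific stub)."""
    raise NotImplementedError

def read_altitude_m() -> float:
    """Return barometric altitude in meters (hardware-specific stub)."""
    raise NotImplementedError

def wait_for_free_fall() -> None:
    """Block until free fall below the defined altitude is confirmed."""
    hits = 0
    while True:
        in_free_fall = read_total_accel_g() < FREE_FALL_ACCEL_G
        below_limit = read_altitude_m() < TRIGGER_ALTITUDE_M
        hits = hits + 1 if (in_free_fall and below_limit) else 0
        if hits >= CONFIRM_SAMPLES:
            return               # hand off to the parachute deployment layer
        time.sleep(0.02)         # ~50 Hz polling
```

Requiring several consecutive samples is one simple way to avoid a false trigger from a single noisy reading; the real system may use a different debouncing strategy.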
The computer vision layer activates once parachute deployment has been confirmed, which is assumed after a prescribed delay has passed. Once this layer is active, the Raspberry Pi 4 enables its camera and takes a picture. A Python script using OpenCV (cv2) captures the image and performs an RGB analysis on it, measuring the percentage of the photo that is blue to determine whether the unmanned aircraft is above water.
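A minimal sketch of that blue-percentage analysis is shown below, assuming the image has already been captured to disk. The channel margin and the fraction treated as “water” are illustrative assumptions, not the project’s tuned values.

```python
import cv2
import numpy as np

WATER_BLUE_FRACTION = 0.40   # assumed fraction of blue-dominant pixels implying water

def is_over_water(image_path: str = "landing_site.jpg") -> bool:
    img = cv2.imread(image_path)                 # OpenCV loads images as BGR
    if img is None:
        raise FileNotFoundError(image_path)
    b = img[:, :, 0].astype(np.int16)
    g = img[:, :, 1].astype(np.int16)
    r = img[:, :, 2].astype(np.int16)
    # A pixel counts as "blue" when the blue channel clearly dominates red and green.
    blue_dominant = (b > r + 20) & (b > g + 20)
    fraction = np.count_nonzero(blue_dominant) / blue_dominant.size
    return fraction >= WATER_BLUE_FRACTION
```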
Only if the computer vision algorithm determines that the drone is above water does the system activate the raft deployment layer. This layer drives GPIO pins on the Raspberry Pi to trigger a relay, which opens an air valve connected to a CO2 canister and inflates a flotation device. If the drone lands on water, it is therefore expected to remain on the surface rather than submerge, so the body of the drone can be recovered. On success, the program moves on to the wireless communication layer. If the computer vision algorithm determines that the drone will land on dry ground, it bypasses the raft deployment layer and proceeds directly to the wireless communication layer.
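The raft trigger could look roughly like the sketch below using the RPi.GPIO library; the pin number and valve-open duration are assumed values, and the real D.L.R.S. relay wiring may differ.

```python
import time
import RPi.GPIO as GPIO

RAFT_RELAY_PIN = 17          # BCM pin driving the CO2 valve relay (hypothetical)
VALVE_OPEN_SECONDS = 2.0     # assumed time the valve stays open to inflate the raft

def deploy_raft() -> None:
    """Pulse the relay that opens the CO2 valve and inflates the flotation raft."""
    GPIO.setmode(GPIO.BCM)
    GPIO.setup(RAFT_RELAY_PIN, GPIO.OUT, initial=GPIO.LOW)
    GPIO.output(RAFT_RELAY_PIN, GPIO.HIGH)   # energize relay, open CO2 valve
    time.sleep(VALVE_OPEN_SECONDS)
    GPIO.output(RAFT_RELAY_PIN, GPIO.LOW)    # close the valve again
    GPIO.cleanup(RAFT_RELAY_PIN)
```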
The wireless communication layer uses two separate pieces of hardware. On board the aircraft, a wireless transmitter receives commands from the Raspberry Pi 4 and transmits them over the air to a ground station. The ground station uses an STM32 to receive the UART packets and display them on a screen.
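A simplified sketch of the onboard transmit side is shown below, assuming a pyserial link from the Raspberry Pi to the transmitter. The port name, baud rate, and JSON packet layout are illustrative and not necessarily the packet format the STM32 ground station expects.

```python
import json
import serial

def send_status(lat: float, lon: float, parachute_deployed: bool, raft_deployed: bool,
                port: str = "/dev/serial0", baud: int = 9600) -> None:
    """Send one location/deployment-status packet to the onboard transmitter over UART."""
    packet = json.dumps({
        "lat": lat,
        "lon": lon,
        "parachute": parachute_deployed,
        "raft": raft_deployed,
    }) + "\n"
    with serial.Serial(port, baud, timeout=1) as link:
        link.write(packet.encode("ascii"))   # relayed over the air to the ground station
```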

Results

D.L.R.S. Demo – Fall Detection Override

Future Work

Future development of the D.L.R.S. will look into the following:

  1. Reducing the size of logic components to a single IC / Pi HAT
  2. Reduced overall form factor
  3. Redesigning the software to run on an FPGA with a lightweight real-time operating system
  4. Making parachutes easily swappable
  5. Inflating the flotation raft using a chemical reaction system
  6. Mobile app control

Project Files

Project Charter

System Requirements Specification

Architectural Design Specification

Detailed Design Specification

Poster
