In/E Motion

Team Name

Mosaic Movements

Timeline

Summer 2024 – Fall 2024

Students

  • Derrick Perry – Software Engineering
  • Sophia Dao – Computer Science
  • John Calma – Computer Science
  • Asad Mirza – Computer Science

Sponsor

Leah Mazur & Laurie Taylor – UTA Fine Arts & Dance

Abstract

The In/E Motion project aims to integrate advanced motion tracking technology with real-time animation projection to enhance live performances. By capturing detailed movement data from performers and participants using high-resolution Intel RealSense cameras and motion tracking software, the system processes this data to generate dynamic animations projected onto the performance space. This creates an immersive and interactive experience, enhancing the visual appeal of performances in theater, dance, and live art installations.

Background

The Fine Arts Department at the University of Texas at Arlington is dedicated to advancing the arts through innovation and technology. In traditional performance spaces, the interaction between performers, the audience, and the environment is often limited to passive observation. This lack of interactivity can result in a disengaging experience for the audience and restrict the creative expression of performers. The advent of advanced motion tracking and real-time animated projection technologies presents a significant opportunity to transform performance spaces. By integrating these technologies, we can create an interactive environment that responds dynamically to the movements of both performers and the audience. This project aims to address the current limitations and unlock new dimensions of artistic expression and audience engagement.

The In/E Motion project began in the spring of 2024 as a collaboration between the arts and computer science. Mosaic Movements, a team of senior design students from the Computer Science Department, was selected to work on this project for its expertise in technology and innovation. The collaboration with the Fine Arts Department continues an existing relationship aimed at integrating technology with artistic endeavors. The previous team laid the groundwork by exploring the feasibility of using motion tracking in performance spaces. With the performance set for Spring 2025, the performers are still refining their choreography and finalizing the stage design. Our team built upon this foundation and created a flexible, versatile package that offers as much room for creativity as possible while still providing clear, practical guidance. This approach gives future teams an easy-to-understand resource that captures and communicates the central theme of the performance, allowing them to build upon it with confidence and bring the project closer to realization.

This project aligns with the Fine Arts Department’s mission to innovate and enhance the educational, cultural, and community impact of the arts. By creating an interactive and immersive environment, we aim to revolutionize how performances are experienced and elevate the standards of artistic expression and audience engagement.

Project Requirements

  • Real-Time Motion Capture: The system must capture and process motion data in real-time, with minimal latency, ensuring synchronization with live performances.
  • Real-Time Animations: High-quality animations must be generated based on the captured motion data, accurately reflecting the performers’ movements.
  • Reliable System Performance: The system must operate reliably during live performances, equipped with dependable hardware and software components to minimize downtime.
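To make the real-time requirement concrete: the end-to-end pipeline (capture, tracking, rendering) must fit within one frame interval, roughly 33 ms at 30 FPS. The Python sketch below illustrates how a per-frame latency budget could be checked; the per-stage timings are placeholder assumptions, not measurements from the actual system.

```python
# Illustrative latency-budget check for the real-time requirement.
# Stage timings below are assumed placeholders, not measured values.
FRAME_RATE_HZ = 30
FRAME_BUDGET_MS = 1000 / FRAME_RATE_HZ  # ~33.3 ms available per frame

stage_latency_ms = {
    "capture": 8.0,    # camera exposure + transfer (assumed)
    "tracking": 12.0,  # skeleton/interaction processing (assumed)
    "render": 9.0,     # animation rendering + projection (assumed)
}

total = sum(stage_latency_ms.values())
within_budget = total <= FRAME_BUDGET_MS
print(f"total {total:.1f} ms of {FRAME_BUDGET_MS:.1f} ms budget "
      f"-> {'OK' if within_budget else 'OVER'}")
```

A check like this makes it easy to see which stage must be optimized first if the pipeline falls behind the frame rate.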

Design Constraints

  • Economic: The system must use affordable motion tracking sensors to be cost-effective for small performance groups.
  • Environmental: The system should operate reliably in various indoor environments.
  • Health & Safety: Sensors must be safe to use around performers, ensuring no interference with their movements.
  • Sustainability: The system should be designed for easy maintenance and long-term use.
  • Usability: The installation process must be straightforward to minimize user frustration.
  • Hardware: Requires high-performance processors and efficient motion tracking sensors.
  • Software: Code must be optimized for efficiency and low latency.
  • Regulations: Must comply with data privacy regulations.
  • Internet Access: Updates require a stable internet connection for download and installation.
  • Documentation: Manuals must be clear and easy to understand for future cohorts.

Engineering Standards

  • Occupational Safety and Health Standards 1910.147 (Lockout/Tagout): In accordance with lockout/tagout policies, equipment usage will be limited to times when the course instructor or designated teaching assistants are available.
  • NFPA 70 and IEC 60364: High voltage power sources, as defined in NFPA 70, will be avoided as much as possible in order to minimize potential hazards. All electrical connections in the In/E Motion system must be properly packaged and grounded to avoid any risk of electrical shock to users. This includes ensuring that all exposed wires are insulated and all electrical components are housed in secure enclosures.
  • GDPR and CCPA: The system must ensure the privacy of all user data, including motion capture data and personal information. This includes anonymizing data where possible and providing users with control over their data.
  • ISO/IEC/IEEE 42010: The system architecture must be modular to allow for easy updates and extensions. This includes designing components that can be independently upgraded or replaced and providing APIs for integrating additional features.
  • ISO/IEC 25010: The system must be optimized for performance to ensure smooth operation during live performances. This includes optimizing the motion capture data processing, animation rendering, and system responsiveness.

System Overview

The system uses Intel RealSense D435 cameras with calibration tools to ensure precise, synchronized data capture, managed by the Nuitrack motion tracking software, which handles RGB and depth stream configuration. Data analysis components employ pattern recognition and interaction detection to interpret movements and generate corresponding animations. These animations are rendered in real time using Unity and projected onto the performance space through high-resolution projectors. The system's architecture breaks down as follows:

  • Input Layer: This layer captures raw motion data using Intel RealSense D435 cameras. The data is then transmitted to the Translation Layer for processing.
  • Translation Layer: In this layer, the raw data is processed by the Nuitrack software plugin, analyzed for interactions, and prepared for rendering; visualizations are created in Unity. This layer ensures that the captured movements are translated into animations.
  • Output Layer: The final processed animations are projected onto the performance space using high-resolution projectors. The Output Layer also facilitates interaction with participants, adjusting animations in real-time based on their movements.
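The three layers above can be sketched as a simple data pipeline. The Python mock below is purely illustrative of the layered structure, not the actual Unity/Nuitrack implementation; the joint names, the hand-raised rule, and the animation cue names are all hypothetical.

```python
# Illustrative mock of the Input -> Translation -> Output pipeline.
# All names and thresholds are hypothetical stand-ins for the real system.
from dataclasses import dataclass
from typing import Dict, List, Tuple

Joint = Tuple[float, float, float]  # (x, y, z) position

@dataclass
class Frame:
    """One captured frame; the real input layer reads from a D435 camera."""
    joints: Dict[str, Joint]

def input_layer(raw: Dict[str, Joint]) -> Frame:
    """Wrap raw camera data into a frame for downstream processing."""
    return Frame(joints=raw)

def translation_layer(frame: Frame) -> List[str]:
    """Detect interactions from joint positions (stand-in for Nuitrack)."""
    events = []
    head = frame.joints.get("head")
    hand = frame.joints.get("right_hand")
    if head and hand and hand[1] > head[1]:  # hand above head (y axis)
        events.append("hand_raised")
    return events

def output_layer(events: List[str]) -> str:
    """Map detected interactions to an animation cue for the projection."""
    return "burst_animation" if "hand_raised" in events else "idle_animation"

def pipeline(raw: Dict[str, Joint]) -> str:
    return output_layer(translation_layer(input_layer(raw)))

# Hand at y=1.8 is above the head at y=1.6, so the burst cue fires.
print(pipeline({"head": (0.0, 1.6, 2.0), "right_hand": (0.3, 1.8, 2.0)}))
```

Keeping each layer a pure function of the previous layer's output mirrors the modularity goal in the standards above: any one layer can be swapped or upgraded independently.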

Results

We created a demo package using Unity and the Nuitrack SDK. The package consists of about 20 scenes that imitate the flow of the performance, set to music from a previous iteration of the project. Several scenes demonstrate motion tracking and the different user-input methods provided by Nuitrack: object collision, gesture recognition, and 3D depth interaction form the foundation of the package. Our goal was to implement these interaction methods so that future students working on the In/E Motion project can easily extract any of them and apply them however they need during future development.
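One way the extract-and-reuse goal can be structured is a registry that decouples interaction handlers (collision, gesture, depth) from the scenes that use them. The Python sketch below is purely illustrative; the handler names and payloads are hypothetical, and the real package implements these interactions as Unity scenes.

```python
# Illustrative handler registry: each interaction handler can be pulled
# out and reused by any scene without changing the dispatch logic.
# Handler names and payloads are hypothetical examples.
handlers = {}

def register(name):
    """Decorator that records a handler under an interaction type."""
    def wrap(fn):
        handlers[name] = fn
        return fn
    return wrap

@register("gesture")
def on_gesture(data):
    # Placeholder: a real handler would trigger a scene animation.
    return f"gesture:{data}"

@register("collision")
def on_collision(data):
    return f"collision:{data}"

def dispatch(kind, data):
    """Route an interaction event to its registered handler."""
    return handlers[kind](data)

print(dispatch("gesture", "wave"))  # routes to on_gesture
```

A future team could add a new interaction type by registering one more handler, without touching the scenes that already exist.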

Future Work

Development of the In/E Motion system will continue as the performance date approaches and more details are defined. Working and testing with the performers, which begins in January 2025, is vital to the project. Our demo package will help future students understand the project's requirements and vision: each scene was designed with a specific purpose in mind, so future teams can easily adapt and customize it as needed. The package prioritizes flexibility and creative freedom while providing clear, practical guidance, giving future teams an easy-to-understand resource that captures the central theme of the performance and lets them build upon it with confidence.

Project Files

Project Charter
System Requirements Specification
Architectural Design Specification
Detailed Design Specification
Poster

References

  1. 3DiVi. “Nuitrack SDK: Nuitrack™ is a 3D tracking middleware developed by 3DiVi Inc.” GitHub, github.com/3DiVi/nuitrack-sdk. Accessed 20 Nov. 2024.
  2. “Unity Documentation.” Unity Documentation, docs.unity.com/. Accessed 20 Nov. 2024.

Steven McDermott