Team Name
In/E Motion
Timeline
Fall 2024 – Spring 2025
Students
- Rudy Orozco – Computer Science
- Tara Chandrakasem – Computer Science
- Jacob Truelove – Computer Science
- Subham Pokhrel – Software Engineering
- Areeb Khan – Computer Science
- Simara Peyton – Computer Engineering
- Junya Ogawa – Computer Science
- Aiden Sparks – Computer Science
- David Akinmade – Software Engineering
- Ramon Torres – Computer Science
- Shanlum Shadan – Computer Science
- Jackson Pittman – Computer Science
- David Nguyen – Computer Science
- Tyler Crouch – Computer Science
Sponsor
Leah Mazur & Laurie Taylor – UTA Fine Arts & Dance
Abstract
The In/E Motion project explores the combination of real-time motion tracking and dynamic visual-effect projection to create immersive, interactive live performance experiences. Using high-resolution cameras and motion tracking software, the system captures and processes performers’ movements with minimal latency. This data is then used to generate visual effects that are projected onto the performance space in real time, enhancing the experience for the audience. Additionally, real-time monitoring of participant interaction allows for adaptive and interactive visual storytelling, setting a new standard for immersive live art.
Background
In traditional performance spaces, the interaction between performers, the audience, and the environment is often minimal. This lack of interactivity can lead to a less engaging experience for the audience and constrain the creative expression of performers. However, the advent of advanced motion tracking and real-time animated projection technologies presents a significant opportunity to transform these spaces. By integrating such technologies, we can create interactive environments that respond to the movements of both performers and audience members. With this vision in mind, the Fine Arts Department at the University of Texas at Arlington initiated a collaborative project with the Computer Science and Engineering Department, driven by their shared commitment to advancing the arts through innovation and technology. This project seeks to address current limitations and unlock new dimensions of artistic expression and audience engagement.
The In/E Motion project was launched in the spring of 2024 as a collaborative effort between the Arts and Computer Science departments. Over the course of nearly a year and a half, several senior design teams have contributed to the project, working toward a firm deadline set for early April 2025 to ensure everything was ready for the scheduled performances later that month. The previous teams laid the groundwork by exploring the feasibility of using motion tracking in performance spaces, testing the appropriate tools for the project, and developing prototype scenes aligned with the shared vision of the sponsors. Our team, active from Fall 2024 to Spring 2025, built upon this foundation by fully understanding the project’s vision and requirements, ultimately creating an automated compilation of scenes with integrated effects that successfully ran during the April 2025 performances with exceptional results. This project aligns with the Fine Arts Department’s mission to innovate and enhance the educational, cultural, and community impact of the arts. By creating an interactive and immersive environment, we aimed to revolutionize how performances are experienced and elevate the standards of artistic expression and audience engagement.
Project Requirements
- Real-Time Motion Capture: The system must capture and process motion data in real time with minimal latency, ensuring synchronization with live performances.
- Real-Time Animations: High-quality animations and effects must be generated based on the captured motion data, accurately reflecting the performers’ movements.
- Reliable System Performance: The system must operate reliably during live performances, equipped with dependable hardware and software components to minimize downtime.
- User-Friendly Interface: The system must provide a user-friendly interface for technicians to set up, control, and monitor the performance. The interface should include a control panel for starting, pausing, and stopping the system, a live preview of the motion data and animations, and settings for customization.
- Sophisticated Content: The exhibit that In/E Motion is built for is artistic in nature, and the effects the system produces must fit the atmosphere of the performance. In practice, this means allowing the effects to present more mature content.
- Real-Time Data Processing: The system must process motion data in real time with a maximum latency of 50 milliseconds from data capture to effect projection, ensuring that the animations stay synchronized with the performers’ movements without noticeable delay (a minimal latency-measurement sketch follows this list).
- System Setup and Shutdown Time: The system must be ready to use within 2 minutes of being powered on, including booting the processing unit, initializing the motion tracking sensors, and loading the software interface. Likewise, the system must complete its shutdown process within 1 minute, including saving any necessary data, powering down the sensors, and closing the software interface.
- Performance Optimization: The system must be optimized for performance to ensure smooth operation during live performances. This includes optimizing the motion capture data processing, rendering, and system responsiveness.
- Data Privacy: The system must ensure the privacy of all user data, including motion capture data and personal information. This includes anonymizing data where possible and providing users with control over their data.
- Support and Troubleshooting Manuals: Comprehensive support and troubleshooting manuals must be provided, detailing common issues, diagnostic procedures, and solutions. These manuals should be available in both digital and printed formats.
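To make the 50-millisecond latency requirement above testable during development, a probe like the minimal sketch below can log capture-to-render delay inside Unity. The `MotionFrame` type and the point at which `OnFrameRendered` is called are hypothetical stand-ins for whatever timestamps the tracking plugin actually provides.

```csharp
using UnityEngine;

// Hypothetical latency probe (illustrative only). Assumes the tracking layer
// stamps each motion frame with a capture time on the same clock as Unity's
// Time.realtimeSinceStartup.
public struct MotionFrame
{
    public float CaptureTime; // seconds since startup, set when the sensor frame arrived
}

public class LatencyProbe : MonoBehaviour
{
    const float MaxLatencySeconds = 0.050f; // the 50 ms requirement

    // Call this right after a frame's effects have been submitted for rendering.
    public void OnFrameRendered(MotionFrame frame)
    {
        float latency = Time.realtimeSinceStartup - frame.CaptureTime;
        if (latency > MaxLatencySeconds)
            Debug.LogWarning($"Capture-to-render latency {latency * 1000f:F1} ms exceeds the 50 ms budget");
    }
}
```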
Design Constraints
- Economic: The system must use affordable motion tracking sensors to be cost-effective for small performance groups.
- Environmental: The system should operate reliably in various indoor environments.
- Health & Safety: Sensors must be safe to use around performers, ensuring no interference with their movements.
- Sustainability: The system should be designed for easy maintenance and long-term use.
- Usability: The installation process must be straightforward to minimize user frustration.
- Hardware: Requires high-performance processors and efficient motion tracking sensors.
- Software: Code must be optimized for efficiency and low latency for real-time processing.
- Regulations: Must comply with data privacy regulations.
- Internet Access: Updates require a stable internet connection for download and installation.
- Documentation: Manuals must be clear and easy to understand for future cohorts.
Engineering Standards
- Occupational Safety and Health Standards 1910.147 (lockout/tagout): Due to lock removal policies, equipment usage will be limited to times when the course instructor or designated teaching assistants are available.
- NFPA 70 and IEC 60364: High voltage power sources, as defined in NFPA 70, will be avoided as much as possible in order to minimize potential hazards. All electrical connections in the In/E Motion system must be properly packaged and grounded to avoid any risk of electrical shock to users. This includes ensuring that all exposed wires are insulated and all electrical components are housed in secure enclosures.
- GDPR and CCPA: The system must ensure the privacy of all user data, including motion capture data and personal information. This includes anonymizing data where possible and providing users with control over their data.
- ISO 42010: The system architecture must be modular to allow for easy updates and extensions. This includes designing components that can be independently upgraded or replaced and providing APIs for integrating additional features (an illustrative interface sketch follows this list).
- ISO/IEC 25010: The system must be optimized for performance to ensure smooth operation during live performances. This includes optimizing the motion capture data processing, animation rendering, and system responsiveness.
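As a hedged illustration of the ISO 42010 modularity goal, each visual effect could sit behind a small interface so that scenes can swap implementations without touching other components. The interface below is an assumption made for illustration, not the project’s actual API.

```csharp
using UnityEngine;

// Hypothetical plug-in contract for visual effects (illustrative only).
// Any effect implementing this interface can be added, upgraded, or replaced
// independently, which is the intent behind the modularity standard.
public interface IMotionEffect
{
    void Initialize();              // called once when the scene loads the effect
    void Apply(Vector3 jointPos);   // called every frame with a tracked joint position (world space)
    void Shutdown();                // called when the scene unloads or swaps the effect
}
```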
System Overview
The system uses Xbox Kinect sensors with calibration tools to ensure precise and synchronized data capture, managed by the Nuitrack data capture software, which handles both RGB and depth streams. Data analysis components employ pattern recognition and interaction detection to interpret movements and generate corresponding animations. These animations are rendered in real time using Unity and projected onto the performance space through high-resolution projectors. Below is the breakdown of the system’s architecture, followed by a brief illustrative sketch of the Translation Layer:
- Input Layer: This layer captures raw motion data using Xbox Kinect sensors. The data is then transmitted to the Translation Layer for processing.
- Translation Layer: In this layer, the raw data is processed by the Nuitrack software plugin, analyzed for interactions, and prepared for rendering; the visualizations themselves are built in Unity. This layer ensures that the captured movements are translated into animations and effects.
- Output Layer: The final processed animations are projected onto the performance space using high-resolution projectors. The Output Layer also facilitates interaction with participants, adjusting animations in real time based on their movements.
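The sketch below shows the Translation Layer idea at its smallest: read one tracked joint and drive a Unity effect with it. It is a minimal sketch, assuming the Nuitrack Unity plugin is installed and exposes the `CurrentUserTracker` helper described in the Nuitrack tutorials; the effect wiring and scaling are illustrative rather than the project’s actual scene code.

```csharp
using UnityEngine;

// Minimal Translation Layer sketch: follow the performer's right hand with a
// particle effect. Assumes the Nuitrack Unity plugin is running and that
// CurrentUserTracker (from the Nuitrack tutorials) is available.
public class HandFollower : MonoBehaviour
{
    [SerializeField] ParticleSystem effect; // assigned in the Unity Inspector
    const float MmToMeters = 0.001f;        // Nuitrack reports joint positions in millimeters

    void Update()
    {
        if (CurrentUserTracker.CurrentUser == 0)
            return; // no performer currently tracked

        nuitrack.Joint hand = CurrentUserTracker.CurrentSkeleton.GetJoint(nuitrack.JointType.RightHand);
        Vector3 position = new Vector3(hand.Real.X, hand.Real.Y, hand.Real.Z) * MmToMeters;
        effect.transform.position = position; // the effect tracks the hand in real time
    }
}
```

In the full system, the same joint stream also feeds interaction detection, and each scene maps joints to its own effects rather than to a single particle system.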
Results
The In/E Motion project successfully concluded this semester after a year and a half of development. By capturing and interpreting performers’ movements in real time, the system generated dynamic visuals that responded instantly to human motion, resulting in a compelling and engaging performance for the client’s team. We developed a package using Unity and the Nuitrack SDK, consisting of approximately 18 scenes that mimic the flow of the performance.
The package includes several scenes featuring motion tracking and various user input methods enabled by Nuitrack, along with other features offered by Unity, such as object collision, gesture recognition, and 3D depth interactions. These formed the foundation of our system. Throughout the project, we encountered multiple challenges, one of the most significant being the original camera setup; the issue persisted until we replaced the small cameras with Xbox One Kinect devices, which significantly improved tracking performance.
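As one small example of the object-collision interactions mentioned above, a scene object can react when a tracked body part, mirrored into the scene as a collider, touches it. This is a generic Unity sketch under the assumption that tracked joints are represented by colliders tagged "Hand"; it is not the project’s exact scene code.

```csharp
using UnityEngine;

// Illustrative collision interaction: play a burst effect when a tracked hand
// collider enters this object's trigger volume. Tagging joints as "Hand" is an
// assumption for this sketch; Unity also requires a Rigidbody on one side of a
// trigger pair, so the hand objects are assumed to carry kinematic Rigidbodies.
[RequireComponent(typeof(Collider))]
public class TouchReactiveEffect : MonoBehaviour
{
    [SerializeField] ParticleSystem burst; // assigned in the Unity Inspector

    void OnTriggerEnter(Collider other)
    {
        if (other.CompareTag("Hand"))
            burst.Play(); // visual response to the performer's touch
    }
}
```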
Another challenge was determining the artistic direction of the generated visuals. Although the client encouraged creative freedom, our limited background in the arts required us to make interpretive assumptions about the desired aesthetic. Nevertheless, we successfully created an automated compilation of scenes that aligned with the artist’s vision, resolved technical issues, and effectively implemented the system during rehearsals and the final performance.
The performance was showcased in late April. More information is available at the following link:
https://www.uta.edu/academics/schools-colleges/liberal-arts/departments/theatre/box-office/in-e-motion
Future Work
Although the In/E Motion project has concluded, it has opened the door to new opportunities for similar interactive and immersive performances. This innovative production has set a benchmark for what is possible, demonstrating how future engineers can extend such systems to enable even more artistic and engaging experiences.
The scenes in the current performance can certainly be improved by adding layers of complexity, something that was not feasible for us due to time constraints and strict deadlines. These enhancements can be explored further if the sponsor chooses to continue developing the project.
Since many engineers involved had limited backgrounds in the arts, the concept of creative freedom was sometimes difficult to fully grasp. Moving forward, more direct discussions around artistic expectations will be essential to ensure better alignment between technical implementation and creative vision.
Project Files
Project Charter
System Requirements Specification
Architectural Design Specification
Detailed Design Specification
Poster
References
1. 3DiVi. “3DiVi/nuitrack-sdk: Nuitrack™ Is a 3D Tracking Middleware Developed by 3DiVi Inc.” GitHub, github.com/3DiVi/nuitrack-sdk. Accessed 20 Nov. 2024.
2. Unity Technologies. “Unity Documentation.” docs.unity.com/.