
Applied Research Laboratories

Worked as a student technician/researcher on the Live to Synthetic Experience Technology project in collaboration with the Arts and Entertainment Technologies department at UT Austin.

Objectives

  • Establish a collaborative partnership between ARL:UT and AET to combine expertise in data engineering, learning engineering, motion capture technologies, and game engine technologies to address gaps in experience capture and analysis in Army/DoD training environments.

  • Identify opportunities for expanded collaboration with other UT entities to address science and technology gaps.

Strategies

  • Combine ARL:UT's knowledge in data and learning engineering with AET's knowledge in motion capture and game engine technologies.

  • Bridge the military and entertainment sectors to enhance human performance understanding and analysis.

What I Did

Used photogrammetry to capture physical spaces and create 3D representations of them
  • Exported scans to Maya to clean up surface normals and patch missing faces
Conducted live motion capture and translated it into synthetic environments
  • Used DeepMotion and Wonder Dynamics for markerless motion capture
  • Exported motion capture data to Maya, where animations and 3D scans were cleaned up
  • Exported animation from Maya to a real-time game engine
  • Deployed Wonder Dynamics mocap output into an accurate digital twin of the LIM
  • Utilized Unity's vector library to capture data metrics
  • Created a camera sequence in Unity to help visualize the space
Before/after: videos shown pre- and post-background removal
Utilized the rotoscoping tools in Adobe After Effects to remove backgrounds from select videos. This was extremely useful for videos in which the mocap software had difficulty detecting the subjects.

Current Work

Unreal Engine 5
I am currently working on the development and design of a prototype game to prepare soldiers for live training. The game builds on some of my previous work at ARL and on existing Army training games. Some of the work cannot be shared due to lab policies.

Updates to come
Prototype Replay System
This is a prototype replay system built in Unreal Engine 5 using mocap data from Wonder Dynamics. It improves on the Unity prototype, adding a first-person view, pause/play controls, a free camera mode, and the ability to spawn more actors in the scene.
  • Pause and play mocap playback via button press
  • 'Q' and 'E' keys to switch between camera viewpoints
  • Synced animation sequences with the original video (pauses/plays in step with the animations)
  • Utilized vector math to calculate distances and gaze direction, and to check whether the other subject is in view
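The replay system itself is built in Unreal Engine 5 (C++/Blueprints), which can't be run standalone here, so the following is a language-neutral Python sketch of the vector math the last bullet describes: Euclidean distance between two subjects, and a dot-product test for whether a target falls within an observer's field of view. All function names and the default field-of-view angle are illustrative, not taken from the project code.

```python
import math

def distance(a, b):
    """Euclidean distance between two 3D points (tuples)."""
    return math.sqrt(sum((ax - bx) ** 2 for ax, bx in zip(a, b)))

def normalize(v):
    """Scale a vector to unit length."""
    n = math.sqrt(sum(c * c for c in v))
    return tuple(c / n for c in v)

def in_view(observer_pos, gaze_dir, target_pos, fov_deg=90.0):
    """True if the target lies within the observer's field of view.

    Compares the angle between the gaze direction and the vector to
    the target against half the (assumed) field-of-view angle, using
    the dot product of the two unit vectors.
    """
    to_target = normalize(tuple(t - o for t, o in zip(target_pos, observer_pos)))
    gaze = normalize(gaze_dir)
    cos_angle = sum(g * t for g, t in zip(gaze, to_target))
    return cos_angle >= math.cos(math.radians(fov_deg / 2.0))
```

In-engine, the same checks map onto built-in vector types (e.g. dot products of forward vectors against the normalized offset between two actors).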
Continued Work
  • Reading CSV eye-tracking data into Unreal for additional metrics
    • Utilizing Unreal's material and render-texture capabilities to generate heatmaps
  • Pairing motion capture data with geospatial data from Cesium
  • Contributing to the design and programming of a training game in Unity
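The core of the eye-tracking heatmap step is parsing gaze samples from CSV and binning them into a 2D grid of hit counts; in Unreal those counts would be written into a render texture and color-mapped by a material. Below is a minimal Python sketch under assumed conditions: the CSV has columns named 'x' and 'y' holding normalized screen coordinates in [0, 1] (real tracker exports vary), and both function names are hypothetical.

```python
import csv
import io
from collections import Counter

def load_gaze_points(csv_text):
    """Parse eye-tracking samples from CSV text.

    Assumes 'x' and 'y' columns with normalized coordinates in [0, 1].
    """
    reader = csv.DictReader(io.StringIO(csv_text))
    return [(float(row["x"]), float(row["y"])) for row in reader]

def gaze_heatmap(points, grid=4):
    """Bin gaze points into a grid x grid array of hit counts.

    Each normalized (x, y) sample is mapped to a cell index; the
    resulting counts are the raw data behind a heatmap texture.
    """
    counts = Counter()
    for x, y in points:
        col = min(int(x * grid), grid - 1)  # clamp x == 1.0 to the last column
        row = min(int(y * grid), grid - 1)
        counts[(row, col)] += 1
    return [[counts[(r, c)] for c in range(grid)] for r in range(grid)]
```

A smoothed version would splat a Gaussian kernel per sample instead of a single cell increment, which is what gives engine heatmaps their soft falloff.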