
Applied Research Laboratories

Worked as a student technician/researcher on the Live to Synthetic Experience Technology project, a collaboration between Applied Research Laboratories (ARL:UT) and the Arts and Entertainment Technologies (AET) department at UT Austin.

Objectives

  • Establish a collaborative partnership between ARL:UT and AET to combine expertise in data engineering, learning engineering, motion capture technologies, and game engine technologies to address gaps in experience capture and analysis in Army/DoD training environments.

  • Identify opportunities for expanded collaboration with other UT entities to address science and technology gaps.

Strategies

  • Combine ARL:UT's knowledge in data and learning engineering with AET's knowledge in motion capture and game engine technologies.

  • Bridge the military and entertainment sectors to improve how human performance is understood and analyzed.

What I Did

Used photogrammetry to capture physical spaces and create 3D representations of them.
  • Exported the resulting meshes to Maya to clean up surface normals and patch missing faces (a scripted sketch of that cleanup follows below).
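
The normal cleanup and hole patching can be scripted with Maya's Python API. Below is a minimal sketch of one way to do that pass; the mesh name scanMesh and the 30-degree soften angle are placeholders, not values from the actual project.

    import maya.cmds as cmds

    # Photogrammetry mesh to clean up; "scanMesh" is a placeholder name.
    mesh = "scanMesh"

    # Conform all face normals so they point in a consistent direction.
    cmds.polyNormal(mesh, normalMode=2, constructionHistory=False)

    # Close open borders to patch faces missing from occluded or unscanned areas.
    cmds.polyCloseBorder(mesh, constructionHistory=False)

    # Soften shading normals so the patched surface renders smoothly.
    cmds.polySoftEdge(mesh, angle=30, constructionHistory=False)

Conforming the normals before closing borders helps the new patch faces pick up a consistent orientation from their border edges.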
Conducted live motion capture and translated it to synthetic environments.
  • Used DeepMotion and Wonder Dynamics for markerless motion capture.
  • Exported the motion capture data to Maya.
  • Exported the animation from Maya to a real-time game engine (a scripted sketch of that hand-off follows below).
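
The Maya leg of that pipeline can also be scripted. The sketch below uses maya.cmds and the FBX plug-in's MEL commands to import a mocap take, bake the keys onto the skeleton, and export the result for the engine; the file paths and the "root" joint name are placeholders.

    import maya.cmds as cmds
    import maya.mel as mel

    # Make sure the FBX plug-in is loaded before using FBX commands.
    if not cmds.pluginInfo("fbxmaya", query=True, loaded=True):
        cmds.loadPlugin("fbxmaya")

    # Import the markerless mocap take (path is a placeholder).
    cmds.file("C:/mocap/take_01.fbx", i=True, type="FBX", ignoreVersion=True)

    # Bake animation onto the skeleton over the current timeline range;
    # "root" is a placeholder for the skeleton's root joint.
    start = cmds.playbackOptions(query=True, minTime=True)
    end = cmds.playbackOptions(query=True, maxTime=True)
    cmds.bakeResults("root", hierarchy="below", time=(start, end), simulation=True)

    # Export the baked take for the game engine.
    cmds.select("root", hierarchy=True)
    mel.eval('FBXExportBakeComplexAnimation -v true;')
    mel.eval('FBXExport -f "C:/export/take_01_baked.fbx" -s;')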
Collected a select set of measurements for Army combat-related tasks at the level of fidelity needed to properly evaluate the target measures.

The videos above show our Unreal and Unity prototypes. I was responsible for developing both, including all of the scripting and the implementation of the mocap data and digital twins; a sketch of the engine-side animation import follows below.
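
On the engine side, Unreal's editor Python API can automate bringing a baked FBX in as an animation asset. The sketch below shows how such an import could look; it is an assumption, not the actual prototype code, and the content paths and Soldier_Skeleton asset are placeholders.

    import unreal

    # Build an automated import task for the baked mocap FBX (paths are placeholders).
    task = unreal.AssetImportTask()
    task.filename = "C:/export/take_01_baked.fbx"
    task.destination_path = "/Game/Mocap"
    task.automated = True
    task.save = True

    # Import animation only, retargeting onto an existing skeleton asset.
    options = unreal.FbxImportUI()
    options.import_mesh = False
    options.import_animations = True
    options.mesh_type_to_import = unreal.FBXImportType.FBXIT_ANIMATION
    options.skeleton = unreal.load_asset("/Game/Characters/Soldier_Skeleton")
    task.options = options

    # Run the import through the editor's asset tools.
    unreal.AssetToolsHelpers.get_asset_tools().import_asset_tasks([task])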

Current Work

Unreal Engine 5
I am currently designing and developing a prototype game to prepare soldiers for live training. The game builds on some of my previous work at ARL and on existing Army training games.

Updates to come