Applied Research Laboratories
Worked as a student technician and researcher on the Live to Synthetic Experience Technology project, a collaboration between Applied Research Laboratories at UT Austin (ARL:UT) and the Arts and Entertainment Technologies (AET) department.
Objectives
- Establish a collaborative partnership between ARL:UT and AET to combine expertise in data engineering, learning engineering, motion capture technologies, and game engine technologies to address gaps in experience capture and analysis in Army/DoD training environments.
- Identify opportunities for expanded collaboration with other UT entities to address science and technology gaps.
Strategies
- Combine ARL:UT's knowledge in data and learning engineering with AET's knowledge in motion capture and game engine technologies.
- Bridge the military and entertainment sectors to enhance human performance understanding and analysis.
What I Did
- Used photogrammetry to capture physical spaces and create 3D representations of them.
  - Exported the scanned meshes to Maya to clean up surface normals and patch missing faces (see the cleanup sketch after this list).
- Conducted live motion capture and translated the results into synthetic environments.
  - Used DeepMotion and Wonder Dynamics for markerless motion capture.
  - Exported motion capture data to Maya.
  - Exported the animation from Maya to a real-time game engine (see the export sketch after this list).
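The mesh cleanup step above could be scripted inside Maya. The following is a minimal Maya Python (maya.cmds) sketch of that kind of pass, assuming the photogrammetry scan was imported as a polygon object; the node name "scanMesh" and the numeric thresholds are placeholders, not values from the project.

```python
# Maya Python sketch: clean up an imported photogrammetry mesh.
# "scanMesh" is a hypothetical node name; thresholds are illustrative.
import maya.cmds as cmds

MESH = "scanMesh"  # hypothetical name of the imported scan

# Conform all face normals so they point consistently outward
# (normalMode=2 is "conform" for polyNormal).
cmds.polyNormal(MESH, normalMode=2, userNormalMode=0, constructionHistory=False)

# Soften normals to reduce faceting artifacts from the reconstruction.
cmds.polySoftEdge(MESH, angle=30, constructionHistory=False)

# Fill open borders to patch small runs of missing faces.
cmds.polyCloseBorder(MESH, constructionHistory=False)

# Merge near-duplicate vertices left over from reconstruction noise.
cmds.polyMergeVertex(MESH, distance=0.001, constructionHistory=False)
```

Larger holes or badly triangulated regions would still need manual retopology, so a script like this is only a first pass before hand cleanup.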
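For the final step, a common route from Maya to a real-time engine is a baked FBX export, since both Unity and Unreal ingest FBX. The sketch below assumes that route; the output path, skeleton root name ("Hips"), and frame range are placeholders rather than details from the original pipeline.

```python
# Maya Python sketch: bake mocap animation and export it to FBX
# for a real-time game engine. Names and paths are hypothetical.
import maya.cmds as cmds
import maya.mel as mel

EXPORT_PATH = "C:/exports/mocap_take01.fbx"  # hypothetical output path
ROOT_JOINT = "Hips"                          # hypothetical skeleton root

# Make sure the FBX plug-in is loaded before calling its MEL commands.
if not cmds.pluginInfo("fbxmaya", query=True, loaded=True):
    cmds.loadPlugin("fbxmaya")

# Bake the retargeted mocap onto the skeleton over the timeline range.
start = cmds.playbackOptions(query=True, minTime=True)
end = cmds.playbackOptions(query=True, maxTime=True)
cmds.bakeResults(ROOT_JOINT, hierarchy="below", time=(start, end),
                 simulation=True)

# Select the skeleton, configure the FBX exporter, and export it.
cmds.select(ROOT_JOINT, hierarchy=True)
mel.eval("FBXExportBakeComplexAnimation -v true")
mel.eval("FBXExportInAscii -v false")
mel.eval('FBXExport -f "{}" -s'.format(EXPORT_PATH))
```

On the engine side, the FBX is then imported as a skeletal mesh or animation clip and retargeted onto the in-game character.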