Using camera recordings, the current work seeks to develop an automated method for generating virtual warfighter models to predict blast exposure in weapon training scenarios. The key question is whether we can accelerate the creation of these virtual service member models for rapid exposure estimation. This work uses the latest machine learning-based tools for 3D human pose estimation from a single camera.
These tools allow us to extract the position and posture of each person in an image, streamlining the process of simulating a blast exposure. Other sensing modalities are difficult to deploy because service members in training do not have time to don many different sensors. A camera, however, can easily record a military training session, so our work leverages that modality to overcome the limitations of other sensor types.
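Once a pose estimator returns 3D joint positions, the immediate quantity of interest for blast exposure is the standoff distance from each anatomical region to the weapon muzzle. The sketch below illustrates that step under stated assumptions: the joint names, coordinates, and muzzle location are hypothetical placeholders, not output of any specific pose estimation tool named in this work.

```python
import numpy as np

# Hypothetical 3D joint positions (meters, world frame), formatted as a
# single-camera pose estimator might return them. All values are
# illustrative only.
pose = {
    "head":           np.array([0.0, 0.0, 1.70]),
    "left_shoulder":  np.array([-0.20, 0.0, 1.50]),
    "right_shoulder": np.array([0.20, 0.0, 1.50]),
    "pelvis":         np.array([0.0, 0.0, 1.00]),
}

def standoff_distances(pose, muzzle):
    """Distance (m) from each tracked anatomical point to the muzzle.

    These per-region distances are the kind of geometric input a
    fast-running blast overpressure model would consume.
    """
    return {name: float(np.linalg.norm(p - muzzle)) for name, p in pose.items()}

muzzle = np.array([0.60, 0.0, 1.50])  # assumed muzzle location (hypothetical)
d = standoff_distances(pose, muzzle)
print(round(d["head"], 4))            # head-to-muzzle standoff in meters
```

The per-region distances (head, shoulders, torso) map naturally onto the anatomical-region overpressure estimates described below.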
Our blast overpressure (BOP) tool is the first computational tool to predict blast overpressure on service members using fast-running models optimized with field pressure sensor data. It systematically replicates service members' posture and position during weapon firing, aiming to accurately estimate overpressure on different anatomical regions of the service member. We aim to enhance and transition the current blast overpressure tool into a real-time blast overpressure monitoring product for cumulative exposure.
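The source does not specify the internal form of the tool's fast-running models, so as a generic illustration only, the sketch below implements the Friedlander waveform, a standard closed-form description of free-field blast overpressure versus time; the parameter values are hypothetical.

```python
import math

def friedlander(t, peak_overpressure, positive_duration, b=1.0):
    """Friedlander waveform: p(t) = P_s * (1 - t/t_d) * exp(-b * t/t_d).

    A classic fast-running form for free-field blast overpressure.
    t: time since shock arrival (s)
    peak_overpressure: P_s, peak overpressure at arrival (kPa)
    positive_duration: t_d, duration of the positive phase (s)
    b: decay coefficient (dimensionless)
    Returns overpressure in the same units as peak_overpressure.
    """
    if t < 0.0:
        return 0.0  # shock has not yet arrived
    tau = t / positive_duration
    return peak_overpressure * (1.0 - tau) * math.exp(-b * tau)

# Hypothetical example: 150 kPa peak, 2 ms positive phase.
p_peak = friedlander(0.0, 150.0, 0.002)    # overpressure at shock arrival
p_end = friedlander(0.002, 150.0, 0.002)   # end of positive phase (zero crossing)
```

In practice, a tool of this kind would fit parameters such as the peak overpressure and decay against field pressure sensor data, then evaluate the model at each anatomical region's standoff distance.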
We will also work toward correlating this overpressure dose with TBI response for injury risk assessment.