Pedestrian Behavior Modeling
As part of the research project SAFIR IP6 Mirasoft, we aim to better understand pedestrian behavior and how automated driving functions can be tested in everyday traffic situations, which inevitably involve pedestrians and other vulnerable road users (VRUs). Instead of placing a real human in the experiment, as in our MiRE proof-of-concept test environment, the goal is to develop a pedestrian behavior model.
Pedestrian Motion Capture
One method we employ for pedestrian behavior modeling is motion capture. Using an Xsens MVN motion capture suit, we place participants in different traffic scenarios and record their movement behavior.
Using motion capture (Xsens MVN), we recorded the behavior of pedestrians in crossing scenarios.
Using an inverse kinematics solver, the recorded motion is transferred to a virtual avatar.
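To give an idea of what such a solver does, here is a minimal analytic two-bone inverse kinematics sketch in Python. It is a simplified 2D illustration of the general principle (full-body retargeting uses many more joints and constraints), and all function names are my own, not from any specific IK library:

```python
import math

def two_bone_ik(l1, l2, tx, ty):
    """Analytic two-bone IK in the plane: return (shoulder, elbow) joint
    angles in radians so that a chain of segment lengths l1 and l2,
    rooted at the origin, reaches the target point (tx, ty).
    Simplified sketch of the kind of solver used to map captured
    motion onto an avatar skeleton."""
    d = math.hypot(tx, ty)
    # Clamp the target distance onto the reachable range [|l1-l2|, l1+l2].
    d = max(abs(l1 - l2), min(l1 + l2, d))
    # Law of cosines gives the elbow bend angle.
    cos_elbow = (d * d - l1 * l1 - l2 * l2) / (2 * l1 * l2)
    elbow = math.acos(max(-1.0, min(1.0, cos_elbow)))
    # Shoulder angle: direction to the target, corrected for the bend.
    shoulder = math.atan2(ty, tx) - math.atan2(
        l2 * math.sin(elbow), l1 + l2 * math.cos(elbow)
    )
    return shoulder, elbow

def forward(l1, l2, shoulder, elbow):
    """Forward kinematics of the same chain, useful to verify a solution."""
    ex = l1 * math.cos(shoulder)
    ey = l1 * math.sin(shoulder)
    return (ex + l2 * math.cos(shoulder + elbow),
            ey + l2 * math.sin(shoulder + elbow))
```

Running the solver for a reachable target and feeding the resulting angles back through the forward kinematics reproduces the target position.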
Participants were immersed in virtual reality to elicit naturalistic movement and behavior.
This is an example test run from the motion capture study we conducted.
The recorded motion data can be used, for example, to create high-fidelity avatars for pedestrian or driving simulation. Primarily, however, we aim to identify common behavioral patterns by gathering this movement data from a larger sample.
Pedestrian Model for Distributed Simulation Architectures
As part of these research efforts, we also aim to make this pedestrian model accessible.
To do this, I have been contributing to the ASAM Open Simulation Interface (OSI) standard and helped develop a pedestrian model for OSI.
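For illustration, the data such an OSI pedestrian description carries might look roughly like the following protobuf-style sketch. The message and field names are an approximation from memory of the pedestrian extensions introduced around OSI 3.5 and should be checked against the official ASAM OSI specification rather than taken verbatim:

```proto
// Illustrative sketch only -- not the normative OSI definition.
message PedestrianAttributes {
  // Offset from the bounding-box center to the skeleton root.
  optional Vector3d bbcenter_to_root = 1;
  // Pose of each tracked skeleton point (hips, knees, head, ...).
  repeated SkeletonPoint skeleton_point = 2;

  message SkeletonPoint {
    optional Type type = 1;             // which joint this point represents
    optional Vector3d position = 2;     // joint position
    optional Orientation3d orientation = 3;  // joint orientation
  }
}
```

Exposing the skeleton through a standardized interface like this is what allows a pedestrian behavior model to plug into distributed simulation architectures, where different tools exchange ground-truth and sensor data via OSI messages.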