A Framework for Video-Driven Crowd Synthesis

03/13/2018
by Jordan Stadler, et al.

We present a framework for video-driven crowd synthesis. Motion vectors extracted from the input crowd video are processed to compute global motion paths that encode the dominant motions observed in the video. These paths are then fed into a behavior-based crowd simulation framework, which synthesizes crowd animations that respect the motion patterns observed in the video. Our system produces 3D virtual crowds by animating virtual humans along the trajectories returned by the crowd simulation framework. We also propose a new metric for comparing the "visual similarity" between the synthesized crowd and the exemplar crowd. We demonstrate the proposed approach on crowd videos collected under different settings.
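The abstract does not specify how motion vectors are extracted, so the sketch below is only an illustrative first step, assuming dense optical flow (OpenCV's Farneback method) accumulated on a coarse grid to approximate the dominant motion directions that could seed global motion paths; the grid size, function name, and video path are hypothetical choices, not the authors' method.

```python
import cv2
import numpy as np

def dominant_motion_field(video_path, grid=16):
    """Accumulate dense optical flow over a video and return a coarse
    per-cell mean motion vector field (grid x grid x 2)."""
    cap = cv2.VideoCapture(video_path)
    ok, frame = cap.read()
    if not ok:
        raise IOError(f"Cannot read video: {video_path}")
    prev = cv2.cvtColor(frame, cv2.COLOR_BGR2GRAY)
    h, w = prev.shape
    acc = np.zeros((grid, grid, 2), dtype=np.float64)  # summed flow per cell
    count = 0

    while True:
        ok, frame = cap.read()
        if not ok:
            break
        gray = cv2.cvtColor(frame, cv2.COLOR_BGR2GRAY)
        # Dense optical flow between consecutive frames (Farneback).
        flow = cv2.calcOpticalFlowFarneback(
            prev, gray, None, 0.5, 3, 15, 3, 5, 1.2, 0)
        # Average the flow inside each grid cell and accumulate.
        for i in range(grid):
            for j in range(grid):
                ys = slice(i * h // grid, (i + 1) * h // grid)
                xs = slice(j * w // grid, (j + 1) * w // grid)
                acc[i, j] += flow[ys, xs].reshape(-1, 2).mean(axis=0)
        prev = gray
        count += 1

    cap.release()
    return acc / max(count, 1)

# Example usage (path is a placeholder):
# field = dominant_motion_field("crowd_clip.mp4")
# print(field.shape)  # (16, 16, 2) mean (dx, dy) per cell
```

A field like this only captures average local motion; turning it into the global motion paths described in the abstract would additionally require clustering or tracing trajectories through the field, which the abstract leaves unspecified.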
