Pauwels, Karl

NVIDIA GPU Technology Conference (GTC), 2014

BibTeX Citation
Presentation

Discover how hundreds of objects can be simultaneously located and tracked in 3D through the real-time combination of visual simulation and visual perception. A tight integration of GPU graphics and compute has allowed us to continuously update a 3D scene model on the basis of dense visual cues, while at the same time feeding back information from this model to facilitate the cue estimation process itself. In this session we will describe (1) the low-level dense motion and stereo engine that can exploit such model feedback, (2) the 6DOF pose (location and orientation) estimation of hundreds of rigid objects at 40 Hz, and (3) how the same framework enables multi-camera and/or complex articulated object tracking. Throughout the session, we will pay special attention to implementation and system integration aspects of our real-time demonstrator system.
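The abstract describes a tight graphics-compute loop in which a rendered 3D scene model is compared against dense visual cues and the result is fed back into cue estimation and pose tracking. As a rough illustration of the kind of per-pixel comparison such model feedback enables, the sketch below (CUDA) computes signed depth residuals between a rendered model depth map and a stereo-measured depth map for one tracked object. All buffer names, the object-ID masking, and the residual form are illustrative assumptions for this sketch, not the demonstrator's actual code.

// Hypothetical sketch: compare a rendered model depth map with a
// stereo-measured depth map, producing per-pixel residuals that a
// pose optimizer could consume. Names and residual form are assumed
// for illustration only.
#include <cstdio>
#include <vector>
#include <cuda_runtime.h>

__global__ void depthResidualKernel(const float* renderedDepth,  // from the graphics pipeline
                                    const float* measuredDepth,  // from dense stereo
                                    const int*   objectId,       // per-pixel object label
                                    float*       residual,       // output: signed depth error
                                    int width, int height, int targetObject)
{
    int x = blockIdx.x * blockDim.x + threadIdx.x;
    int y = blockIdx.y * blockDim.y + threadIdx.y;
    if (x >= width || y >= height) return;

    int idx = y * width + x;
    // Only pixels covered by the tracked object and with valid stereo contribute.
    bool valid = (objectId[idx] == targetObject) && (measuredDepth[idx] > 0.0f);
    residual[idx] = valid ? (measuredDepth[idx] - renderedDepth[idx]) : 0.0f;
}

int main()
{
    const int W = 640, H = 480, N = W * H;
    // Synthetic host buffers standing in for the rendered and measured depth maps.
    std::vector<float> rendered(N, 1.00f), measured(N, 1.05f), res(N, 0.0f);
    std::vector<int>   ids(N, 1);

    float *dR, *dM, *dOut; int *dId;
    cudaMalloc(&dR,   N * sizeof(float));
    cudaMalloc(&dM,   N * sizeof(float));
    cudaMalloc(&dOut, N * sizeof(float));
    cudaMalloc(&dId,  N * sizeof(int));
    cudaMemcpy(dR,  rendered.data(), N * sizeof(float), cudaMemcpyHostToDevice);
    cudaMemcpy(dM,  measured.data(), N * sizeof(float), cudaMemcpyHostToDevice);
    cudaMemcpy(dId, ids.data(),      N * sizeof(int),   cudaMemcpyHostToDevice);

    dim3 block(16, 16);
    dim3 grid((W + block.x - 1) / block.x, (H + block.y - 1) / block.y);
    depthResidualKernel<<<grid, block>>>(dR, dM, dId, dOut, W, H, 1);
    cudaMemcpy(res.data(), dOut, N * sizeof(float), cudaMemcpyDeviceToHost);

    printf("residual at center pixel: %f\n", res[(H / 2) * W + W / 2]);
    cudaFree(dR); cudaFree(dM); cudaFree(dOut); cudaFree(dId);
    return 0;
}

In the demonstrator described in the talk, the rendered depth map would come from the GPU graphics pipeline via graphics-compute interop rather than from host memory, and the residuals would drive both the 6DOF pose update and the model feedback to the motion and stereo engine.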