Structure from motion for an omni-directional multi-camera system and its applications.

Tomokazu Sato
(Nara Institute of Science and Technology, Japan)

In this talk, I will present a method for estimating the extrinsic camera parameters of an omni-directional multi-camera system (OMS) by tracking feature points. In principle, the extrinsic camera parameters of an OMS can be estimated by bundle adjustment, but only if good initial parameters are available together with the trajectories of the feature points. In this research, the initial parameters are estimated with a visual-SLAM approach by tracking image features before bundle adjustment is applied. Recent progress on a camera parameter estimation method that incorporates GPS measurements into bundle adjustment will also be introduced. In addition, several applications of camera parameter estimation for omni-directional vision, such as 'omni-directional novel-view synthesis', an 'omni-directional telepresence system without invisible areas', and a 'feature-landmark-based augmented reality system', will be demonstrated with videos.
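Bundle adjustment, as mentioned above, refines camera poses and 3-D point positions by minimizing the total reprojection error over the tracked feature points. The following is a minimal sketch of that cost function for a simple pinhole camera model; the function names, the single-focal-length intrinsics, and the toy data are illustrative assumptions, not details from the talk:

```python
def project(R, t, X, f, cx, cy):
    """Project a 3-D point X into a pinhole camera with pose (R, t).

    R is a 3x3 rotation matrix (nested lists), t a 3-vector,
    f the focal length, (cx, cy) the principal point.
    """
    # Camera-frame coordinates: Xc = R @ X + t
    xc = [sum(R[i][j] * X[j] for j in range(3)) + t[i] for i in range(3)]
    # Perspective division onto the image plane
    return (f * xc[0] / xc[2] + cx, f * xc[1] / xc[2] + cy)

def reprojection_error(poses, points, observations, f, cx, cy):
    """Sum of squared reprojection errors -- the cost bundle adjustment minimizes.

    observations: list of (camera_index, point_index, (u, v)) tuples,
    i.e. the tracked feature points, one entry per observed projection.
    """
    err = 0.0
    for ci, pi, (u, v) in observations:
        R, t = poses[ci]
        pu, pv = project(R, t, points[pi], f, cx, cy)
        err += (pu - u) ** 2 + (pv - v) ** 2
    return err
```

A nonlinear least-squares solver (e.g. Levenberg-Marquardt) would minimize this cost over all poses and points simultaneously, which is why good initial parameters from feature tracking are needed: the cost is highly non-convex, and a poor starting point can trap the optimization in a local minimum. For an OMS, the same cost is summed over every camera in the rig, with each camera's pose tied to the rig pose by a fixed, pre-calibrated relative transform.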