Omnidirectional imaging: geometry and signal analysis

Kostas Daniilidis (University of Pennsylvania, USA)

Immersive visualization is rapidly gaining popularity with the spread of platforms that let users switch among viewpoints and viewing directions. Immersive sensing is best described by the notion of the plenoptic function. In this talk I will present ways to analyze samplings of the plenoptic function beyond the traditional perspective plane, starting from omnidirectional systems with a single viewpoint.
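For readers unfamiliar with the term, a standard formalization (following Adelson and Bergen; the abstract itself does not spell it out) records the radiance seen along every ray:

    P(\theta, \phi, \lambda, t;\; x, y, z)

that is, radiance as a function of viewing direction (theta, phi), wavelength lambda, time t, and viewpoint (x, y, z). A perspective camera fixes the viewpoint and samples only a narrow cone of directions; a single-viewpoint omnidirectional sensor samples the full sphere of directions at one viewpoint.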

I will present a new unifying theory of panoramic image formation covering all central omnidirectional sensors as well as any conventional pinhole camera. The model is based on a spherical projection followed by a projection from the sphere to the omnidirectional plane. The natural domain in which to process an omnidirectional signal is the sphere, considered as a homogeneous space under the action of the rotation group. By applying a Fourier transform on rotations, we can obtain attitude information directly, without point or line correspondences.
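The following is a minimal sketch of the sphere-then-plane projection in one common parameterization; the function name project_unified, the mirror parameter xi, and the sign and scale conventions are illustrative assumptions, as the talk does not fix a specific formula:

    import numpy as np

    def project_unified(X, xi=1.0, f=1.0):
        """Central projection in the sphere-then-plane form.

        Step 1: project the 3D point onto the unit viewing sphere.
        Step 2: reproject from a point at distance xi along the optical
                axis onto the image (omnidirectional) plane.
        xi = 0 recovers the conventional pinhole camera; xi = 1 gives
        the parabolic catadioptric case (stereographic projection);
        intermediate values cover other central mirror geometries.
        """
        X = np.asarray(X, dtype=float)
        s = X / np.linalg.norm(X)      # point on the viewing sphere
        u = f * s[0] / (s[2] + xi)     # projection to the image plane
        v = f * s[1] / (s[2] + xi)
        return np.array([u, v])

    p_persp = project_unified([1.0, 2.0, 5.0], xi=0.0)  # pinhole
    p_para  = project_unified([1.0, 2.0, 5.0], xi=1.0)  # parabolic

The attitude result rests on how rotations act on spherical harmonic coefficients. As a toy illustration of the principle (the full method correlates over all three rotation angles; here only a rotation about the z-axis is recovered, and the coefficients are synthetic):

    import numpy as np

    # Toy spherical harmonic coefficients f_{l,m} for l = 2, m = -2..2
    rng = np.random.default_rng(0)
    f_lm = rng.normal(size=5) + 1j * rng.normal(size=5)
    m = np.arange(-2, 3)

    alpha_true = 0.7                              # rotation about the z-axis
    g_lm = np.exp(-1j * m * alpha_true) * f_lm    # coefficients after rotation

    # The rotation appears as a phase shift; read it off any m != 0 coefficient
    alpha_est = -np.angle(g_lm[m == 1] / f_lm[m == 1])[0]
    print(alpha_est)                              # recovers 0.7

No point or line correspondences enter: the estimate comes directly from the global coefficients of the two spherical signals.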

To describe more general mappings of omnidirectional planes, we consider a new representation in which the omnidirectional plane is lifted to a 3D circle space, where point-preserving transformations can be modeled as elements of the Lorentz group SO(3,1). Such a mapping also models the intrinsic geometry of an omnidirectional camera, and it turns out to drastically simplify the problem of 3D motion estimation. The additional robustness afforded by a huge field of view makes such sensors irreplaceable in navigation tasks.
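One standard lifting from conformal geometry gives the flavor of this construction; the exact convention used in the talk's circle space may differ, and the helper names below are illustrative. An image point lifts to a null vector of Minkowski space R^{3,1} (a point is a circle of zero radius), and Lorentz transformations, which preserve the Minkowski inner product, induce circle-preserving maps of the plane:

    import numpy as np

    # Minkowski metric on R^{3,1}: diag(1, 1, 1, -1)
    G = np.diag([1.0, 1.0, 1.0, -1.0])

    def lift(u, v):
        """Lift an image point to the circle space."""
        s = u * u + v * v
        return np.array([u, v, (s - 1.0) / 2.0, (s + 1.0) / 2.0])

    def unlift(x):
        """Map a null vector back to the plane (rescale so the
        lift normalization x4 - x3 = 1 holds)."""
        x = x / (x[3] - x[2])
        return x[0], x[1]

    # A Lorentz boost: one example of an element of SO(3,1)
    phi = 0.3
    L = np.array([[1, 0, 0, 0],
                  [0, 1, 0, 0],
                  [0, 0, np.cosh(phi), np.sinh(phi)],
                  [0, 0, np.sinh(phi), np.cosh(phi)]])

    x = lift(0.5, -0.2)
    print(x @ G @ x)        # ~0: point lifts are null vectors
    print(unlift(L @ x))    # the transformed image point

Because the group action is linear in the lifted space, transformations that look nonlinear on the omnidirectional plane become matrix multiplications, which is the source of the simplification in 3D motion estimation.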