Christopher Geyer, Shankar Sastry, Ruzena Bajcsy
University of California, Berkeley

By combining notions from geometry, signal processing and harmonic
analysis, we propose a new method for the estimation of the motion
between two omnidirectional cameras. We show that a densely sampled
likelihood function can be obtained on the space of essential matrices
via a convolution of two signals. The first signal expresses the
epipolar geometry of two views, and the second signal encodes the
similarity of intensities (or some other measure) between a pixel in
one image and a pixel in another image. The proposed method is
analogous to a Hough or Radon transform on the space of essential
matrices, and is a first step toward integrating signal processing and
geometry. Presumably for computational reasons, we are not aware of any
previous attempt at a Hough transform on the space of essential
matrices, and hence know of no directly comparable work. Nevertheless,
there are some
similarities between the proposed method and the recent work of
Makadia and Daniilidis [Makadia and Daniilidis, CVPR 03] and Wexler et
al. [Wexler et al., CVPR 03]. The former proposes rotation estimation
using a shift theorem on SO(3), while the latter investigates the
estimation of arbitrary epipolar geometries. The
breakthrough in this paper is that we can efficiently compute the
convolution using spherical and rotational harmonic representations of
the signals. Estimation using the proposed method has several
advantages: we can automatically represent ambiguities; we are able
to estimate multiple motions; and we obtain a framework which can take
into account arbitrary, non-Gaussian sensor noise models such as
simple blob correspondence.
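
To make the accumulation concrete, the sketch below implements a brute-force
version of the idea on synthetic data: it scores a grid of candidate essential
matrices by summing, over every pair of rays in the two views, a similarity
weight times a kernel of the epipolar residual. This is a minimal sketch, not
the authors' implementation; the two-parameter motion grid, the Gaussian
kernel, the synthetic similarity weights, and all function names are
assumptions made for the example, whereas the efficiency of the actual method
comes from carrying out this accumulation in the spherical and rotational
harmonic domain rather than by exhaustive search.

```python
import numpy as np

def essential_from_rt(R, t):
    """Essential matrix E = [t]_x R for a candidate rotation R and translation t."""
    tx = np.array([[0.0, -t[2], t[1]],
                   [t[2], 0.0, -t[0]],
                   [-t[1], t[0], 0.0]])
    return tx @ R

def rot_z(a):
    """Rotation by angle a about the z-axis (the toy 1-D rotation family used here)."""
    c, s = np.cos(a), np.sin(a)
    return np.array([[c, -s, 0.0], [s, c, 0.0], [0.0, 0.0, 1.0]])

def epipolar_likelihood(E, p, q, w, sigma=0.05):
    """Score one candidate essential matrix: sum over all ray pairs (p_i, q_j)
    of the similarity weight w[j, i] times a kernel of the epipolar residual
    q_j^T E p_i.  The Gaussian kernel and its width sigma are assumed choices."""
    resid = q @ E @ p.T                      # residuals, shape (len(q), len(p))
    return float(np.sum(w * np.exp(-(resid / sigma) ** 2)))

# Synthetic data: unit rays p in camera 1, the same scene points re-imaged in
# camera 2 under a known motion (R_true, t_true), and similarity weights that
# simply mark the true one-to-one correspondences.
rng = np.random.default_rng(0)
p = rng.normal(size=(40, 3))
p /= np.linalg.norm(p, axis=1, keepdims=True)
R_true, t_true = rot_z(0.3), np.array([1.0, 0.0, 0.0])
X = p * rng.uniform(1.0, 5.0, size=(40, 1))  # 3-D points along the camera-1 rays
q = (R_true @ X.T).T + t_true                # the same points seen from camera 2
q /= np.linalg.norm(q, axis=1, keepdims=True)
w = np.eye(len(q))                           # idealized similarity signal

# Densely sample a toy two-parameter grid of motions (z-axis rotation angle,
# in-plane translation direction) and accumulate the likelihood, Hough-style.
# The mirrored translation direction scores identically, which is the kind of
# ambiguity the likelihood surface makes explicit.
angles = np.linspace(0.0, 2.0 * np.pi, 60, endpoint=False)
best = max(
    ((a, b) for a in angles for b in angles),
    key=lambda ab: epipolar_likelihood(
        essential_from_rt(rot_z(ab[0]),
                          np.array([np.cos(ab[1]), np.sin(ab[1]), 0.0])),
        p, q, w),
)
print("likelihood peak (rotation angle, translation angle):", best)
```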

