Localization of multiple near-duplicate fragments of images

Mariusz Paradowski, Andrzej Sluzek
(Wroclaw University of Technology, Poland)

Abstract:

One of the advanced techniques in visual information retrieval is the detection of near-duplicate fragments, where the objective is to identify images containing almost exact copies of unspecified fragments of a query image. Such near-duplicates typically indicate the presence of the same object in both images. Thus, the differences between near-duplicate fragments are assumed to result either from image-capturing conditions (illumination, viewpoint, camera parameters) or from changes of the object itself (e.g. its location or elastic deformations). The presented method works on purely visual data, i.e. no semantic or a priori information is needed. Our approach does not require any models of the objects of interest, and it works on arbitrary images containing multiple objects against random, diversified backgrounds. Two cases are discussed. First, we assume that near-duplicates are (approximately) related by affine transformations, i.e. the underlying objects are locally planar. Second, we allow more random distortions so that a wider range of objects (including deformable ones) can be considered. Thus, we exploit either image geometry or image topology.
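
As a rough illustration of the first (affine) case only, the sketch below matches local features between two images and accepts a shared fragment when a robustly fitted affine transformation explains enough matches. It is not the authors' algorithm: SIFT, OpenCV, RANSAC, the 0.75 ratio threshold and the function name find_affine_near_duplicate are assumptions chosen for the example.

```python
# Minimal sketch (illustrative, not the paper's method): verify an affine
# near-duplicate fragment between two images via local feature matching
# and RANSAC-based affine estimation, using OpenCV.
import cv2
import numpy as np

def find_affine_near_duplicate(query_path, target_path, min_inliers=15):
    """Return (2x3 affine matrix, inlier mask) if a shared fragment is found, else None."""
    img1 = cv2.imread(query_path, cv2.IMREAD_GRAYSCALE)
    img2 = cv2.imread(target_path, cv2.IMREAD_GRAYSCALE)

    # Detect and describe local keypoints (any affine-robust detector would do).
    sift = cv2.SIFT_create()
    kp1, des1 = sift.detectAndCompute(img1, None)
    kp2, des2 = sift.detectAndCompute(img2, None)
    if des1 is None or des2 is None:
        return None

    # Match descriptors and keep distinctive matches (Lowe's ratio test).
    matcher = cv2.BFMatcher(cv2.NORM_L2)
    knn = matcher.knnMatch(des1, des2, k=2)
    good = [m[0] for m in knn if len(m) == 2 and m[0].distance < 0.75 * m[1].distance]
    if len(good) < min_inliers:
        return None

    src = np.float32([kp1[m.queryIdx].pt for m in good])
    dst = np.float32([kp2[m.trainIdx].pt for m in good])

    # Robustly estimate an affine transform; enough inliers indicate a near-duplicate fragment.
    A, inliers = cv2.estimateAffine2D(src, dst, method=cv2.RANSAC,
                                      ransacReprojThreshold=3.0)
    if A is None or int(inliers.sum()) < min_inliers:
        return None
    return A, inliers
```

The topological case described in the abstract would relax the single global affine model (e.g. by checking only the spatial consistency of neighbouring matches), which is why deformable objects can be handled there.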