Detection and segmentation of dynamic texture in video.

Dmitry Chetverikov (SZTAKI Budapest, Hungary)

We propose brightness conservation for modeling dynamic textures and describe a simple method to calculate a non-regular optical flow based on this concept. This non-regular flow can model the diffusion of brightness into neighboring image regions and thus provides a more general description than regular optical flow based on brightness constancy. Using motion analysis that employs both regular and non-regular optical flow computation, we propose a method for detecting dynamic textures in video. Segmentation of video frames into regions obeying different non-parametric motion models is achieved with a variational level-set technique.
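To make the distinction between the two constraints concrete, the following is a minimal sketch (not the authors' implementation) of how per-pixel residuals of brightness constancy and brightness conservation could be evaluated for a given flow field. The frames are assumed to be grayscale numpy arrays and the flow (u, v) is assumed to come from any optical flow estimator; the function names and the thresholding at the end are purely illustrative. Regions where the constancy residual stays high while the conservation residual is low are candidate dynamic-texture pixels, since the extra divergence term of the conservation constraint absorbs brightness diffusion.

```python
import numpy as np

def constancy_residual(I_prev, I_next, u, v):
    """Residual of the brightness-constancy constraint I_x*u + I_y*v + I_t = 0."""
    Iy, Ix = np.gradient(I_prev)          # spatial gradients (axis 0 -> y, axis 1 -> x)
    It = I_next - I_prev                  # temporal derivative (frame difference)
    return np.abs(Ix * u + Iy * v + It)

def conservation_residual(I_prev, I_next, u, v):
    """Residual of the brightness-conservation (continuity) constraint
    I_t + d(I*u)/dx + d(I*v)/dy = 0, which allows brightness to diffuse
    into neighboring regions instead of being preserved along trajectories."""
    It = I_next - I_prev
    dIu_dx = np.gradient(I_prev * u, axis=1)   # x-derivative of the flux I*u
    dIv_dy = np.gradient(I_prev * v, axis=0)   # y-derivative of the flux I*v
    return np.abs(It + dIu_dx + dIv_dy)

if __name__ == "__main__":
    # Toy usage with random frames and a zero flow field.
    rng = np.random.default_rng(0)
    I_prev = rng.random((64, 64))
    I_next = rng.random((64, 64))
    u = np.zeros((64, 64))
    v = np.zeros((64, 64))
    dynamic_score = (constancy_residual(I_prev, I_next, u, v)
                     - conservation_residual(I_prev, I_next, u, v))
    mask = dynamic_score > 0.1            # illustrative threshold, not from the paper
    print("candidate dynamic-texture pixels:", int(mask.sum()))
```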

Numerous experimental results demonstrate the suitability of our method for detecting dynamic textures even in challenging situations where other visual cues, such as color and geometry, are not usable. Through gradual simplifications targeting higher run-time efficiency, several variants, including a real-time version, are developed.