Visual tracking: from correlation filters to end-to-end discriminative model prediction
Martin Danelljan
(ETH Zurich, Switzerland)
Abstract:
Visual tracking is one of the fundamental problems in computer vision. In
its most general form, no prior knowledge about the target object is
given, except for its initial location. The unconstrained nature of this
problem makes it particularly difficult, yet applicable to a wide range
of scenarios. The same holds for the related problem of Video Object
Segmentation, where the task is to predict a pixel-wise segmentation mask
of the target. Due to the lack of a priori knowledge in these problems,
the method must learn an appearance model of the target online. Cast as a
machine learning problem, it imposes several major challenges. Many of
these challenges have been successfully addressed by developing powerful
and efficient discriminative models of the target appearance.
This talk will give an overview of such approaches, starting with the widely
popular Discriminative Correlation Filter (DCF) framework. The talk will focus
on the most recent developments in the field, which include deep architectures
capable of directly predicting a discriminative target appearance model, which
itself can be trained end-to-end in a meta-learning fashion.
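To make the DCF framework concrete: in its simplest single-channel form (the MOSSE-style filter), the appearance model is a linear correlation filter with a closed-form ridge-regression solution in the Fourier domain. The sketch below is illustrative only and is not the speaker's method; all function names and the regularization value are my own choices.

```python
# Minimal single-channel DCF (MOSSE-style) sketch, assuming a real-valued
# image patch and a desired Gaussian-shaped correlation output.
import numpy as np

def gaussian_response(shape, sigma=2.0):
    """Desired training output: a Gaussian peak centred on the patch."""
    h, w = shape
    ys, xs = np.mgrid[0:h, 0:w]
    cy, cx = h // 2, w // 2
    return np.exp(-((ys - cy) ** 2 + (xs - cx) ** 2) / (2 * sigma ** 2))

def train_filter(patch, response, lam=1e-2):
    """Closed-form ridge-regression solution in the Fourier domain:
    H = (G * conj(F)) / (F * conj(F) + lambda)."""
    F = np.fft.fft2(patch)
    G = np.fft.fft2(response)
    return (G * np.conj(F)) / (F * np.conj(F) + lam)

def detect(H, patch):
    """Apply the filter to a new patch; the response peak locates the target."""
    F = np.fft.fft2(patch)
    return np.real(np.fft.ifft2(H * F))
```

Applied back to its own training patch, the filter reproduces (approximately) the Gaussian label, so the response peaks at the patch centre. The frequency-domain formulation is what makes DCF training and detection fast: both reduce to element-wise operations plus FFTs.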
Bio:
Martin Danelljan is a postdoctoral researcher at ETH Zurich, Switzerland.
He received his Ph.D. degree from Linköping University, Sweden in 2018.
His Ph.D. thesis was awarded the biennial Best Nordic Thesis Prize at
SCIA 2019. His main research interests are online and meta-learning
methods for visual tracking and video object segmentation, deep
probabilistic models for image generation, and machine learning with no
or limited supervision. His research in the field of visual tracking, in
particular, has attracted much attention. In 2014, he won the Visual
Object Tracking (VOT) Challenge and the OpenCV State-of-the-Art Vision
Challenge. Furthermore, he achieved top ranks in the VOT2016 and VOT2017
challenges. He received the best paper award at ICPR 2016 and the best
student paper award at BMVC 2019.