Let the machine become the next Rembrandt!

Daniel Sykora (CTU Prague, Czech Republic)


Recently, neural-based style transfer has become extremely popular thanks to the seminal work of Gatys et al. [2016] and its followers, including Selim et al. [2016], who extended the technique to stylize head portraits. Although these methods produce impressive results for specific styles, a small but essential ingredient is still missing: rich textural information. This ingredient gives the observer the impression that the stylized portrait was created by hand using physical artistic media, making it indistinguishable from a real artwork. In this talk, we will investigate this fundamental limitation of current neural-based techniques and propose a solution based on a robust variant of patch-based synthesis [Fiser et al. 2017] that enables the creation of highly convincing stylized portraits.

Gatys et al.: Image Style Transfer Using Convolutional Neural Networks. In
Proceedings of the IEEE Conference on Computer Vision and Pattern Recognition,
2414-2423, 2016.

Selim et al.: Painting Style Transfer for Head Portraits Using Convolutional
Neural Networks. ACM Transactions on Graphics 35(4):129, 2016.

Fiser et al.: Example-Based Synthesis of Stylized Facial Animations. ACM
Transactions on Graphics 36(4):155, 2017.