Non-stationary texture synthesis by adversarial expansion

Yang Zhou, Zhen Zhu, Xiang Bai, Dani Lischinski, Daniel Cohen-Or, Hui Huang

Research output: Contribution to journal › Article › peer-review


Abstract

The real world exhibits an abundance of non-stationary textures. Examples include textures with large-scale structures, as well as spatially variant and inhomogeneous textures. While existing example-based texture synthesis methods can cope well with stationary textures, non-stationary textures still pose a considerable challenge that remains unresolved. In this paper, we propose a new approach for example-based non-stationary texture synthesis. Our approach uses a generative adversarial network (GAN), trained to double the spatial extent of texture blocks extracted from a specific texture exemplar. Once trained, the fully convolutional generator is able to expand the size of the entire exemplar, as well as of any of its sub-blocks. We demonstrate that this conceptually simple approach is highly effective for capturing large-scale structures, as well as other non-stationary attributes of the input exemplar. As a result, it can cope with challenging textures that, to our knowledge, no other existing method can handle.
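The abstract describes training a GAN to map k × k exemplar blocks to the 2k × 2k blocks that contain them. Below is a minimal PyTorch-style sketch of that training idea, not the authors' published architecture: the network shapes, the `sample_pair` cropping helper, the L1 term, and all hyper-parameters are illustrative assumptions.

```python
# A minimal sketch of the "adversarial expansion" training idea, NOT the
# authors' published architecture. All names and hyper-parameters below
# are illustrative assumptions.
import torch
import torch.nn as nn
import torch.nn.functional as F

class Generator(nn.Module):
    """Fully convolutional: maps a k x k block to a 2k x 2k block."""
    def __init__(self, ch=64):
        super().__init__()
        self.net = nn.Sequential(
            nn.Conv2d(3, ch, 3, padding=1), nn.ReLU(inplace=True),
            nn.Conv2d(ch, ch, 3, padding=1), nn.ReLU(inplace=True),
            nn.Upsample(scale_factor=2, mode="nearest"),  # 2x spatial expansion
            nn.Conv2d(ch, ch, 3, padding=1), nn.ReLU(inplace=True),
            nn.Conv2d(ch, 3, 3, padding=1), nn.Tanh(),
        )

    def forward(self, x):
        return self.net(x)

class Discriminator(nn.Module):
    """Patch-level critic on 2k x 2k blocks."""
    def __init__(self, ch=64):
        super().__init__()
        self.net = nn.Sequential(
            nn.Conv2d(3, ch, 4, stride=2, padding=1), nn.LeakyReLU(0.2, inplace=True),
            nn.Conv2d(ch, 2 * ch, 4, stride=2, padding=1), nn.LeakyReLU(0.2, inplace=True),
            nn.Conv2d(2 * ch, 1, 4, padding=1),  # per-patch real/fake logits
        )

    def forward(self, x):
        return self.net(x)

def sample_pair(exemplar, k):
    """Crop a random 2k x 2k target block and the k x k source block at its center."""
    _, _, h, w = exemplar.shape
    y = torch.randint(0, h - 2 * k + 1, (1,)).item()
    x = torch.randint(0, w - 2 * k + 1, (1,)).item()
    target = exemplar[:, :, y:y + 2 * k, x:x + 2 * k]
    source = target[:, :, k // 2:k // 2 + k, k // 2:k // 2 + k]
    return source, target

G, D = Generator(), Discriminator()
opt_g = torch.optim.Adam(G.parameters(), lr=2e-4, betas=(0.5, 0.999))
opt_d = torch.optim.Adam(D.parameters(), lr=2e-4, betas=(0.5, 0.999))
exemplar = torch.rand(1, 3, 512, 512) * 2 - 1  # stand-in for a real exemplar in [-1, 1]

for step in range(10000):
    source, real = sample_pair(exemplar, k=128)
    fake = G(source)

    # Discriminator: real 2k x 2k blocks vs. generated expansions.
    real_logits, fake_logits = D(real), D(fake.detach())
    d_loss = (F.binary_cross_entropy_with_logits(real_logits, torch.ones_like(real_logits))
              + F.binary_cross_entropy_with_logits(fake_logits, torch.zeros_like(fake_logits)))
    opt_d.zero_grad(); d_loss.backward(); opt_d.step()

    # Generator: fool the discriminator, plus an L1 term pulling the
    # expansion toward the true containing block.
    g_logits = D(fake)
    g_loss = (F.binary_cross_entropy_with_logits(g_logits, torch.ones_like(g_logits))
              + F.l1_loss(fake, real))
    opt_g.zero_grad(); g_loss.backward(); opt_g.step()
```

Because the generator is fully convolutional, after training it can be run on the whole exemplar, or on any of its sub-blocks, to double the spatial extent, which is the property the abstract highlights.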

Original language: English
Article number: A10
Journal: ACM Transactions on Graphics
Volume: 37
Issue number: 4
State: Published - 2018

Bibliographical note

Publisher Copyright:
© 2018 Association for Computing Machinery.

Keywords

  • Example-based texture synthesis
  • generative adversarial networks
  • non-stationary textures
