Abstract
This paper presents a novel unsupervised method to transfer the style of an example image to a source image. The complex notion of image style is treated here as a local texture transfer, possibly combined with a global color transfer. For the local texture transfer, we propose a new method based on an adaptive patch partition that captures the style of the example image while preserving the structure of the source image. More precisely, this example-based partition predicts how well a source patch can be matched by an example patch. Results on various images show that our method outperforms the most recent techniques.
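The sketch below illustrates the kind of adaptive ("split-and-match") patch partition the abstract describes: a source patch is kept if the example image contains a sufficiently good match for it, and is otherwise split into four smaller patches. The function names, the brute-force matching, and the error threshold are illustrative assumptions, not the authors' implementation.

```python
# Minimal sketch of an example-based adaptive patch partition.
# Assumptions: grayscale images in [0, 1], square source with a
# power-of-two side length, brute-force nearest-patch search.
import numpy as np

def best_match_error(src_patch, example, stride=4):
    """Smallest mean squared error between src_patch and any
    same-sized patch of the example image (brute-force search)."""
    h, w = src_patch.shape
    H, W = example.shape
    best = np.inf
    for y in range(0, H - h + 1, stride):
        for x in range(0, W - w + 1, stride):
            err = np.mean((example[y:y+h, x:x+w] - src_patch) ** 2)
            best = min(best, err)
    return best

def split_and_match(source, example, y, x, size, min_size=8, tol=0.01, out=None):
    """Recursively partition the source: keep a patch if the example
    explains it well, otherwise split it into four quadrants."""
    if out is None:
        out = []
    patch = source[y:y+size, x:x+size]
    if size <= min_size or best_match_error(patch, example) <= tol:
        out.append((y, x, size))   # patch is well matched by the example
    else:
        half = size // 2           # no good match: refine the partition
        for dy in (0, half):
            for dx in (0, half):
                split_and_match(source, example, y + dy, x + dx, half,
                                min_size, tol, out)
    return out

# Usage example with random images (hypothetical data).
source = np.random.rand(64, 64)
example = np.random.rand(64, 64)
patches = split_and_match(source, example, 0, 0, 64)
print(len(patches), "patches in the adaptive partition")
```

Intuitively, flat or repetitive source regions end up as large patches explained by a single example match, while detailed regions are split into smaller patches, which is what lets the transfer capture the example's texture while preserving the source structure.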
Publication Info
- Year: 2016
- Type: preprint
- Pages: 553-561
- Citations: 130
- Access: Closed
Identifiers
- DOI: 10.1109/cvpr.2016.66