Paint by Example: Exemplar-based Image Editing with Diffusion Models

🍇 [Official Project Page]   🍎 [Official Online Demo]

Abstract

Language-guided image editing has achieved great success recently. In this paper, for the first time, we investigate exemplar-guided image editing for more precise control. We achieve this goal by leveraging self-supervised training to disentangle and re-organize the source image and the exemplar. However, the naive approach will cause obvious fusing artifacts. We carefully analyze it and propose an information bottleneck and strong augmentations to avoid the trivial solution of directly copying and pasting the exemplar image. Meanwhile, to ensure the controllability of the editing process, we design an arbitrary shape mask for the exemplar image and leverage the classifier-free guidance to increase the similarity to the exemplar image. The whole framework involves a single forward of the diffusion model without any iterative optimization. We demonstrate that our method achieves an impressive performance and enables controllable editing on in-the-wild images with high fidelity.
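The classifier-free guidance mentioned in the abstract combines an unconditional and an exemplar-conditioned noise prediction at each denoising step. Below is a minimal, generic sketch of that combination; the function and variable names are ours for illustration, not from the paper's code.

import torch

def classifier_free_guidance(noise_uncond: torch.Tensor,
                             noise_cond: torch.Tensor,
                             guidance_scale: float) -> torch.Tensor:
    # Push the prediction away from the unconditional branch and toward the
    # exemplar-conditioned branch; a guidance_scale > 1 increases the
    # similarity of the edited region to the exemplar image.
    return noise_uncond + guidance_scale * (noise_cond - noise_uncond)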

Table of Contents

  • TODO
  • Installation
  • Paint-By-Example Demos
      • PaintByExample Diffuser Demos
      • PaintByExample with SAM

TODO

  • PaintByExample Diffuser Demo
  • PaintByExample with SAM
  • PaintByExample with GroundingDINO
  • PaintByExample with Grounded-SAM

Installation

We run PaintByExample through diffusers; install diffusers as follows:

pip install diffusers==0.16.1

Then install Grounded-SAM by following the Grounded-SAM Installation instructions for the extended demos.

Paint-By-Example Demos

Here we provide the demos for PaintByExample.

PaintByExample Diffuser Demos

cd playground/PaintByExample
python paint_by_example.py

Notes: set cache_dir to save the pretrained weights to a specific folder. The paint result will be saved as paint_by_example_demo.jpg:

Input Image | Mask | Example Image | Inpaint Result
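If you prefer to call the pipeline directly instead of running the demo script, the sketch below shows the diffusers API it builds on. The image file names and the cache_dir value are placeholders, and we assume the public Fantasy-Studio/Paint-by-Example weights from the Hugging Face Hub; the actual demo script may differ in details.

import torch
from PIL import Image
from diffusers import PaintByExamplePipeline

# Load the pretrained Paint-by-Example weights; cache_dir only controls
# where the downloaded weights are stored.
pipe = PaintByExamplePipeline.from_pretrained(
    "Fantasy-Studio/Paint-by-Example",
    torch_dtype=torch.float16,
    cache_dir="./checkpoints",
).to("cuda")

# Placeholder inputs: source image, binary mask of the edit region,
# and the exemplar image to paint into the masked region.
init_image = Image.open("input_image.png").convert("RGB").resize((512, 512))
mask_image = Image.open("mask.png").convert("RGB").resize((512, 512))
example_image = Image.open("example_image.png").convert("RGB").resize((512, 512))

result = pipe(
    image=init_image,
    mask_image=mask_image,
    example_image=example_image,
    num_inference_steps=50,
    guidance_scale=5.0,
).images[0]
result.save("paint_by_example_demo.jpg")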

PaintByExample with SAM

In this demo, we perform the inpainting task in two steps:

  1. Generate a mask with SAM from a prompt (box or point)
  2. Inpaint the masked region with the example image

cd playground/PaintByExample
python sam_paint_by_example.py

Notes: We set a larger num_inference_steps (around 200 to 500) to get a higher-quality image. We've also found that the mask region strongly influences the final result (for example, a panda cannot be inpainted well into a dog-shaped region); this needs more testing. A minimal sketch of the two-step pipeline is given at the end of this section.

Input Image | SAM Output | Example Image | Inpaint Result
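For reference, here is a minimal sketch of the two steps above. The SAM checkpoint path, box coordinates, image file names, and output name are placeholders, and the actual sam_paint_by_example.py script may differ in details.

import numpy as np
import torch
from PIL import Image
from diffusers import PaintByExamplePipeline
from segment_anything import SamPredictor, sam_model_registry

# Step 1: generate a mask with SAM from a box prompt (coordinates are placeholders).
sam = sam_model_registry["vit_h"](checkpoint="./sam_vit_h_4b8939.pth").to("cuda")
predictor = SamPredictor(sam)

init_image = Image.open("input_image.png").convert("RGB")
predictor.set_image(np.array(init_image))
masks, _, _ = predictor.predict(
    box=np.array([100, 100, 400, 400]),
    multimask_output=False,
)
mask_image = Image.fromarray((masks[0] * 255).astype(np.uint8)).convert("RGB")

# Step 2: inpaint the masked region with the example image.
pipe = PaintByExamplePipeline.from_pretrained(
    "Fantasy-Studio/Paint-by-Example",
    torch_dtype=torch.float16,
).to("cuda")
example_image = Image.open("example_image.png").convert("RGB").resize((512, 512))

result = pipe(
    image=init_image.resize((512, 512)),
    mask_image=mask_image.resize((512, 512)),
    example_image=example_image,
    num_inference_steps=200,  # larger value, as suggested in the notes above
).images[0]
result.save("sam_paint_by_example_demo.jpg")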