ReCo extends T2I models to understand coordinate inputs. Thanks to the introduced position tokens in the region-controlled input query, users can easily specify free-form regional descriptions in arbitrary image regions. For more details, please refer to our paper.
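To illustrate the idea, here is a minimal sketch of how box coordinates could be quantized into discrete position tokens and interleaved with regional descriptions. The token format, bin count, and helper names are assumptions for illustration, not the paper's exact scheme.

```python
# Hypothetical sketch of a region-controlled input query: normalized box
# coordinates are quantized into position tokens that the model can read
# alongside the text. Bin count and token naming are assumptions.

NUM_BINS = 1000  # assumed quantization granularity


def box_to_position_tokens(box, num_bins=NUM_BINS):
    """Quantize a normalized (x1, y1, x2, y2) box into position tokens."""
    tokens = []
    for coord in box:
        idx = min(int(coord * num_bins), num_bins - 1)
        tokens.append(f"<bin_{idx}>")
    return tokens


def build_query(global_caption, regions):
    """Interleave the image caption with (box, regional description) pairs."""
    parts = [global_caption]
    for box, desc in regions:
        parts.extend(box_to_position_tokens(box))
        parts.append(desc)
    return " ".join(parts)


query = build_query(
    "a photo of a park",
    [((0.1, 0.2, 0.5, 0.9), "a golden retriever running")],
)
print(query)
```

The free-form regional description simply follows its position tokens, so arbitrarily many regions can be appended to one query.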
@inproceedings{yang2023reco,
  title={ReCo: Region-Controlled Text-to-Image Generation},
  author={Yang, Zhengyuan and Wang, Jianfeng and Gan, Zhe and Li, Linjie and Lin, Kevin and Wu, Chenfei and Duan, Nan and Liu, Zicheng and Liu, Ce and Zeng, Michael and Wang, Lijuan},
  booktitle={CVPR},
  year={2023}
}
Clone the repository:
git clone https://github.com/microsoft/ReCo.git
cd ReCo
A conda environment named reco_env can be created and activated with:
conda env create -f environment.yaml
conda activate reco_env
Alternatively, install the packages listed in requirements.txt:
pip install -r requirements.txt
We recommend downloading with the AzCopy command below. The AzCopy executable can be downloaded here.
Example command:
path/to/azcopy copy <folder-link> <target-address> --recursive
# For example:
path/to/azcopy copy https://unitab.blob.core.windows.net/data/reco/dataset <local_path> --recursive
Download the processed dataset annotations (the dataset folder, ~59 GB) with the AzCopy tool:
path/to/azcopy copy https://unitab.blob.core.windows.net/data/reco/dataset <local_path> --recursive
ReCo checkpoints trained on COCO and a small LAION subset can be downloaded via wget or AzCopy: ReCo_COCO and ReCo_LAION. Save the downloaded weights to the logs folder.
inference.sh contains example commands for inference.
eval.sh contains example commands for COCO evaluation.
For ReCo fine-tuning, we start from the Stable Diffusion model, following the instructions here. Weights can be downloaded from HuggingFace; the experiments mainly use sd-v1-4-full-ema.ckpt.
train.sh contains example commands for fine-tuning.
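Fine-tuning on COCO implies converting its box annotations into the normalized coordinates from which position tokens are built. A minimal sketch of that conversion (the helper name and values are illustrative assumptions, not repo code):

```python
# COCO annotations store boxes as [x, y, width, height] in absolute pixels;
# position tokens are quantized from normalized [x1, y1, x2, y2] fractions.
def normalize_coco_box(box, img_w, img_h):
    """Convert a COCO-style pixel box to normalized corner coordinates."""
    x, y, w, h = box
    return (x / img_w, y / img_h, (x + w) / img_w, (y + h) / img_h)


# Example: a 256x240-pixel box at (64, 120) in a 640x480 image.
print(normalize_coco_box([64, 120, 256, 240], img_w=640, img_h=480))
```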
The project is built on top of the Stable Diffusion repository.
Author: Microsoft
Official GitHub: https://github.com/microsoft/ReCo
License: MIT