This repository contains the implementation of the conceptual compression framework proposed in the ICIP 2019 paper *Layered Conceptual Image Compression Via Deep Semantic Synthesis*.
If you find it useful for your research, please cite it as follows:
```
@inproceedings{chang2019layered,
  title={Layered Conceptual Image Compression Via Deep Semantic Synthesis},
  author={Chang, Jianhui and Mao, Qi and Zhao, Zhenghui and Wang, Shanshe and Wang, Shiqi and Zhu, Hong and Ma, Siwei},
  booktitle={2019 IEEE International Conference on Image Processing (ICIP)},
  pages={694--698},
  year={2019},
  organization={IEEE}
}
```
The pipeline of the proposed framework is shown below.
The proposed method achieves better perceptual quality at lower bit rates than traditional codecs such as JPEG, JPEG2000, and HM (note that the dimension of the latent codes is 8 here).
- python >= 3.6
- pytorch >= 1.14
- numpy >= 1.18
- pillow >= 6.2.0
- dominate >= 2.5
- visdom >= 0.1.8
- tqdm >= 4.36.1
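For convenience, the Python package constraints above can be collected in a `requirements.txt` (a sketch; the package names are assumed to be the pip names, and PyTorch is best installed separately following the instructions on pytorch.org for your CUDA version):

```text
numpy>=1.18
pillow>=6.2.0
dominate>=2.5
visdom>=0.1.8
tqdm>=4.36.1
```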
Prepare paired data (image-edge pairs): download one of the following training datasets and unzip the files.
- edges2handbags: http://efrosgans.eecs.berkeley.edu/pix2pix/datasets/edges2handbags.tar.gz
- edges2shoes: http://efrosgans.eecs.berkeley.edu/pix2pix/datasets/edges2shoes.tar.gz
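Both datasets use the pix2pix aligned layout: each file stores the edge map and the photo side by side in a single image, with the edges on the left. If you want to inspect the two halves separately, a minimal Pillow sketch (the helper name `split_pair` is my own, for illustration):

```python
from PIL import Image

def split_pair(img):
    """Split a pix2pix-style paired image into its left (edge) and right (photo) halves."""
    w, h = img.size
    edge = img.crop((0, 0, w // 2, h))    # left half: edge map
    photo = img.crop((w // 2, 0, w, h))   # right half: photo
    return edge, photo
```

For example, a 512x256 paired image yields two 256x256 halves.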
You can follow the examples in `./scripts/train.sh`.
```shell
python train.py --dataroot=[path of training data] --phase=[name of dataset split: train/val/test] --nz=[dimension of latent codes]
```
Please follow the examples in `./scripts/test.sh`.
```shell
python test.py --dataroot=[path of testing data] --results_dir=[path for result images] --checkpoints_dir=[checkpoints_dir] --no_flip --epoch=[name of checkpoint: default latest]
```
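The method targets perceptual quality rather than pixel fidelity, so PSNR is only a coarse sanity check on reconstructions, but it can confirm that a run produced sensible output. A small sketch assuming 8-bit images loaded as same-shaped NumPy arrays:

```python
import numpy as np

def psnr(ref, rec, peak=255.0):
    """Peak signal-to-noise ratio (dB) between two same-shaped 8-bit images."""
    ref = np.asarray(ref, dtype=np.float64)
    rec = np.asarray(rec, dtype=np.float64)
    mse = np.mean((ref - rec) ** 2)
    if mse == 0:
        return float("inf")  # identical images
    return 10.0 * np.log10(peak ** 2 / mse)
```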
A pretrained model trained for 80 epochs on the combined edges2shoes and edges2handbags datasets is provided. The dimension of the texture latent codes is set to 64.
- Pretrained model: https://disk.pku.edu.cn:443/link/07FC94DAC259C9AEC85ADEC747A8BC3E
Some reconstruction examples produced by the provided pretrained model are shown below.
- BicycleGAN [github page]