Home Page: https://wangyz1608.github.io/Self-Prompt-Tuning/
Self-Prompt-Tuning: A PyTorch Implementation

This is a PyTorch implementation of the paper: Revisiting the Power of Prompt for Visual Tuning.

  • The implementation mainly consists of three parts: image preprocessing, loading of pretrained models, and training scripts.

  • The implementation also provides some training curves (TensorBoard).

0. Environments

  • PyTorch 1.10.0
  • torchvision 0.11.0
  • timm 0.6.0

1. Datasets and Pre-trained Model Preparation

  • Dataset

    • See Table S1 in the Appendix for dataset details.

    • Fine-Grained Visual Classification tasks (FGVC). The datasets can be downloaded following the official links.

    • Visual Task Adaptation Benchmark (VTAB-1K). Downloading these datasets is cumbersome; it is recommended to follow an existing repository, such as VPT.

    • The folder ./Dataset implements image reading and preprocessing for FGVC and VTAB-1K.

  • Pre-trained Model

    • The following table provides the pre-trained checkpoints used in the paper.

      Pre-trained Objective | Link
      MAE                   | download
      MoCo v3               | download
      Supervised            | timm
    • For supervised pre-trained models, loading the weights via timm avoids the trouble of weight format conversion (npz --> pth).

  • Val-set curves and recipes

    • The task datasets, tuned model weights, some val-set accuracy curves, and training recipes are available at the link.
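Self-supervised checkpoints such as the MAE release usually wrap the weights in an extra dictionary key; a minimal sketch of the kind of unwrapping a --finetune loader typically performs (the 'model' key and the decoder-key filtering are assumptions about the checkpoint format, not code from this repo):

```python
import torch

# Build a toy checkpoint mimicking the usual MAE release format:
# the state dict is nested under a 'model' key. (Assumed format.)
ckpt = {"model": {"patch_embed.proj.weight": torch.zeros(768, 3, 16, 16),
                  "blocks.0.attn.qkv.weight": torch.zeros(2304, 768)}}
torch.save(ckpt, "mae_toy.pth")

loaded = torch.load("mae_toy.pth", map_location="cpu")
state_dict = loaded.get("model", loaded)  # unwrap if nested

# Drop keys that don't belong to the fine-tuning model (e.g. the MAE decoder).
state_dict = {k: v for k, v in state_dict.items()
              if not k.startswith("decoder")}
print(sorted(state_dict))
```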

2. MAE Pre-training

# for instance, transfer to CUB-200 via SPT-deep variant.
torchrun --nproc_per_node=2 \
         train_MAE.py \
         --model_name MAE_vpt_deep_vit_b \
         --finetune ${PRETRAIN_CKPT} \
         --drop_path 0.0 \
         --dataset CUB200 \
         --tuning_type "prompt" \
         --num_prompts 20 \
         --prompt_deep \
         --prompt_sample_type "random" \
         --epochs 100 \
         --batch_size 32 \
         --weight_decay 1e-2 \
         --wd_head 0.5 \
         --lr 5e-2 \
         --min_lr 1e-8 \
         --warmup_epochs 10 \
         --model_ema \
         --save_dir ${SAVE_PATH}
  • num_prompts is 20 for deep, and 100 for shallow.

  • prompt_sample_type includes 'kmeans', 'mean-pooling', 'max-pooling', and 'random'. For 'kmeans', the cluster centers need to be computed in advance.

  • Omit --prompt_deep to train the shallow variant.

  • The training scripts for other tasks are similar; please refer to ./script/MAE-IN1K.
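The prompt_sample_type options above can be sketched as follows: instead of Gaussian initialization, prompts are initialized from downstream patch embeddings, either sampled directly or pooled in groups (a simplified sketch; the function name and chunking scheme are illustrative, not the repo's actual code):

```python
import torch

def init_prompts(patch_tokens: torch.Tensor, num_prompts: int,
                 sample_type: str = "random") -> torch.Tensor:
    """Initialize prompt vectors from patch embeddings.

    patch_tokens: [N, D] embeddings gathered from downstream images.
    Returns [num_prompts, D] prompt initializations.
    """
    n, d = patch_tokens.shape
    if sample_type == "random":
        idx = torch.randperm(n)[:num_prompts]
        return patch_tokens[idx].clone()
    # Pool fixed-size chunks of tokens into one prompt each.
    chunks = patch_tokens[: n - n % num_prompts].reshape(num_prompts, -1, d)
    if sample_type == "mean-pooling":
        return chunks.mean(dim=1)
    if sample_type == "max-pooling":
        return chunks.max(dim=1).values
    raise ValueError(f"unknown sample_type: {sample_type}")

tokens = torch.randn(196, 768)  # e.g. one image's ViT-B patch tokens
prompts = init_prompts(tokens, 20, "mean-pooling")
print(tuple(prompts.shape))  # (20, 768)
```

('kmeans' would additionally require precomputed cluster centers, matching the note above.)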

3. MoCo and Supervised Pre-training

# for instance, transfer to dogs-120 via SPT-deep variant.
torchrun --nproc_per_node=2 \
         train.py \
         --model_name vpt_deep_vit_b \
         --drop_path 0.0 \
         --dataset DOG120 \
         --tuning_type "prompt" \
         --num_prompts 20 \
         --prompt_deep \
         --prompt_sample_type "random" \
         --epochs 100 \
         --batch_size 32 \
         --weight_decay 1e-2 \
         --wd_head 3.0 \
         --lr 1e-3 \
         --min_lr 1e-8 \
         --warmup_epochs 10 \
         --model_ema \
         --save_dir ${SAVE_PATH}
  • The script is similar to MAE pre-training.

  • For supervised pre-trained models, we load the pre-trained weights via timm, e.g., pretrain_model = timm.create_model('vit_base_patch16_224_in21k', pretrained=True).

  • For other tasks, please refer to ./script/MoCo-IN1K and ./script/sup-IN21K.
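The deep vs. shallow distinction controlled by --prompt_deep can be sketched with a toy encoder: the shallow variant prepends one set of prompts at the input, while the deep variant re-inserts a fresh set of prompts before every block (the usual VPT-style convention; the class below is illustrative, not this repo's model):

```python
import torch
import torch.nn as nn

class PromptedEncoder(nn.Module):
    """Toy encoder: prepend learnable prompts to the token sequence."""

    def __init__(self, depth=4, dim=32, num_prompts=5, deep=True):
        super().__init__()
        self.deep = deep
        self.blocks = nn.ModuleList(nn.Linear(dim, dim) for _ in range(depth))
        # Deep: one prompt set per block; shallow: a single set at the input.
        n_sets = depth if deep else 1
        self.prompts = nn.Parameter(torch.zeros(n_sets, num_prompts, dim))

    def forward(self, x):                      # x: [B, N, D]
        b = x.shape[0]
        p = self.prompts[0].expand(b, -1, -1)
        x = torch.cat([p, x], dim=1)
        for i, blk in enumerate(self.blocks):
            if self.deep and i > 0:
                # Replace the previous block's prompt outputs with fresh prompts.
                p = self.prompts[i].expand(b, -1, -1)
                x = torch.cat([p, x[:, self.prompts.shape[1]:]], dim=1)
            x = blk(x)
        return x

x = torch.randn(2, 196, 32)
print(tuple(PromptedEncoder(deep=True)(x).shape))   # (2, 201, 32)
print(tuple(PromptedEncoder(deep=False)(x).shape))  # (2, 201, 32)
```

In both variants only the prompts (and the classification head) would be trained; the backbone stays frozen.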

4. Some Training Curves

  • We provide some training curves for reference (MAE pre-training, ViT-B as backbone; test accuracy curves are plotted).

    • CUB-200
    • Caltech101
    • Patch_Camelyon
  • The accuracy fluctuates within a reasonable range. For example, we obtain 80.26, 80.39, and 79.74 test accuracy when fine-tuning the MAE pre-trained ViT-B on CUB-200 three times with different random seeds.
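A quick computation of the spread of those three runs:

```python
import statistics

accs = [80.26, 80.39, 79.74]  # the three CUB-200 seeds reported above
mean = statistics.mean(accs)
std = statistics.stdev(accs)  # sample standard deviation
print(f"{mean:.2f} +/- {std:.2f}")  # 80.13 +/- 0.34
```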

5. Citation

@inproceedings{wang2024revisiting,
    title={Revisiting the Power of Prompt for Visual Tuning},
    author={Yuzhu Wang and Lechao Cheng and Chaowei Fang and Dingwen Zhang and Manni Duan and Meng Wang},
    booktitle={Forty-first International Conference on Machine Learning},
    year={2024},
}
