
flare22's Introduction

Solution of Team Blackbean for FLARE22 Challenge

Revisiting nnU-Net for Iterative Pseudo Labeling and Efficient Sliding Window Inference
Ziyan Huang, Haoyu Wang, Jin Ye, Jingqi Niu, Can Tu, Yuncheng Yang, Shiyi Du, Zhongying Deng, Lixu Gu, and Junjun He

Built upon MIC-DKFZ/nnUNet, this repository provides the solution of team blackbean for MICCAI FLARE22 Challenge. The details of our method are described in our paper.

A Chinese-language introduction to our method is available on Zhihu: https://zhuanlan.zhihu.com/p/657611778

Our trained model is available at RESULTS_FOLDER

You can get our docker by

docker pull miccaiflare/blackbean:latest
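A typical way to run the container might look like the following sketch. The in-container mount points `/workspace/inputs` and `/workspace/outputs` are assumed from the FLARE22 container convention, and the host paths are placeholders to adapt:

```shell
# Sketch of a container run; assumes the FLARE22 I/O convention
# (/workspace/inputs, /workspace/outputs) and a GPU-enabled Docker setup.
# Host paths are placeholders.
docker run --gpus all --rm \
  -v "$PWD/inputs":/workspace/inputs \
  -v "$PWD/outputs":/workspace/outputs \
  miccaiflare/blackbean:latest
```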

You can reproduce our method step by step as follows:

Environments and Requirements:

Install nnU-Net [1] as below. You only need to meet the requirements of nnU-Net; our method does not need any additional dependencies. For more details, please refer to https://github.com/MIC-DKFZ/nnUNet

git clone https://github.com/MIC-DKFZ/nnUNet.git
cd nnUNet
pip install -e .
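Before running any of the commands below, nnU-Net v1 needs three environment variables pointing at its data folders; the paths here are placeholders to adapt to your own storage:

```shell
# nnU-Net v1 locates raw data, preprocessed data and trained models
# through these three environment variables; the paths are placeholders.
export nnUNet_raw_data_base="/data/nnUNet_raw_data_base"
export nnUNet_preprocessed="/data/nnUNet_preprocessed"
export RESULTS_FOLDER="/data/nnUNet_trained_models"
```

Adding these lines to your `~/.bashrc` keeps them set across sessions.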

1. Training Big nnUNet for Pseudo Labeling

1.1. Copy the following files in this repo to your nnUNet environment.

FLARE22/nnunet/training/network_training/nnUNetTrainerV2_FLARE.py
FLARE22/nnunet/experiment_planning/experiment_planner_FLARE22Big.py

1.2. Prepare 50 Labeled Data of FLARE

As required by nnUNet, assign a TaskID (e.g. Task022) to the 50 labeled cases and organize them following the nnUNet folder structure.

nnUNet_raw_data_base/nnUNet_raw_data/Task022_FLARE22/
├── dataset.json
├── imagesTr
├── imagesTs
└── labelsTr
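As a sketch of the expected layout and file naming (the case ID `FLARE22_0001` below is hypothetical): nnU-Net expects each image to carry a 4-digit modality suffix (`_0000` for the single CT channel), while the corresponding label file does not.

```shell
# Create the Task022 folder skeleton; the case ID FLARE22_0001 used in
# the comments below is hypothetical.
BASE="nnUNet_raw_data_base/nnUNet_raw_data/Task022_FLARE22"
mkdir -p "$BASE/imagesTr" "$BASE/imagesTs" "$BASE/labelsTr"
# Images carry a modality suffix, labels do not, e.g.:
#   $BASE/imagesTr/FLARE22_0001_0000.nii.gz
#   $BASE/labelsTr/FLARE22_0001.nii.gz
```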

1.3. Conduct automatic preprocessing using nnUNet.

Here we use our custom experiment planner instead of the default one.

nnUNet_plan_and_preprocess -t 22 -pl3d ExperimentPlanner3D_FLARE22Big -pl2d None

1.4. Training Big nnUNet with 5-fold Cross Validation

for FOLD in 0 1 2 3 4
do
nnUNet_train 3d_fullres nnUNetTrainerV2_FLARE_Big 22 $FOLD -p nnUNetPlansFLARE22Big
done

1.5. Generate Pseudo Labels for 2000 Unlabeled Data

nnUNet_predict -i INPUTS_FOLDER -o OUTPUTS_FOLDER  -t 22  -tr nnUNetTrainerV2_FLARE_Big  -m 3d_fullres  -p nnUNetPlansFLARE22Big  --all_in_gpu True 

1.6. Iteratively Train Models and Generate Pseudo Labels

  • Give a new TaskID (e.g. Task023) and organize the 50 Labeled Data and 2000 Pseudo Labeled Data as above.
  • Conduct automatic preprocessing using nnUNet as above.
    nnUNet_plan_and_preprocess -t 23 -pl3d ExperimentPlanner3D_FLARE22Big -pl2d None
    
  • Train a new big nnUNet on all training data instead of using 5-fold cross-validation.
    nnUNet_train 3d_fullres nnUNetTrainerV2_FLARE_Big 23 all -p nnUNetPlansFLARE22Big
    
  • Generate new pseudo labels for 2000 unlabeled data.
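The exact command for this last step is not spelled out above; presumably it mirrors step 1.5 with the new TaskID, with `-f all` selecting the model trained on all training data:

```shell
# Sketch, mirroring step 1.5 with Task023; INPUTS_FOLDER and
# OUTPUTS_FOLDER are placeholders, -f all selects the all-data model.
nnUNet_predict -i INPUTS_FOLDER -o OUTPUTS_FOLDER -t 23 \
  -tr nnUNetTrainerV2_FLARE_Big -m 3d_fullres \
  -p nnUNetPlansFLARE22Big -f all --all_in_gpu True
```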

2. Filter Low-quality Pseudo Labels

We compare the pseudo labels produced in different rounds and filter out the labels with high variance across rounds.

select.ipynb
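To run the notebook non-interactively, one option (assuming Jupyter and nbconvert are installed) is:

```shell
# Execute select.ipynb headlessly and write the executed copy to a new file.
jupyter nbconvert --to notebook --execute \
  --output select_executed.ipynb select.ipynb
```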

3. Train Small nnUNet

3.1. Copy the following files in this repo to your nnUNet environment.

FLARE22/nnunet/training/network_training/nnUNetTrainerV2_FLARE.py
FLARE22/nnunet/experiment_planning/experiment_planner_FLARE22Small.py

3.2. Prepare 50 Labeled Data and 1924 Selected Pseudo Labeled Data of FLARE

Give a new TaskID (e.g. Task026) and organize the 50 Labeled Data and 1924 Pseudo Labeled Data as above.

3.3. Conduct automatic preprocessing using nnUNet

Here we use the plan designed for small nnUNet.

nnUNet_plan_and_preprocess -t 26 -pl3d ExperimentPlanner3D_FLARE22Small -pl2d None

3.4. Train small nnUNet on all training data

nnUNet_train 3d_fullres nnUNetTrainerV2_FLARE_Small 26 all -p nnUNetPlansFLARE22Small

4. Do Efficient Inference with Small nnUNet

We have modified many parts of the nnU-Net source code for efficiency. Please back up your existing nnU-Net code first, then copy this whole repository into your nnU-Net environment.

nnUNet_predict -i INPUT_FOLDER  -o OUTPUT_FOLDER  -t 26  -p nnUNetPlansFLARE22Small   -m 3d_fullres \
 -tr nnUNetTrainerV2_FLARE_Small  -f all  --mode fastest --disable_tta

Citations

If you find this repository useful, please consider citing our paper:

@incollection{huang2023revisiting,
  title={Revisiting nnU-Net for Iterative Pseudo Labeling and Efficient Sliding Window Inference},
  author={Huang, Ziyan and Wang, Haoyu and Ye, Jin and Niu, Jingqi and Tu, Can and Yang, Yuncheng and Du, Shiyi and Deng, Zhongying and Gu, Lixu and He, Junjun},
  booktitle={Fast and Low-Resource Semi-supervised Abdominal Organ Segmentation: MICCAI 2022 Challenge, FLARE 2022, Held in Conjunction with MICCAI 2022, Singapore, September 22, 2022, Proceedings},
  pages={178--189},
  year={2023},
  publisher={Springer}
}


flare22's Issues

About code on nnUNetv2

Thanks very much for your excellent work. Have you tried updating the code to be compatible with nnUNetv2?
Since nnUNetv2 involved a massive refactor of the previous code, I'm not sure whether your work will run on the new release.

About submitting MSD validation-set results on Grand Challenge

Hello,
I am an undergraduate computer science student at Huazhong University of Science and Technology; apologies for contacting you this way.
I am submitting validation-set predictions to MSD, but I ran into this error:
evalutils.exceptions.FileLoaderError: Could not load and files in /input/Task01_BrainTumour with <evalutils.io.SimpleITKLoader object at 0x7fedc5ae6cf8>.
However, all the predicted labels I submitted are nii.gz files, yet the error above still occurs, and I do not know the cause.
Since you rank highly on the submission leaderboard, I would like to ask how you saved and packaged your submission files. Any guidance would be greatly appreciated.

April 5, 2023

Pseudo label selection

Based on the code provided in https://github.com/Ziyan-Huang/FLARE22/blob/main/select.ipynb, it appears that three rounds of pseudo-labels are generated. However, the instructions imply that only the first round (involving the model trained with labeled data) and the second round (involving the model trained with the first round of pseudo-labels and labeled data) produce pseudo-label results. Therefore, the process for conducting the third round of iterative training remains unclear. Could you please clarify how to proceed with the third round of training in this iterative process?

Pre-trained model

Thanks very much for your excellent work.
Can you provide the pre-trained model? I would like to apply it to my unlabeled data for a downstream task.
Much appreciated.
