The first part of the assignment is to train your own UNet from scratch; you can use the dataset and strategy provided in this link. However, you need to train it 4 times, once per variant:
- MP + Tr + BCE
- MP + Tr + Dice Loss
- StrConv + Tr + BCE
- StrConv + Ups + Dice Loss
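Here MP = MaxPool and StrConv = strided convolution (the two downsampling choices), Tr = transposed convolution and Ups = upsampling (the two decoder choices), and BCE/Dice are the two losses. A minimal sketch of how the two contraction and two expansion choices can be toggled inside one UNet implementation (class and argument names are illustrative, not taken from the repo):

```python
import torch.nn as nn

class Down(nn.Module):
    """Downsample by 2x: 'mp' = MaxPool, 'strconv' = strided convolution."""
    def __init__(self, channels, mode="mp"):
        super().__init__()
        if mode == "mp":
            self.down = nn.MaxPool2d(kernel_size=2)
        else:
            # a stride-2 3x3 conv halves H and W while keeping channel count
            self.down = nn.Conv2d(channels, channels, kernel_size=3, stride=2, padding=1)

    def forward(self, x):
        return self.down(x)


class Up(nn.Module):
    """Upsample by 2x: 'tr' = transposed convolution, 'ups' = upsampling."""
    def __init__(self, in_channels, out_channels, mode="tr"):
        super().__init__()
        if mode == "tr":
            self.up = nn.ConvTranspose2d(in_channels, out_channels, kernel_size=2, stride=2)
        else:
            # parameter-free bilinear upsample, then a 1x1 conv to reduce channels
            self.up = nn.Sequential(
                nn.Upsample(scale_factor=2, mode="bilinear", align_corners=True),
                nn.Conv2d(in_channels, out_channels, kernel_size=1),
            )

    def forward(self, x):
        return self.up(x)
```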
- Use this transform pipeline: RandomCrop 32x32 (after padding of 4) >> FlipLR >> CutOut(16, 16)
- Batch size = 512
- Use SGD and CrossEntropyLoss
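A sketch of that setup in torchvision, assuming 32x32 3-channel images. Torchvision has no built-in CutOut, so RandomErasing of one square patch covering 25% of the image area (i.e. 16x16 on a 32x32 input) stands in for it; the model, learning rate, and momentum are placeholders, not values from the repo:

```python
import torch
import torch.nn as nn
import torchvision.transforms as T

train_transform = T.Compose([
    T.RandomCrop(32, padding=4),   # RandomCrop 32x32 after padding of 4
    T.RandomHorizontalFlip(),      # FlipLR
    T.ToTensor(),
    # CutOut(16, 16) stand-in: erase one square patch, 25% of a 32x32 image
    T.RandomErasing(p=1.0, scale=(0.25, 0.25), ratio=(1.0, 1.0)),
])

model = nn.Sequential(nn.Flatten(), nn.Linear(3 * 32 * 32, 10))  # placeholder model
criterion = nn.CrossEntropyLoss()
optimizer = torch.optim.SGD(model.parameters(), lr=0.01, momentum=0.9)  # illustrative values
```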
Clone the project as shown below:
$ git clone git@github.com:pankaja0285/era_v1_session18.git
About the file structure
|__session18_bce_dice_unet_segmentation.ipynb
|__session18_VAE_PL.ipynb
|__README.md
NOTE: List of libraries required:
- torch
- torchvision
- lightning-bolts
Installing lightning-bolts pulls in the required dependencies, namely pytorch-lightning (1.9.5) and lightning-bolts (0.7.0). Install as:
!pip install torch torchvision
!pip install lightning-bolts
Run any of the notebooks, for instance the session18_bce_dice_unet_segmentation.ipynb notebook, in one of two ways:
- Using the Anaconda prompt: run it as an administrator, start Jupyter Notebook from the folder era_v1_session18, and run the notebook off your localhost:
NOTE: Without admin privileges, the installs will not complete correctly and subsequent library imports will fail.
jupyter notebook
- Upload the notebook folder era_v1_session18 to Google Colab at colab.research.google.com and run it on Colab
Target:
- To train a UNet model from scratch and apply BCE and Dice loss
Results:
- Total Trainable parameters: 174 M
- Train loss as low as possible
Analysis:
- To see how the transformers work and the effects of Dice and BCE loss respectively.
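BCE is available out of the box as nn.BCEWithLogitsLoss, but Dice loss is not built into PyTorch. A common minimal implementation over sigmoid probabilities (the smoothing constant is a conventional choice, not a value from the repo):

```python
import torch

def dice_loss(logits, targets, smooth=1.0):
    """Soft Dice loss for binary segmentation.

    logits:  raw model outputs of shape (N, 1, H, W)
    targets: binary {0, 1} masks of the same shape
    """
    probs = torch.sigmoid(logits).flatten(1)   # (N, H*W)
    targets = targets.flatten(1)
    intersection = (probs * targets).sum(dim=1)
    dice = (2.0 * intersection + smooth) / (probs.sum(dim=1) + targets.sum(dim=1) + smooth)
    return 1.0 - dice.mean()
```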
Target:
- To use a ResNet18 encoder/decoder and see how the transformers behave
Results:
- Total parameters: 20.1 M
- Train loss as low as possible
Analysis:
- To see how the transformers work based on a VAE
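The VAE shipped with lightning-bolts uses a ResNet18 encoder/decoder by default (enc_type="resnet18"). A minimal training sketch, in which the CIFAR10 dataset, data_dir, and epoch count are assumptions rather than settings from the repo:

```python
import pytorch_lightning as pl
from pl_bolts.datamodules import CIFAR10DataModule
from pl_bolts.models.autoencoders import VAE

vae = VAE(input_height=32, enc_type="resnet18")          # ResNet18 encoder/decoder

dm = CIFAR10DataModule(data_dir=".", batch_size=512)     # dataset choice is an assumption
trainer = pl.Trainer(max_epochs=10, accelerator="auto")  # epoch count is illustrative
trainer.fit(vae, datamodule=dm)
```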
Divya GK: @gkdivya
Pankaja Shankar: @pankaja0285
For any questions, bugs (even typos), and/or feature requests, do not hesitate to contact me or open an issue!