
era_v1_session18's Introduction

Purpose: Use a Residual block and create a CNN model for training on the CIFAR-10 dataset.

Based on the CIFAR-10 and MNIST datasets

Details:

The first part of the assignment is to train your own UNet from scratch; you can use the dataset and strategy provided in this link. However, you need to train it 4 times:

  • MP+Tr+BCE
  • MP+Tr+Dice Loss
  • StrConv+Tr+BCE
  • StrConv+Ups+Dice Loss
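The four variants combine two down-sampling choices (MP = max pooling, StrConv = strided convolution), two up-sampling choices (Tr = transposed convolution, Ups = interpolation-based upsampling), and two losses (BCE, Dice). A minimal sketch of these building blocks and a soft Dice loss — the layer widths, kernel sizes, and epsilon smoothing here are illustrative assumptions, not the notebook's exact code:

```python
import torch
import torch.nn as nn

def dice_loss(logits, targets, eps=1e-6):
    """Soft Dice loss for binary segmentation masks."""
    probs = torch.sigmoid(logits)
    probs = probs.view(probs.size(0), -1)      # flatten per sample
    targets = targets.view(targets.size(0), -1)
    intersection = (probs * targets).sum(dim=1)
    union = probs.sum(dim=1) + targets.sum(dim=1)
    dice = (2 * intersection + eps) / (union + eps)
    return 1 - dice.mean()

def down_block(in_ch, out_ch, use_maxpool=True):
    """Halve spatial size: MP (max pool + conv) vs StrConv (strided conv)."""
    if use_maxpool:
        return nn.Sequential(nn.MaxPool2d(2),
                             nn.Conv2d(in_ch, out_ch, 3, padding=1))
    return nn.Conv2d(in_ch, out_ch, 3, stride=2, padding=1)

def up_block(in_ch, out_ch, use_transpose=True):
    """Double spatial size: Tr (transposed conv) vs Ups (bilinear + conv)."""
    if use_transpose:
        return nn.ConvTranspose2d(in_ch, out_ch, 2, stride=2)
    return nn.Sequential(
        nn.Upsample(scale_factor=2, mode="bilinear", align_corners=False),
        nn.Conv2d(in_ch, out_ch, 3, padding=1))
```

Each of the four training runs swaps one down-sampling block, one up-sampling block, and one loss into the same UNet skeleton.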

  • Use this transform: RandomCrop 32, 32 (after padding of 4) >> FlipLR >> followed by CutOut(16, 16)
  • Batch size = 512
  • Use SGD and CrossEntropyLoss

Project Setup:

Clone the project as shown below:

$ git clone git@github.com:pankaja0285/era_v1_session18.git

About the file structure
|__ session18_bce_dice_unet_segmentation.ipynb
|__ session18_VAE_PL.ipynb
|__ README.md

NOTE: List of libraries required:

  • torch
  • torchvision
  • lightning-bolts
    Installing lightning-bolts pulls in its dependencies, i.e. pytorch-lightning (1.9.5) and lightning-bolts (0.7.0). Install as:

!pip install torch torchvision
!pip install lightning-bolts

Either of two ways to run any of the notebooks, for instance the session18_bce_dice_unet_segmentation.ipynb notebook:

  1. Using the Anaconda prompt - run it as an administrator, start Jupyter from the folder era_v1_session18, and run the notebook off your localhost:
     NOTE: Without admin privileges the installs may not complete correctly, and subsequent library imports will fail.

     jupyter notebook

  2. Upload the notebook folder era_v1_session18 to Google Colab at colab.research.google.com and run it on Colab.

In session18_bce_dice_unet_segmentation.ipynb - use case with RandomCrop only:

Target: Train a UNet model from scratch and apply BCE and Dice loss.

Results:

  • Total Trainable parameters: 174 M
  • Train loss as low as possible

Analysis:

  • To see how the different down-/up-sampling variants behave, and the effects of the Dice and BCE losses respectively.
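When comparing the BCE and Dice criteria, a common formulation combines both terms into a single loss; the equal weighting below is an assumption for illustration, not necessarily what the notebook uses:

```python
import torch
import torch.nn.functional as F

def bce_dice_loss(logits, targets, eps=1e-6):
    """Combined BCE + soft Dice loss for binary segmentation.

    Equal weighting of the two terms is assumed here; a weighted sum
    (alpha * bce + (1 - alpha) * dice) is an easy generalization.
    """
    bce = F.binary_cross_entropy_with_logits(logits, targets)
    probs = torch.sigmoid(logits).flatten(1)   # (N, H*W)
    t = targets.flatten(1)
    inter = (probs * t).sum(1)
    dice = (2 * inter + eps) / (probs.sum(1) + t.sum(1) + eps)
    return bce + (1 - dice.mean())
```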

In session18_VAE_PL.ipynb - here a Variational Autoencoder (VAE) is implemented

Target: Use a ResNet-18 encoder/decoder and see how the model behaves.

Results:

  • Total parameters: 20.1 M
  • Train loss as low as possible

Analysis:

  • To see how the reconstructions work based on the VAE.
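The core VAE mechanics — the reparameterization trick and the ELBO loss — can be sketched as below. A toy MLP stands in for the ResNet-18 encoder/decoder used in the notebook, and the layer sizes are assumptions:

```python
import torch
import torch.nn as nn
import torch.nn.functional as F

class TinyVAE(nn.Module):
    """Minimal VAE sketch (toy MLP in place of the ResNet-18 backbone)."""
    def __init__(self, in_dim=784, latent_dim=16):
        super().__init__()
        self.enc = nn.Linear(in_dim, 64)
        self.fc_mu = nn.Linear(64, latent_dim)
        self.fc_logvar = nn.Linear(64, latent_dim)
        self.dec = nn.Sequential(nn.Linear(latent_dim, 64), nn.ReLU(),
                                 nn.Linear(64, in_dim))

    def forward(self, x):
        h = F.relu(self.enc(x))
        mu, logvar = self.fc_mu(h), self.fc_logvar(h)
        # Reparameterization trick: z = mu + sigma * eps, eps ~ N(0, I)
        std = torch.exp(0.5 * logvar)
        z = mu + std * torch.randn_like(std)
        return self.dec(z), mu, logvar

def vae_loss(recon, x, mu, logvar):
    """Negative ELBO: reconstruction term + KL(q(z|x) || N(0, I))."""
    recon_loss = F.mse_loss(recon, x, reduction="sum")
    kl = -0.5 * torch.sum(1 + logvar - mu.pow(2) - logvar.exp())
    return recon_loss + kl
```

The pl_bolts VAE used in the notebook wraps the same ideas in a LightningModule, with the ResNet-18 encoder producing `mu`/`logvar` and the mirrored decoder reconstructing the image.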

Contributing: This is a combined effort by

Divya GK: @gkdivya
Pankaja Shankar: @pankaja0285

For any questions, bug reports (even typos), and/or feature requests, do not hesitate to contact me or open an issue!
