Jin Dong: [email protected]; Liheng Ma: [email protected].
This is the assignment codebase for IFT-6135: Representation Learning (Deep Learning), taught by Prof. Aaron Courville at UdeM.
- Implement an MLP with vectorized back-propagation using NumPy.
- Implement a standard CNN for MNIST.
- Implement an improved CNN for MNIST on Kaggle using a DL library (we used MXNet).
- Implement an RNN and a GRU cell using PyTorch.
- Implement the multi-head self-attention module of the Transformer.
- Run experiments under various settings.
- Build a discriminator to approximate the Wasserstein distance, Jensen–Shannon divergence, etc.
- Build a Variational Auto-encoder with binary cross-entropy loss to generate binarized MNIST images.
- Build a Variational Auto-encoder with MSE loss to generate SVHN images.
- Build a Wasserstein GAN with gradient penalty to generate SVHN images.
- Qualitatively and quantitatively analyze the performance of the VAE and GAN.
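To illustrate the vectorized back-propagation part of the first assignment, here is a minimal NumPy sketch of one forward/backward pass through a two-layer MLP with softmax cross-entropy loss. It is not the assignment solution; layer sizes, variable names, and the single hidden layer are illustrative choices.

```python
import numpy as np

def relu(x):
    return np.maximum(0.0, x)

def mlp_backprop(X, Y, W1, b1, W2, b2):
    """One forward/backward pass of a 2-layer MLP with softmax
    cross-entropy loss, vectorized over the batch dimension.
    X: (N, D) inputs, Y: (N,) integer class labels."""
    N = X.shape[0]
    # Forward pass
    Z1 = X @ W1 + b1                            # (N, H) pre-activations
    A1 = relu(Z1)
    Z2 = A1 @ W2 + b2                           # (N, C) logits
    Z2 = Z2 - Z2.max(axis=1, keepdims=True)     # numerical stability
    P = np.exp(Z2) / np.exp(Z2).sum(axis=1, keepdims=True)
    loss = -np.log(P[np.arange(N), Y]).mean()
    # Backward pass (gradients averaged over the batch)
    dZ2 = P.copy()
    dZ2[np.arange(N), Y] -= 1.0                 # softmax + CE gradient
    dZ2 /= N
    dW2 = A1.T @ dZ2
    db2 = dZ2.sum(axis=0)
    dA1 = dZ2 @ W2.T
    dZ1 = dA1 * (Z1 > 0)                        # ReLU gradient
    dW1 = X.T @ dZ1
    db1 = dZ1.sum(axis=0)
    return loss, (dW1, db1, dW2, db2)
```

A standard sanity check is to compare an analytic gradient entry against a finite-difference estimate on a tiny random batch.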
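The GRU-cell item follows the standard GRU gate equations; a hedged NumPy sketch of one step is below (the real assignment uses PyTorch — this version only shows the arithmetic, with a hypothetical `params` tuple layout).

```python
import numpy as np

def sigmoid(x):
    return 1.0 / (1.0 + np.exp(-x))

def gru_cell(x, h, params):
    """One GRU step. x: (N, D) input, h: (N, H) previous hidden state.
    params holds the weights of the update (z), reset (r) and
    candidate (n) gates, following the standard GRU equations."""
    Wz, Uz, bz, Wr, Ur, br, Wn, Un, bn = params
    z = sigmoid(x @ Wz + h @ Uz + bz)        # update gate
    r = sigmoid(x @ Wr + h @ Ur + br)        # reset gate
    n = np.tanh(x @ Wn + (r * h) @ Un + bn)  # candidate state
    return (1.0 - z) * n + z * h             # new hidden state
```

With all weights zero, both gates sit at 0.5 and the candidate is 0, so the new state is exactly half the previous one, which makes a convenient smoke test.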
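On the divergence-approximation item: given a trained discriminator with outputs in (0, 1), its objective value yields a lower bound on the Jensen–Shannon divergence between the real and fake distributions. A small NumPy helper for that estimate (an illustrative sketch, not the assignment's training code):

```python
import numpy as np

def jsd_lower_bound(d_real, d_fake):
    """Estimate log(2) + 0.5*E[log D(x)] + 0.5*E[log(1 - D(y))],
    which lower-bounds JSD(p || q) when D(x), D(y) in (0, 1) are
    discriminator outputs on real and fake samples."""
    return (np.log(2.0)
            + 0.5 * np.log(d_real).mean()
            + 0.5 * np.log1p(-d_fake).mean())
```

When the discriminator cannot tell the two distributions apart (outputs 0.5 everywhere), the bound evaluates to 0, matching JSD of identical distributions.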
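For the binarized-MNIST VAE item, the training objective is the negative ELBO: a binary cross-entropy reconstruction term plus a closed-form Gaussian KL term, with samples drawn via the reparameterization trick. A minimal NumPy sketch of those two pieces (function names and tensor shapes are illustrative):

```python
import numpy as np

def vae_loss(x, x_logits, mu, logvar):
    """Negative ELBO for a Bernoulli decoder: binary cross-entropy
    from logits (summed over pixels, averaged over the batch) plus
    the closed-form KL between N(mu, exp(logvar)) and N(0, I)."""
    # Numerically stable BCE-with-logits
    bce = (np.maximum(x_logits, 0) - x_logits * x
           + np.log1p(np.exp(-np.abs(x_logits))))
    recon = bce.sum(axis=1).mean()
    # KL(N(mu, sigma^2) || N(0, 1)), summed over latent dims
    kl = 0.5 * (np.exp(logvar) + mu**2 - 1.0 - logvar).sum(axis=1).mean()
    return recon + kl

def reparameterize(mu, logvar, rng):
    """z = mu + sigma * eps, eps ~ N(0, I): keeps the latent sample
    differentiable with respect to mu and logvar."""
    eps = rng.normal(size=mu.shape)
    return mu + np.exp(0.5 * logvar) * eps
```

The SVHN variant in the list swaps the Bernoulli reconstruction term for an MSE term; the KL term and reparameterization are unchanged.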