sleepychord / improvedgan-pytorch
Semi-supervised GAN in "Improved Techniques for Training GANs"
Hello,
You mentioned that this code is inspired by the paper "Good Semi-supervised Learning That Requires a Bad GAN".
Have you changed any part of "Improved Techniques for Training GANs" to bring it closer to the bad-GAN paper?
What exactly are the changes? It isn't clear to me.
Thank you.
Is 98.5% the train accuracy or the test accuracy?
I am getting this error:
File "/home/elhamod/melhamodenv/AML/project2/ImprovedGAN-pytorch-master/Nets.py", line 80, in forward
x = F.softplus(self.bn1(self.fc1(x)) + self.bn1_b)
File "/home/elhamod/melhamodenv/lib/python3.6/site-packages/torch/nn/modules/module.py", line 541, in __call__
result = self.forward(*input, **kwargs)
File "/home/elhamod/melhamodenv/lib/python3.6/site-packages/torch/nn/modules/batchnorm.py", line 69, in forward
if self.training and self.track_running_stats:
File "/home/elhamod/melhamodenv/lib/python3.6/site-packages/torch/nn/modules/module.py", line 585, in __getattr__
type(self).__name__, name))
AttributeError: 'BatchNorm1d' object has no attribute 'track_running_stats'
Any ideas how to fix it?
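One likely cause is a version mismatch: `track_running_stats` was added to PyTorch's BatchNorm modules in 0.4, so a model built or saved under an older PyTorch lacks the attribute that the newer `forward` checks. A minimal, hedged workaround (an assumption about the cause, not the repo author's fix) is to restore the attribute with its default value on every BatchNorm module:

```python
import torch
import torch.nn as nn

def patch_batchnorm(model):
    """Add the `track_running_stats` attribute (default True) to any
    BatchNorm module that is missing it, e.g. one created or loaded
    under PyTorch < 0.4."""
    for module in model.modules():
        if isinstance(module, nn.modules.batchnorm._BatchNorm):
            if not hasattr(module, "track_running_stats"):
                module.track_running_stats = True
    return model
```

Calling `patch_batchnorm(model)` once after constructing or loading the network should make `forward` stop raising the `AttributeError`; re-saving the model under your current PyTorch version is the cleaner long-term fix.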
When I run the command `python ImprovedGAN.py --cuda` and train for 100 epochs, the accuracy on the test data is only around 65%, which does not match the 98.5% you mentioned.
Hello,
I tried to feed CIFAR-10 and changed the input size to 3072, but at run time I still get
(0): LinearWeightNorm(in_features=784, out_features=1000, weight_scale=1)
(1): LinearWeightNorm(in_features=1000, out_features=500, weight_scale=1)
(2): LinearWeightNorm(in_features=500, out_features=250, weight_scale=1)
(3): LinearWeightNorm(in_features=250, out_features=250, weight_scale=1)
(4): LinearWeightNorm(in_features=250, out_features=250, weight_scale=1)
so the first layer still expects 784 inputs.
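This usually means the input size is hard-coded in the network definition rather than taken from the argument you changed. A minimal sketch of the idea, assuming a hypothetical `input_dim` constructor argument and plain `nn.Linear` layers in place of the repo's `LinearWeightNorm` (so this is an illustration, not the repo's actual code):

```python
import torch
import torch.nn as nn

class Discriminator(nn.Module):
    # Hypothetical sketch: build every layer size from `input_dim`
    # instead of hard-coding 784, so CIFAR-10 (3*32*32 = 3072) works.
    def __init__(self, input_dim=28 * 28, output_dim=10):
        super().__init__()
        sizes = [input_dim, 1000, 500, 250, 250, 250]
        self.layers = nn.ModuleList(
            nn.Linear(n_in, n_out) for n_in, n_out in zip(sizes, sizes[1:])
        )
        self.final = nn.Linear(sizes[-1], output_dim)

    def forward(self, x):
        x = x.view(x.size(0), -1)  # flatten images to (batch, input_dim)
        for layer in self.layers:
            x = torch.relu(layer(x))
        return self.final(x)
```

With this pattern, `Discriminator(input_dim=3072)` would give a first layer with 3072 inputs; the fix in the repo would be to thread the same value through wherever 784 appears (and likewise for the generator's output size).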
ImprovedGAN-pytorch/ImprovedGAN.py
Line 77 in bc14e80
When I run ImprovedGAN.py, an error is raised:
AttributeError: 'TensorDataset' object has no attribute 'data_tensor'
How can I solve this problem?
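This is another PyTorch version issue: `TensorDataset.data_tensor` and `.target_tensor` were removed around PyTorch 0.4 in favor of the `tensors` tuple. A small compatibility shim (a hedged sketch, with `dataset_tensors` being a helper name I made up) can cover both versions:

```python
import torch
from torch.utils.data import TensorDataset

def dataset_tensors(ds):
    """Return (data, targets) from a TensorDataset under both old and
    new PyTorch APIs."""
    if hasattr(ds, "tensors"):          # PyTorch >= 0.4
        return ds.tensors
    return ds.data_tensor, ds.target_tensor  # PyTorch < 0.4
```

Alternatively, just replace every `xxx.data_tensor` / `xxx.target_tensor` in the repo with `xxx.tensors[0]` / `xxx.tensors[1]`.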
Dear author, thanks for sharing the improved GAN in PyTorch. In README.md you said: "Default configs can train models achieving 98.5% accuracy on test dataset with 100 labeled data (10 per class) and other 59,000 unlabeled data after 100 epochs."
However, in your implementation the unlabeled samples are the full MNIST training set. These two statements seem inconsistent. Which is correct?
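For reference, the split the README describes (10 labeled examples per class, the rest unlabeled) might be sampled like this. This is only a sketch of that setup under my own assumptions, not the repository's actual data-loading code:

```python
import torch

def split_labeled(labels, per_class=10, num_classes=10, seed=0):
    """Pick `per_class` labeled indices per class; all remaining
    indices form the unlabeled pool."""
    g = torch.Generator().manual_seed(seed)
    labeled = []
    for c in range(num_classes):
        idx = (labels == c).nonzero(as_tuple=True)[0]
        perm = idx[torch.randperm(idx.numel(), generator=g)]
        labeled.append(perm[:per_class])
    labeled = torch.cat(labeled)
    mask = torch.ones(labels.numel(), dtype=torch.bool)
    mask[labeled] = False
    unlabeled = mask.nonzero(as_tuple=True)[0]
    return labeled, unlabeled
```

Note that whether the labeled 100 are also kept in the unlabeled pool (as in the implementation) or excluded from it (as the README's "other 59,000" wording suggests) typically makes little practical difference at MNIST scale, but the two descriptions should agree.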
When I run this code, partial results are as follows:
Iteration 8, loss_supervised = 0.5682, loss_unsupervised = 0.0012, loss_gen = 1.8700 train acc = 0.7541
Eval: correct 6277/ 10000
.......
Iteration 34, loss_supervised = 0.8746, loss_unsupervised = 0.0009, loss_gen = 8.1296 train acc = 0.5644
Eval: correct 8050/ 1800
.......
Iteration 48, loss_supervised = 0.7172, loss_unsupervised = 0.0002, loss_gen = 7.7574 train acc = 0.6869
Eval: correct 8056/ 1800
........
After that, the results change very little.