Comments (3)
Yes, we found that using the current order typically gives higher accuracy. The only difference between the two orders in DenseNet is that the first BN layer has scaling and shifting parameters, which provide later layers with differently scaled activations. If we use CONV first instead, the convolutions in different subsequent layers are forced to receive identically scaled activations, which may not be a good thing for training.
from densenet.
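The ordering discussed above can be sketched as a small module. This is a minimal illustrative sketch in PyTorch (not the authors' original Torch/Lua code); the class and parameter names are my own. The point is that BN, with its per-layer learned scale and shift, comes before ReLU and the convolution:

```python
import torch
import torch.nn as nn

class DenseLayer(nn.Module):
    """Composite function in BN-ReLU-Conv (pre-activation) order.

    Because BN comes first, each dense layer applies its own learned
    scale/shift to the shared concatenated features before the ReLU,
    so different layers can see differently scaled versions of the
    same features. With Conv first, every layer's convolution would
    receive identically scaled activations.
    """

    def __init__(self, in_channels: int, growth_rate: int):
        super().__init__()
        self.bn = nn.BatchNorm2d(in_channels)
        self.relu = nn.ReLU(inplace=True)
        self.conv = nn.Conv2d(in_channels, growth_rate,
                              kernel_size=3, padding=1, bias=False)

    def forward(self, x: torch.Tensor) -> torch.Tensor:
        # x is the concatenation of all preceding layers' feature maps
        return self.conv(self.relu(self.bn(x)))
```

A full dense block would concatenate each layer's output with its input before passing it to the next layer; this snippet only shows the composite function whose ordering the thread is about.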
@liuzhuang13 Thanks for the response! Excellent insight, so if I understand correctly:
- You perform BN first because, given that it may learn different parameters, the ReLU activation can differ for each layer, meaning that each layer sees a different version of the same features.
- Another way of seeing it is that you delegate the BN + activation to the upper layers.
I think you could mention this more in the paper. Reading it more closely, I see you do reference the Microsoft paper but don't comment on this point.
Thanks again!
If by "upper layers" you mean "deeper layers (layers farther from the input)", then I think we understand it the same way. Thanks for the suggestion! If there's a newer version of the paper, we'll consider mentioning this more.
Related Issues (20)
- Convolution before entering the first dense block for imagenet dataset HOT 1
- DenseNet on Pascal VOC HOT 2
- results on cifar100 HOT 1
- I tried to reproduce Wide-DenseNet-BC results on cifar10, but got 0.5% more than your error HOT 4
- Why is composite function BN-ReLU-Conv3x3 ? HOT 1
- Pretrained weights for the 0.8M parameters config HOT 1
- Why not share the first BN and ReLU? HOT 2
- The layers within the second and third dense block don't assign the least weight to the outputs of the transition layer in my trained model
- Why we can detach any layer without affecting others in densenet?
- question about standardization HOT 6
- cifar validation loss decrease than increase after learning rate change HOT 4
- Question on channel before entering the first block HOT 2
- Question on impede information flow HOT 1
- Is there a pretrained CIFAR 100 or CIFAR 10 model? HOT 2
- Densenet on CIFAR training from scratch
- Question on the last transition layer
- Receptive field of DenseNet
- image classification
- cannot open </cifar-10-python/data_batch_1>
- Different DenseNet