Comments (8)
Hmm. I have never actually tested sparseconvnet.Convolution with 1x1 filters.
For 1x1 convolutions the sparsity pattern does not change, so it is better to
use 'sparseconvnet.SubmanifoldConvolution' with filter size 1,
or (even better) 'sparseconvnet.NetworkInNetwork', which is equivalent but optimized for the 1x1 case.
See for example
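The point about the sparsity pattern can be illustrated without the library itself. A plain-NumPy sketch (this is not the sparseconvnet API, just a toy demonstration) of why a 1x1 convolution is a per-site channel mixing that leaves the set of active sites unchanged:

```python
import numpy as np

# Toy illustration: a 1x1 convolution mixes channels at each spatial site
# independently, so it can be applied to the active sites alone, and
# inactive sites stay exactly zero (the sparsity pattern is preserved).
rng = np.random.default_rng(0)
C_in, C_out, H, W = 3, 5, 8, 8

x = np.zeros((C_in, H, W))
active = [(1, 2), (4, 4), (6, 7)]       # the few active sites
for (i, j) in active:
    x[:, i, j] = rng.normal(size=C_in)

w = rng.normal(size=(C_out, C_in))      # a 1x1 kernel is just a matrix

# Dense 1x1 convolution: the same matrix applied at every site.
dense = np.einsum('oc,chw->ohw', w, x)

# Sparse version: apply the matrix only at the active sites.
sparse = np.zeros((C_out, H, W))
for (i, j) in active:
    sparse[:, i, j] = w @ x[:, i, j]

assert np.allclose(dense, sparse)       # identical result
# Only the originally active sites are non-zero in the output.
assert np.count_nonzero(dense.reshape(C_out, -1).any(axis=0)) == len(active)
```

This is exactly why a submanifold/NetworkInNetwork-style operator suffices for 1x1: there is no receptive-field growth, so no new sites can become active.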
Thank you. : )
Hello, @btgraham. I now have a ten-layer standard convolutional network, and the input is indeed sparse (only about 5% of the input is active; the input size is Nx7x200x400). However, I found that the plain PyTorch implementation is several times faster than the sparseconvnet implementation. The latter consists of repeated SubmanifoldConvolution and BatchNormReLU layers. GPU utilization is about 95% in both cases.
Is this performance gap normal (since PyTorch uses highly optimized cuDNN libraries), or have I done something wrong?
It will depend on the network, and how the sparsity changes through the network (due to convolutions, pooling, etc). I can take a look at the network, and a typical input, if you show me.
SparseConvNet is not as heavily optimised as the PyTorch/cuDNN kernels, so for training relatively small networks on the GPU, dense convolutions may be faster. Sparsity matters more in 3D, or when running on the CPU or low-power hardware.
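The tradeoff can be made concrete with back-of-the-envelope arithmetic. At 5% density, a sparse implementation does about 20x fewer multiply-accumulates per layer, so a dense implementation can only win if its kernels run far closer to peak throughput, which is exactly the situation with cuDNN on the GPU (illustrative numbers below; the channel count 64 is my own placeholder):

```python
# Rough multiply-accumulate counts for one 3x3 convolution layer on the
# input discussed above (7 channels in, a placeholder 64 out, 200x400
# spatial, ~5% active). Illustrative only: real speed depends on how
# efficiently the kernels run, not just on the FLOP count.
c_in, c_out = 7, 64
h, w = 200, 400
density = 0.05

dense_macs = h * w * 9 * c_in * c_out                   # every site, 3x3 kernel
sparse_macs = int(h * w * density) * 9 * c_in * c_out   # active sites only

print(dense_macs / sparse_macs)  # → 20.0
```

A dense kernel that runs more than ~20x faster per MAC than the sparse one still wins overall, despite doing 20x the work.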
The network is not deep, and it does not change the sparsity pattern much. I have profiled the program, and it seems the convolution backward kernels are much slower than the PyTorch kernels.
Could you please give me your email? I'll send you my code. I have cleaned and simplified it, and it should be easy to run with "python main.py".
@LaiSongxuan Hello, may I ask for some details about the PyTorch implementation that is several times faster than the sparseconvnet one?
Do you just use a stack of torch.nn.Conv2d or torch.nn.Conv3d layers in your PyTorch implementation?
@bigsheep2012 Right. I use repeated layers of nn.Conv2d, nn.BatchNorm2d, and nn.ReLU.
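For concreteness, a minimal sketch of that kind of dense baseline (the layer widths here are my own placeholders, not the actual network from the thread):

```python
import torch
import torch.nn as nn

# A stack of Conv2d -> BatchNorm2d -> ReLU blocks, the dense counterpart
# of repeated SubmanifoldConvolution + BatchNormReLU layers. Inactive
# sites are simply zeros in the dense tensor.
def conv_block(c_in, c_out):
    return nn.Sequential(
        nn.Conv2d(c_in, c_out, kernel_size=3, padding=1, bias=False),
        nn.BatchNorm2d(c_out),
        nn.ReLU(inplace=True),
    )

net = nn.Sequential(conv_block(7, 32), conv_block(32, 32), conv_block(32, 64))

x = torch.zeros(1, 7, 200, 400)  # dense input matching the 7x200x400 size
y = net(x)
print(y.shape)  # → torch.Size([1, 64, 200, 400])
```

With padding=1 the spatial size is preserved, so the comparison with the sparse network is like-for-like layer by layer.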