PIP-Net: Patch-based Intuitive Prototypes Network for Interpretable Image Classification (CVPR 2023)
Hello, and thanks for your fascinating work!
Would you mind uploading a ResNet50 model (or, even better, one for Cars and one for CUB), and/or providing the arguments needed to reproduce the results with that architecture? Or are the ResNet50 parameters the same as the defaults, apart from the differences mentioned in the paper?
Greetings
Hello @M-Nauta!
Thanks for providing the implementation code for the PIP-Net paper. It seems that the repository currently has no LICENSE associated with this implementation. I also noticed that some scripts, such as features/resnet_features.py and preprocess_data/cub.py from your ProtoTree repository (which uses the MIT License), are present here as well. However, I am not sure whether you intended this repository's code to be released under a different license.
So, I would like to know the current license for this repository's code. Thanks in advance for your help!
Hey,
I'm trying to train a PIP-Net on CUB following the README.md. I prepared the CUB dataset directory using the code from ProtoTree, which resulted in a dataset directory (inside the CUB root directory) that contains these 4 directories:
test_crop
test_full
train_corners
train_crop
When running main.py, I get an error on line 161 in data.py:
FileNotFoundError: [Errno 2] No such file or directory: '/fastdata/MT_ExplSeg/datasets/CUB_200_2011/dataset/train'
This is because line 21 of data.py defines the 3rd argument of get_birds as './data/CUB_200_2011/dataset/train'.
My question: Am I supposed to rename the directory train_corners to train, or did I miss a step here?
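In the meantime, here is the quick sanity check I run to see which of the expected split directories actually exist (just a sketch; the root path and the split names mirror my local setup described above, not anything prescribed by the repo):

```python
import os

def check_splits(root, splits):
    # Report which expected split directories exist under the dataset root.
    return {split: os.path.isdir(os.path.join(root, split)) for split in splits}

# The path hard-coded in data.py points at <root>/train, which the ProtoTree
# preprocessing did not create (it produced train_corners and train_crop).
root = "./data/CUB_200_2011/dataset"
print(check_splits(root, ["train", "train_corners", "train_crop", "test_crop", "test_full"]))
```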
Thanks in advance!
Hey,
I tried training PIP-Net on the CUB dataset. I followed the README.md and used the default arguments. The only changes I made were reducing the batch sizes and learning rates, since I only have 12 GB of VRAM available. Specifically, I set
args.batch_size = 8
args.batch_size_pretrain = 16
args.lr = 0.05 / 8
args.lr_block = 0.0005 / 8
args.lr_net = 0.0005 / 8
which is a reduction by a factor of 8 for all arguments.
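For reference, the reduction can be written as the following sketch (the listed defaults of 64/128 for the batch sizes and 0.05/0.0005/0.0005 for the learning rates are simply the pre-division values implied by my factor-8 reduction, not values I verified against the repo):

```python
# Pre-division values implied by the factor-8 reduction above (assumed defaults).
DEFAULTS = {
    "batch_size": 64,
    "batch_size_pretrain": 128,
    "lr": 0.05,
    "lr_block": 0.0005,
    "lr_net": 0.0005,
}

def scale_hparams(defaults, factor):
    # Divide batch sizes and learning rates by the same factor, keeping the
    # learning-rate-to-batch-size ratio constant (linear scaling rule).
    return {name: value / factor for name, value in defaults.items()}

scaled = scale_hparams(DEFAULTS, 8)
print(scaled["batch_size"], scaled["lr"])  # 8.0 0.00625
```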
The rest of the code is unchanged, and I pulled the recent version with the new CUB preprocessing. The code executes without any errors.
Looking at log_epoch_overview.csv, I can see test_top1_acc dropping from 0.023 in epoch 1 to 0.005 in epoch 12 and stagnating afterwards. mean_train_loss_during_epoch stays roughly constant at around 11 ± 1 for all epochs.
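This is how I pull the metric columns out of the log, in case it helps reproduce my numbers (a sketch only; it assumes log_epoch_overview.csv is comma-separated with the metric names as header columns, and that unlogged cells are empty or "n.a."):

```python
import csv
import io

def read_metric(rows, name):
    # Collect one metric column as floats, skipping epochs where the
    # metric was not logged (empty or "n.a." cells).
    return [float(row[name]) for row in csv.DictReader(rows)
            if row.get(name) not in (None, "", "n.a.")]

# On the real file: read_metric(open("log_epoch_overview.csv"), "test_top1_acc")
demo = io.StringIO("epoch,test_top1_acc\n1,0.023\n12,0.005\n")
print(read_metric(demo, "test_top1_acc"))  # [0.023, 0.005]
```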
I also tested the code on different GPUs and tried different learning rates, but the problem remains. Since I only applied the changes mentioned above, I'm very confused as to why the model isn't learning at all.
Do you have any idea what the problem might be here? And can you recommend further ways to debug?
Also, would it be possible to provide a pretrained model (or maybe even the entire run directory for a pretrained model)?
Thanks a lot in advance!
Best
Max
Hi Meike,
after working with your code, I think I have found a critical bug in training/testing.
As far as I understand, classification weights close to 0 should be clamped to be at least 1e-3. This is done in main.py:236, train.py:91 and test.py:49.
The problem is that even weights larger than 1e-3 get reduced by 1e-3. Since this happens inside the loops over the train/test loaders, the weights are reduced many times, which obviously leads to wrong classification results.
I especially noticed the effects in test.py, as my accuracy dropped dramatically. I'm currently extending PIP-Net to segmentation, and my mIoUs dropped to near chance. I haven't looked into the effects on your default PIP-Net configuration for classification.
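To illustrate the effect, here is a minimal sketch (not the actual PIP-Net code; the weight values and batch count are made up, and the "fixed" version assumes the intent is to zero out near-threshold weights rather than to repeatedly shrink all weights):

```python
EPS = 1e-3

def shrink_every_batch(weights, num_batches):
    # Current behavior as I read it: EPS is subtracted from EVERY weight on
    # every iteration of the data-loader loop, so large weights keep shrinking.
    for _ in range(num_batches):
        weights = [max(w - EPS, 0.0) for w in weights]
    return weights

def zero_small_weights(weights, num_batches):
    # My assumption of the intended behavior: weights below EPS are set to
    # zero, larger weights are untouched, so repeating it is harmless.
    for _ in range(num_batches):
        weights = [0.0 if w < EPS else w for w in weights]
    return weights

w = [0.5, 0.0005]
print(shrink_every_batch(w, 400))   # large weight shrunk from 0.5 down to ~0.1
print(zero_small_weights(w, 400))   # [0.5, 0.0]
```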
Hope this helps!
Best
Max