# Explaining the Implicit Neural Canvas (XINC): Connecting Pixels to Neurons by Tracing their Contributions
This is the official implementation of the paper "Explaining the Implicit Neural Canvas: Connecting Pixels to Neurons by Tracing their Contributions"
This README describes the scripts for constructing neuron contribution maps, training models, and running analysis. For MLP-based INR analysis we use a simple implementation of the FFN network; for CNN-based INR analysis we build on code from the HNeRV repository.
## Setup

- Clone this repository and navigate to it in your terminal.
- Install Python 3.9.
- Install the dependencies:

```shell
pip install -r requirements.txt
```
## Contribution Maps

The script `image_inr/get_mlp_mappings.py` provides functions for computing the contribution maps of FFN Layer 3, Layer 2, and Layer 1 neurons. It uses the intermediate results gathered from the FFN model class (found in `image_inr/models/ffn.py`).
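The output-layer case is the simplest to see: since each output pixel is a weighted sum of the final hidden activations, each neuron's contribution map is just its term in that sum. A minimal NumPy sketch of this decomposition (the function name and shapes are illustrative, not the repository's actual API):

```python
import numpy as np

def last_layer_contribution_maps(hidden, weights):
    """Per-neuron contribution maps for the output layer of an MLP INR.

    hidden:  (H*W, C) penultimate-layer activations, one row per pixel
    weights: (C, out_dim) output-layer weight matrix

    The (bias-free) output is sum_i hidden[:, i] * weights[i], so neuron i's
    contribution map is simply the i-th term of that sum.
    """
    # (H*W, C, out_dim): element-wise products before summing over neurons
    return hidden[:, :, None] * weights[None, :, :]

# tiny example: 4 "pixels", 3 hidden neurons, RGB output
h = np.random.randn(4, 3)
w = np.random.randn(3, 3)
maps = last_layer_contribution_maps(h, w)
# summing over neurons recovers the (bias-free) layer output
assert np.allclose(maps.sum(axis=1), h @ w)
```

Earlier layers require propagating these per-neuron terms through the subsequent weights, which is what the mapping script handles.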
The script `HNeRV/get_mappings.py` provides functions for computing the contribution maps for all NeRV head layer and Block 3 neurons. It uses the intermediate network results gathered from the HNeRV model class (found in `HNeRV/model_all.py`).
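Because convolution is linear in its input channels, a convolutional head layer's output can likewise be split into per-channel contribution maps that sum back to the full output. A hedged PyTorch sketch of this decomposition (names, shapes, and the 3×3 kernel are illustrative assumptions; the repository's actual computation lives in `HNeRV/get_mappings.py`):

```python
import torch
import torch.nn.functional as F

def head_channel_contributions(feat, weight, padding=1):
    """Per-input-channel contribution maps of a conv head layer.

    feat:   (1, C, H, W) features entering the head convolution
    weight: (out_ch, C, k, k) head convolution weight

    Running the convolution one input channel at a time (with the matching
    weight slice) yields C maps that sum to the full (bias-free) output.
    """
    maps = [
        F.conv2d(feat[:, c:c + 1], weight[:, c:c + 1], padding=padding)
        for c in range(feat.shape[1])
    ]
    return torch.stack(maps, dim=1)  # (1, C, out_ch, H, W)

feat = torch.randn(1, 6, 8, 16)
w = torch.randn(3, 6, 3, 3)
maps = head_channel_contributions(feat, w)
# the per-channel maps sum back to the full convolution output
full = F.conv2d(feat, w, padding=1)
assert torch.allclose(maps.sum(dim=1), full, atol=1e-5)
```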
## Datasets

To use the dataloaders, download the panoptic video datasets into a folder called `data/` at the root of the code folder: place the Cityscapes-VPS dataset in `data/cityscapes_vps` and the VIPSeg dataset in `data/VIPSeg-Dataset`, following the instructions on their respective websites.
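Assuming the paths above, the resulting layout should look like:

```
data/
├── cityscapes_vps/
└── VIPSeg-Dataset/
```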
Single-image dataset definitions for Cityscapes-VPS and VIPSeg (with panoptic annotations) can be found in `image_inr/image_vps_datasets.py`.
Video dataset definitions for Cityscapes-VPS and VIPSeg (with panoptic annotations), along with a flow-based dataset definition, can be found in `HNeRV/vps_datasets.py`.
## Training

The MLP training framework (supports training various types of MLP-based INRs, including FFN) is `image_inr/main.py`; the associated config files can be found in `image_inr/configs`. Example command:

```shell
python main.py data.data_shape=[128,256] data.data_path=<path to image file> network=ffn network.layer_size=104 trainer.num_iters=1000 logging.checkpoint.logdir=<path to save folder>
```
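FFN here refers to a Fourier-feature MLP (Tancik et al.): pixel coordinates are passed through random sinusoidal features before the MLP proper. A minimal NumPy sketch of that input encoding, with shapes matching the `data.data_shape=[128,256]` example above (the feature count and Gaussian scale are illustrative choices, not the repository's defaults):

```python
import numpy as np

def fourier_features(coords, B):
    """Map 2-D pixel coordinates through random Fourier features."""
    proj = 2 * np.pi * coords @ B.T                  # (N, num_features)
    return np.concatenate([np.sin(proj), np.cos(proj)], axis=-1)

rng = np.random.default_rng(0)
B = rng.normal(scale=10.0, size=(64, 2))             # Gaussian feature matrix
coords = np.stack(np.meshgrid(np.linspace(0, 1, 128),
                              np.linspace(0, 1, 256),
                              indexing="ij"), axis=-1).reshape(-1, 2)
feats = fourier_features(coords, B)                  # MLP input: (128*256, 128)
assert feats.shape == (128 * 256, 128)
```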
The original NeRV/HNeRV training script, with the modifications necessary for the INR analysis experiments, is `HNeRV/train_nerv_all.py`. Example command:

```shell
python train_nerv_all.py --outf <path to save folder> --data_path <path to video> --conv_type convnext pshuffel --act gelu --norm none --crop_list 640_1280 --resize_list 128_256 --embed pe_1.25_80 --fc_hw 8_16 --dec_strds 4 2 2 --ks 0_3_3 --reduce 1.2 --skip_mssim --modelsize 1.0 -e <num_epochs> --lower_width 6 -b 2 --lr <lr>
```
## Analysis

The MLP INR analysis iPython notebooks and scripts can be found under `image_inr/analysis`; the CNN INR analysis notebooks and scripts are in the `HNeRV/analysis` directory.
## Citation

If you find our work useful, please consider citing:

```bibtex
@misc{padmanabhan2024explaining,
      title={Explaining the Implicit Neural Canvas: Connecting Pixels to Neurons by Tracing their Contributions},
      author={Namitha Padmanabhan and Matthew Gwilliam and Pulkit Kumar and Shishira R Maiya and Max Ehrlich and Abhinav Shrivastava},
      year={2024},
      eprint={2401.10217},
      archivePrefix={arXiv},
      primaryClass={cs.CV}
}
```