This project is forked from verified-intelligence/alpha-beta-crown.

α,β-CROWN (alpha-beta-CROWN): A Fast and Scalable Neural Network Verifier with Efficient Bound Propagation

α,β-CROWN (alpha-beta-CROWN) is a neural network verifier based on an efficient linear bound propagation framework and branch and bound. It can be accelerated efficiently on GPUs and can scale to relatively large convolutional networks (e.g., millions of parameters). It also supports a wide range of neural network architectures (e.g., CNN, ResNet, and various activation functions), thanks to our versatile auto_LiRPA library. α,β-CROWN can provide provable robustness guarantees against adversarial attacks and can also verify other general properties of neural networks.

α,β-CROWN is the winning verifier in VNN-COMP 2021 and VNN-COMP 2022 (International Verification of Neural Networks Competition) with the highest total score, outperforming many other neural network verifiers on a wide range of benchmarks over 2 years. Details of competition results can be found in VNN-COMP 2021 slides, report and VNN-COMP 2022 slides (page 73).

α,β-CROWN combines our efforts in neural network verification in a series of papers building up the bound propagation framework during the past five years. See the Publications section below.

Supported Features

Our verifier builds on the core bound propagation and branch-and-bound algorithms described in the Publications section below.

We support these neural network architectures:

  • Layers: fully connected (FC), convolutional (CNN), pooling (average pool and max pool), transposed convolution
  • Activation functions: ReLU (incomplete/complete verification); sigmoid, tanh, arctan, sin, cos, tan (incomplete verification)
  • Residual connections and other irregular graphs

We support the following verification specifications:

  • Lp-norm perturbations (p = 1, 2, ∞, as commonly used in robustness verification)
  • VNNLIB-format input (at most two levels of AND/OR clauses, as used in VNN-COMP 2021 and 2022)
  • Any linear specification on the neural network output (which can be folded in as an extra linear layer)
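For example, the margin property "class 0 outscores class 1" is the linear specification c·y > 0 with c = e₀ − e₁, and a stack of such rows C can be composed with the network's last linear layer so that verification reduces to lower-bounding the new outputs. A minimal NumPy sketch (the weights here are made up purely for illustration):

```python
import numpy as np

# Hypothetical final logits layer (3 classes, 2 features) - illustrative only.
W = np.array([[2.0, -1.0], [0.5, 1.0], [-1.0, 0.2]])
b = np.zeros(3)

# Specification: class 0 beats classes 1 and 2, i.e. C @ y > 0 elementwise.
C = np.array([[1.0, -1.0, 0.0],
              [1.0, 0.0, -1.0]])

# Fold the spec into the network: replace the last layer (W, b) by
# (C @ W, C @ b); the property holds iff every new output is positive.
W_spec, b_spec = C @ W, C @ b

x = np.array([1.0, 0.5])
y = W @ x + b
margins = C @ y
# The folded layer computes exactly the margins of the specification.
assert np.allclose(margins, W_spec @ x + b_spec)
```

Because the specification becomes an ordinary linear layer, the same bound propagation machinery that bounds logits can bound the margins directly.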

We provide many example configurations in the complete_verifier/exp_configs directory to get you started:

  • MNIST: MLP and CNN models
  • CIFAR-10, CIFAR-100, TinyImageNet: CNN and ResNet models
  • ACASXu, NN4sys and other low input-dimension models

See the Guide on Algorithm Selection to find the most suitable example to get started.

Installation and Setup

α,β-CROWN is tested on Python 3.7+ and PyTorch 1.11. It can be installed easily into a conda environment. If you don't have conda, you can install miniconda.

# Remove the old environment, if necessary.
conda deactivate; conda env remove --name alpha-beta-crown
# Install all dependencies into the alpha-beta-crown environment.
conda env create -f complete_verifier/environment.yml --name alpha-beta-crown
# Activate the environment.
conda activate alpha-beta-crown

If you use only the α-CROWN and/or β-CROWN verifiers (which cover most use cases), a Gurobi license is not needed. If you want to use the MIP-based verification algorithms (feasible only for small MLP models), you need to install a Gurobi license with the grbgetkey command. If you don't have access to a full license, the installation procedure above includes a free, size-restricted license by default, which is sufficient for many relatively small NNs. If you use the GCP-CROWN verifier, an installation of the IBM CPLEX solver is required; instructions can be found in the VNN-COMP benchmark instructions or the GCP-CROWN instructions.

If you prefer to install packages manually rather than using a prepared conda environment, you can refer to this installation script.

If you want to run α,β-CROWN verifier on the VNN-COMP 2021 and 2022 benchmarks (e.g., to make a comparison to a new verifier), you can follow this guide.

Instructions

We provide a unified front-end for the verifier, abcrown.py. All parameters for the verifier are defined in a yaml config file. For example, to run robustness verification on a CIFAR-10 ResNet network, you just run:

conda activate alpha-beta-crown  # activate the conda environment
cd complete_verifier
python abcrown.py --config exp_configs/cifar_resnet_2b.yaml

You can find explanations of the most useful parameters in this example config file. For detailed usage and tutorial examples, please see the Usage Documentation. We also provide a large range of examples in the complete_verifier/exp_configs folder.
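As an illustration of what a config looks like, here is a sketch of the typical structure. The field names and values below are assumptions based on the shipped examples, not an authoritative schema; always start from a real file in complete_verifier/exp_configs:

```yaml
# Illustrative sketch only; field names are assumptions - see exp_configs.
model:
  name: resnet2b                      # model architecture
  path: models/cifar10_resnet/resnet2b.pth
data:
  dataset: CIFAR                      # dataset to load verification instances from
specification:
  norm: .inf                          # Linf-norm robustness
  epsilon: 0.00784313725490196        # perturbation radius (2/255)
solver:
  batch_size: 2048                    # subdomains bounded in parallel on the GPU
bab:
  timeout: 120                        # per-instance timeout, in seconds
```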

Publications

If you use our verifier in your work, please kindly cite our CROWN (Zhang et al., 2018), α-CROWN (Xu et al., 2021), β-CROWN (Wang et al., 2021) and GCP-CROWN (Zhang et al., 2022) papers. If your work involves the convex relaxation of NN verification, please kindly cite Salman et al., 2019. If your work deals with ResNet/DenseNet, LSTM (recurrent networks), Transformer or other complex architectures, or model weight perturbations, please kindly cite Xu et al., 2020. If you use our branch-and-bound based adversarial attack (falsifier), please cite Zhang et al., 2022.

α,β-CROWN combines our existing efforts on neural network verification:

  • CROWN (Zhang et al. NeurIPS 2018) is a very efficient bound propagation based verification algorithm. CROWN propagates a linear inequality backwards through the network and utilizes linear bounds to relax activation functions.
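The backward pass can be illustrated on a two-layer ReLU network. The following self-contained NumPy sketch is a simplified vanilla CROWN (adaptive lower slope, intermediate bounds from plain interval arithmetic), not the optimized implementation in auto_LiRPA; it computes a sound lower bound on each output over an input box:

```python
import numpy as np

def crown_lower_bound(W1, b1, W2, b2, lx, ux):
    """Sound lower bound on each output of W2 @ relu(W1 @ x + b1) + b2
    for x in the box [lx, ux], via one backward CROWN-style pass."""
    # Pre-activation bounds from interval arithmetic.
    W1p, W1n = np.maximum(W1, 0), np.minimum(W1, 0)
    l = W1p @ lx + W1n @ ux + b1
    u = W1p @ ux + W1n @ lx + b1
    # Linear relaxation of each ReLU neuron over [l, u]:
    #   upper line: su * z + tu  (the chord u/(u-l) * (z - l) when unstable)
    #   lower line: sl * z       (adaptive slope in {0, 1} for unstable neurons)
    unstable = (l < 0) & (u > 0)
    su = np.where(u <= 0, 0.0, np.where(l >= 0, 1.0, u / np.maximum(u - l, 1e-12)))
    tu = np.where(unstable, -su * l, 0.0)
    sl = np.where(u <= 0, 0.0, np.where(l >= 0, 1.0, (u >= -l).astype(float)))
    # Backward pass: positive coefficients take the lower line, negative the upper.
    Ap, An = np.maximum(W2, 0), np.minimum(W2, 0)
    A = Ap * sl + An * su           # coefficients w.r.t. the pre-activations
    bias = An @ tu + b2
    # Push through the first linear layer, then concretize over the input box.
    Ax = A @ W1
    bias = bias + A @ b1
    return np.maximum(Ax, 0) @ lx + np.minimum(Ax, 0) @ ux + bias
```

The returned value never exceeds the true minimum; α-CROWN additionally optimizes the lower slope sl per neuron by gradient ascent instead of the fixed adaptive choice above.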

  • The "convex relaxation barrier" (Salman et al., NeurIPS 2019) paper concludes that optimizing the ReLU relaxation allows CROWN (referred to as a "greedy" primal space solver) to achieve the same solution as linear programming (LP) based verifiers.

  • LiRPA (Xu et al., NeurIPS 2020) generalizes CROWN to arbitrary computational graphs; we also provide an efficient GPU implementation, the auto_LiRPA library.

  • α-CROWN (sometimes referred to as optimized CROWN or optimized LiRPA) is used in the Fast-and-Complete verifier (Xu et al., ICLR 2021), which jointly optimizes intermediate layer bounds and final layer bounds in CROWN via variable α. α-CROWN typically has greater power than LP since LP cannot cheaply tighten intermediate layer bounds.

  • β-CROWN (Wang et al., NeurIPS 2021) incorporates split constraints in branch and bound (BaB) into the CROWN bound propagation procedure via an additional optimizable parameter β. The combination of efficient and GPU accelerated bound propagation with branch and bound produces a powerful and scalable neural network verifier.
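The branch-and-bound loop itself is simple. The toy sketch below uses plain interval bounds and input-domain splitting (the strategy used for low input-dimension models), rather than β-CROWN's per-neuron ReLU splitting and optimized bounds; all names are illustrative. It certifies that a two-layer ReLU network stays positive on a box, or finds a counterexample:

```python
import heapq
import numpy as np

def interval_lower(W1, b1, W2, b2, lx, ux):
    # Interval (IBP) lower bound of the scalar output over the box [lx, ux].
    W1p, W1n = np.maximum(W1, 0), np.minimum(W1, 0)
    l1 = np.maximum(W1p @ lx + W1n @ ux + b1, 0.0)  # ReLU of pre-activation bounds
    u1 = np.maximum(W1p @ ux + W1n @ lx + b1, 0.0)
    W2p, W2n = np.maximum(W2, 0), np.minimum(W2, 0)
    return (W2p @ l1 + W2n @ u1 + b2)[0]

def bab_verify(W1, b1, W2, b2, lx, ux, max_domains=10_000):
    """True  -> f(x) > 0 on the whole box (verified),
       False -> a concrete counterexample exists (falsified),
       None  -> undecided within the domain budget."""
    f = lambda x: (W2 @ np.maximum(W1 @ x + b1, 0.0) + b2)[0]
    heap = [(interval_lower(W1, b1, W2, b2, lx, ux), 0, lx, ux)]
    n = 1
    while n < max_domains:
        lb, _, lo, hi = heapq.heappop(heap)
        if lb > 0:
            return True            # worst remaining domain certified -> all certified
        mid = (lo + hi) / 2.0
        if f(mid) <= 0:
            return False           # branching found a concrete counterexample
        d = int(np.argmax(hi - lo))     # split the widest input dimension
        hi_left, lo_right = hi.copy(), lo.copy()
        hi_left[d] = lo_right[d] = mid[d]
        for clo, chi in ((lo, hi_left), (lo_right, hi)):
            heapq.heappush(heap, (interval_lower(W1, b1, W2, b2, clo, chi), n, clo, chi))
            n += 1
    return None
```

Splitting shrinks each subdomain's bound gap, so domains are certified and discarded one by one; β-CROWN replaces the interval bound with a much tighter, GPU-batched bound that also encodes the split constraints.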

  • BaB-Attack (Zhang et al., ICML 2022) is a strong falsifier (adversarial attack) based on branch and bound, which can find adversarial examples for hard instances where gradient or input-space-search based methods cannot succeed.

  • GCP-CROWN (Zhang et al., NeurIPS 2022) enables the use of general cutting planes methods for neural network verification in a GPU-accelerated and very efficient bound propagation framework. Cutting planes can significantly strengthen bound tightness.

We provide bibtex entries below:

@article{zhang2018efficient,
  title={Efficient Neural Network Robustness Certification with General Activation Functions},
  author={Zhang, Huan and Weng, Tsui-Wei and Chen, Pin-Yu and Hsieh, Cho-Jui and Daniel, Luca},
  journal={Advances in Neural Information Processing Systems},
  volume={31},
  pages={4939--4948},
  year={2018},
  url={https://arxiv.org/pdf/1811.00866.pdf}
}

@article{xu2020automatic,
  title={Automatic perturbation analysis for scalable certified robustness and beyond},
  author={Xu, Kaidi and Shi, Zhouxing and Zhang, Huan and Wang, Yihan and Chang, Kai-Wei and Huang, Minlie and Kailkhura, Bhavya and Lin, Xue and Hsieh, Cho-Jui},
  journal={Advances in Neural Information Processing Systems},
  volume={33},
  year={2020}
}

@article{salman2019convex,
  title={A Convex Relaxation Barrier to Tight Robustness Verification of Neural Networks},
  author={Salman, Hadi and Yang, Greg and Zhang, Huan and Hsieh, Cho-Jui and Zhang, Pengchuan},
  journal={Advances in Neural Information Processing Systems},
  volume={32},
  pages={9835--9846},
  year={2019}
}

@inproceedings{xu2021fast,
  title={{Fast and Complete}: Enabling Complete Neural Network Verification with Rapid and Massively Parallel Incomplete Verifiers},
  author={Kaidi Xu and Huan Zhang and Shiqi Wang and Yihan Wang and Suman Jana and Xue Lin and Cho-Jui Hsieh},
  booktitle={International Conference on Learning Representations},
  year={2021},
  url={https://openreview.net/forum?id=nVZtXBI6LNn}
}

@article{wang2021beta,
  title={{Beta-CROWN}: Efficient bound propagation with per-neuron split constraints for complete and incomplete neural network verification},
  author={Wang, Shiqi and Zhang, Huan and Xu, Kaidi and Lin, Xue and Jana, Suman and Hsieh, Cho-Jui and Kolter, J Zico},
  journal={Advances in Neural Information Processing Systems},
  volume={34},
  year={2021}
}

@inproceedings{zhang22babattack,
  title={A Branch and Bound Framework for Stronger Adversarial Attacks of {R}e{LU} Networks},
  author={Zhang, Huan and Wang, Shiqi and Xu, Kaidi and Wang, Yihan and Jana, Suman and Hsieh, Cho-Jui and Kolter, Zico},
  booktitle={Proceedings of the 39th International Conference on Machine Learning},
  volume={162},
  pages={26591--26604},
  year={2022}
}

@article{zhang2022general,
  title={General Cutting Planes for Bound-Propagation-Based Neural Network Verification},
  author={Zhang, Huan and Wang, Shiqi and Xu, Kaidi and Li, Linyi and Li, Bo and Jana, Suman and Hsieh, Cho-Jui and Kolter, J Zico},
  journal={Advances in Neural Information Processing Systems},
  year={2022}
}

Developers and Copyright

The α,β-CROWN verifier is developed by a team from CMU, UCLA, Drexel University, Columbia University, and UIUC.

Our library is released under the BSD 3-Clause license. A copy of the license is included here.
