ansonb / feta_tmlr

This repository reproduces the results in the paper "How Expressive are Transformers in Spectral Domain for Graphs?" (published in TMLR).

License: MIT License

Topics: graph-representation-learning, graph-transformer, spectral-graph-theory


How Expressive are Transformers in Spectral Domain for Graphs?

This repository implements the approach outlined in the paper "How Expressive are Transformers in Spectral Domain for Graphs?". Note: this is the code that was submitted for review. To keep the submission anonymous, the code (variable names, function names, comments, etc.) may not be very readable. We are in the process of cleaning the code and will push the changes once that is done. Please write back in case of any queries.

Short Description about FeTA

Figure from paper

FeTA is a module designed for learning transformations on the spectral components of graph-structured data.
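As a rough illustration of the idea, here is a minimal sketch in recent PyTorch (not the repository's actual implementation; the names SpectralFilter and filter_mlp are hypothetical): node features are projected onto the eigenbasis of the normalized graph Laplacian, each frequency is rescaled by a learned response, and the result is projected back.

import torch
import torch.nn as nn

class SpectralFilter(nn.Module):
    """Toy spectral module: learns a per-eigenvalue response g(lambda)
    and applies U g(Lambda) U^T x to the node features x."""
    def __init__(self, in_dim, hidden_dim):
        super().__init__()
        # Hypothetical MLP mapping each eigenvalue to a scalar gain.
        self.filter_mlp = nn.Sequential(
            nn.Linear(1, hidden_dim), nn.ReLU(), nn.Linear(hidden_dim, 1))
        self.lin = nn.Linear(in_dim, in_dim)

    def forward(self, adj, x):
        # Normalized Laplacian: L = I - D^{-1/2} A D^{-1/2}
        deg = adj.sum(-1).clamp(min=1.0)
        d = deg.pow(-0.5)
        lap = torch.eye(adj.size(0)) - d[:, None] * adj * d[None, :]
        evals, evecs = torch.linalg.eigh(lap)              # spectral decomposition
        gains = self.filter_mlp(evals.unsqueeze(-1)).squeeze(-1)
        x_spec = evecs.t() @ x                             # project into spectral basis
        x_filt = evecs @ (gains.unsqueeze(-1) * x_spec)    # apply learned filter, project back
        return self.lin(x_filt)

# Example: a 3-node path graph with 8-dimensional node features.
adj = torch.tensor([[0., 1., 0.], [1., 0., 1.], [0., 1., 0.]])
x = torch.randn(3, 8)
out = SpectralFilter(in_dim=8, hidden_dim=16)(adj, x)     # shape (3, 8)

Note that the ChebConvDynamic variant used in the training commands below presumably approximates such filters with Chebyshev polynomials rather than an explicit eigendecomposition; see the paper for the exact formulation.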

Installation

Environment:

numpy=1.18.1
scipy=1.3.2
Cython=0.29.23
scikit-learn=0.22.1
matplotlib=3.4
networkx=2.5
python=3.7
pytorch=1.6
torch-geometric=1.7
rdkit-pypi=2021.03.3
dgl==0.6.1
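One possible (untested) way to materialize these pins is a conda environment with the pip-only packages installed separately; exact channels and wheels may differ per platform, and torch-geometric 1.7 additionally requires the torch-scatter/torch-sparse wheels matching your PyTorch and CUDA versions:

conda create -n feta python=3.7 numpy=1.18.1 scipy=1.3.2 Cython=0.29.23 scikit-learn=0.22.1 matplotlib=3.4 networkx=2.5
conda activate feta
pip install torch==1.6.0 torch-geometric==1.7.0 rdkit-pypi==2021.03.3 dgl==0.6.1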

To begin with, run:

cd FeTA
. s_env

To install GCKN, you also need to run:

make

You also need to create a cache folder to store the computed positional encodings:

mkdir -p cache/pe

Training FeTA

To train FeTA on MUTAG, run:

python -m experiments.run_transformer_gengcn_cv --gnn_type='ChebConvDynamic' --seed=0 --dataset='MUTAG'

To train FeTA on NCI1, run:

python -m experiments.run_transformer_gckn_gengcn_cv --gnn_type='ChebConvDynamic' --seed=0 --dataset='NCI1'

To train FeTA on ZINC, run:

python -m experiments.run_transformer_gckn_gengcn --gnn_type='ChebConvDynamic' --seed=0 --dataset='ZINC'

To train FeTA on MolHIV, run:

python -m experiments.run_transformer_gengcn_molhiv --gnn_type='ChebConvDynamic' --batch-size=1024 --epochs=10

To train FeTA on PATTERN, run:

python -m experiments.run_transformer_gengcn_SBM_cv --gnn_type='ChebConvDynamic' --batch-size=64 --epochs=100 --dataset='PATTERN'

To train FeTA on CLUSTER, run:

python -m experiments.run_transformer_gengcn_SBM_cv --gnn_type='ChebConvDynamic' --batch-size=64 --epochs=100 --dataset='CLUSTER'

Scripts to run the optimized configurations of FeTA (along with the positional encodings) are provided in the scripts folder, under the respective dataset.

Here, --fold-idx can be varied from 1 to 10 to train on the specified training fold; the sketch below loops over all ten folds. To test a selected model, just add the --test flag (this will use the complete train + val data).
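For example, a simple shell loop (hypothetical, but built from the MUTAG command above) runs every fold of the cross-validation:

for fold in $(seq 1 10); do
  python -m experiments.run_transformer_gengcn_cv --gnn_type='ChebConvDynamic' --seed=0 --dataset='MUTAG' --fold-idx "$fold"
done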

Configurations from the GraphiT paper: to include the Laplacian positional encoding in the input node features, run:

python run_transformer_gengcn_cv.py --dataset NCI1 --gnn_type='ChebConvDynamic' --fold-idx 1 --pos-enc diffusion --beta 1.0 --lappe --lap-dim 8

To include GCKN path features into input node features, run:

python run_transformer_gckn_gengcn_cv.py --dataset NCI1 --gnn_type='ChebConvDynamic' --fold-idx 1 --pos-enc diffusion --beta 1.0 --gckn-path 5

Regression

To train FeTA on ZINC, run:

python run_transformer_gengcn.py --gnn_type='ChebConvDynamic' --pos-enc diffusion --beta 1.0

To include Laplacian positional encoding into input node features, run:

python run_transformer_gengcn.py --gnn_type='ChebConvDynamic' --pos-enc diffusion --beta 1.0 --lappe --lap-dim 8

To include GCKN path features into input node features, run:

python run_transformer_gckn_gengcn.py --gnn_type='ChebConvDynamic' --pos-enc diffusion --beta 1.0 --gckn-path 8

Citation

If you use our work, kindly consider citing:

@article{bastos2022how,
  title={How Expressive are Transformers in Spectral Domain for Graphs?},
  author={Anson Bastos and Abhishek Nadgeri and Kuldeep Singh and Hiroki Kanezashi and Toyotaro Suzumura and Isaiah Onando Mulang'},
  journal={Transactions on Machine Learning Research},
  year={2022},
  url={https://openreview.net/forum?id=aRsLetumx1},
}

Acknowledgements

The code has been adapted from the Graph Transformer, GraphiT, and SAN repositories.

