This project forked from ardaaras99/survey-for-gnns-in-nlp


An extensive survey on the Graph Neural Networks and their applications in Natural Language Processing. Methods of the papers will be given as a short description under the links of papers.


Main Papers of GNNs in the NLP domain

Contributed by Arda Can Aras.

Preliminary Definitions

  • Transductive: You have a single graph (like Cora) and you split some nodes (not graphs) into train/val/test sets. During training you use only the labels of the training nodes. However, during forward propagation, by the nature of how spatial GNNs work, you aggregate feature vectors from your neighbors, and some of them may belong to the val or even test sets! The main point is that you ARE NOT using their label information, but you ARE using the structural information and their features.

  • Inductive: You have a set of training graphs, a separate set of val graphs, and of course a separate set of test graphs. Generally, the aim is to learn generalizable transformations of embeddings instead of direct matrix multiplications as in GCNConv.

  • Heterogeneous Graph: Graphs that can be directed or can include different types of nodes, as in TextGCN (document and word nodes).
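The transductive setting above can be sketched with node masks on a single graph. A minimal NumPy sketch (the 5-node graph, features, and split below are made up for illustration):

```python
import numpy as np

# Toy single graph: 5 nodes, 3-dim features, symmetric adjacency.
A = np.array([[0, 1, 1, 0, 0],
              [1, 0, 1, 0, 0],
              [1, 1, 0, 1, 0],
              [0, 0, 1, 0, 1],
              [0, 0, 0, 1, 0]], dtype=float)
X = np.arange(15, dtype=float).reshape(5, 3)
train_mask = np.array([True, True, False, False, False])  # nodes, not graphs

# Mean aggregation over neighbors uses ALL nodes' features --
# structure and features of val/test nodes leak in by design...
deg = A.sum(axis=1, keepdims=True)
H = (A @ X) / deg

# ...but a training loss would only ever read the train rows,
# so no val/test LABELS are used.
H_train = H[train_mask]
```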

Note: For more comprehensive and general GNNs implementation in several different fields, please check this GitHub repo.

1. Survey
2. Models
2.1 Basic Models
2.2 Graph Types
2.3 Pooling Methods
2.4 Analysis
2.5 Efficiency
2.6 Explainability
3. Natural Language Processing
3.1 Text Classification
3.2 Question Answering
  1. Graph Neural Networks for Natural Language Processing: A Survey. paper

    Lingfei Wu, Yu Chen, Kai Shen, Xiaojie Guo, Hanning Gao, Shucheng Li, Jian Pei, Bo Long

The following models are not inherently designed to tackle problems in the NLP domain. However, they are easily applicable, and their implementations are available on the PyG (PyTorch Geometric) website.

  1. Semi-Supervised Classification with Graph Convolutional Networks Thomas N. Kipf, Max Welling. paper (GCNConv)

    • Short Model Description (SMD): Basic aggregation of neighbor-node features in a transductive manner. Each layer reduces to a simple matrix multiplication with the symmetrically normalized adjacency matrix.
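The "simple matrix multiplication" reading above is the paper's propagation rule H' = D̂^(-1/2) Â D̂^(-1/2) H W (before the nonlinearity). A minimal NumPy sketch with an illustrative toy graph and random weights:

```python
import numpy as np

rng = np.random.default_rng(0)

A = np.array([[0, 1, 0],
              [1, 0, 1],
              [0, 1, 0]], dtype=float)  # toy adjacency
X = rng.normal(size=(3, 4))             # node features
W = rng.normal(size=(4, 2))             # layer weights

A_hat = A + np.eye(3)                   # add self-loops
D_inv_sqrt = np.diag(A_hat.sum(axis=1) ** -0.5)

S = D_inv_sqrt @ A_hat @ D_inv_sqrt     # symmetric normalized adjacency
H = S @ X @ W                           # one GCN layer: a chain of matmuls
```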

  2. Convolutional Neural Networks on Graphs with Fast Localized Spectral Filtering Michaël Defferrard, Xavier Bresson, Pierre Vandergheynst. paper (ChebConv)

    • SMD: Spectral graph convolution approximated with K-th order Chebyshev polynomials of the graph Laplacian, avoiding an explicit eigendecomposition; the resulting filters are exactly K-localized.

  3. Inductive Representation Learning on Large Graphs William L. Hamilton, Rex Ying, Jure Leskovec. paper

    • SMD: Learns generalizable aggregation functions in an inductive manner. Three main aggregators are evaluated in the paper (mean, pooling, LSTM).
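The mean-aggregator variant can be sketched as: average the neighbor features, concatenate with the node's own features, and apply a shared weight matrix (a NumPy sketch; the toy graph and weights are illustrative):

```python
import numpy as np

rng = np.random.default_rng(1)
A = np.array([[0, 1, 1],
              [1, 0, 0],
              [1, 0, 0]], dtype=float)
X = rng.normal(size=(3, 4))
W = rng.normal(size=(8, 2))   # acts on [self || neighbor-mean]

# Mean-aggregate neighbor features for every node at once.
neigh_mean = (A @ X) / A.sum(axis=1, keepdims=True)

# Concatenate self and aggregated representations, then transform.
# Because W is shared across nodes, the same update applies to
# unseen graphs -- the inductive part.
H = np.concatenate([X, neigh_mean], axis=1) @ W
```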

  4. Weisfeiler and Leman Go Neural: Higher-order Graph Neural Networks Christopher Morris, Martin Ritzert, Matthias Fey, William L. Hamilton, Jan Eric Lenssen, Gaurav Rattan, Martin Grohe. paper

    • SMD: Generalizes message passing to higher-order GNNs operating on k-tuples of nodes, inspired by the k-dimensional Weisfeiler-Leman graph isomorphism test; also shows that standard GNNs are at most as powerful as the 1-WL test.

  5. Gated Graph Sequence Neural Networks Yujia Li, Daniel Tarlow, Marc Brockschmidt, Richard Zemel paper

    • SMD: Recurrent message passing in which node states are updated with a GRU over incoming messages; extended to produce output sequences (graph-to-sequence) rather than a single prediction.

  6. Residual Gated Graph ConvNets Xavier Bresson, Thomas Laurent paper

    • SMD: Graph ConvNets that combine edge gating with residual connections; the gates learn how much each neighbor contributes to a node's update.

  7. Graph Attention Networks Petar Veličković, Guillem Cucurull, Arantxa Casanova, Adriana Romero, Pietro Liò, Yoshua Bengio paper

    • SMD: Aggregates neighbors with coefficients produced by a shared self-attention mechanism over node pairs; multi-head attention stabilizes training. No spectral machinery is needed, so the model also applies to inductive settings.
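GAT's attention coefficients for one node follow e_ij = LeakyReLU(aᵀ[Wh_i ‖ Wh_j]) with a softmax over the neighborhood. A NumPy sketch (dimensions, weights, and the neighborhood are illustrative):

```python
import numpy as np

rng = np.random.default_rng(2)
X = rng.normal(size=(4, 3))      # 4 nodes, 3-dim features
W = rng.normal(size=(3, 2))      # shared linear transform
a = rng.normal(size=(4,))        # attention vector over [Wh_i || Wh_j]

H = X @ W
i, neighbors = 0, [1, 2, 3]      # attend from node 0 over its neighbors

# Unnormalized scores e_ij = LeakyReLU(a^T [Wh_i || Wh_j]).
e = np.array([np.concatenate([H[i], H[j]]) @ a for j in neighbors])
e = np.where(e > 0, e, 0.2 * e)  # LeakyReLU, slope 0.2 as in the paper

# Softmax over the neighborhood gives the attention coefficients.
alpha = np.exp(e) / np.exp(e).sum()

# Node 0's new representation: attention-weighted sum of neighbors.
h_new = alpha @ H[neighbors]
```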

  8. How Attentive are Graph Attention Networks? Shaked Brody, Uri Alon, Eran Yahav paper

    • SMD: Shows that the original GAT computes only static attention (the ranking of neighbors is independent of the query node) and proposes GATv2, which reorders the operations to obtain strictly more expressive dynamic attention.

  9. Masked Label Prediction: Unified Message Passing Model for Semi-Supervised Classification Yunsheng Shi, Zhengjie Huang, Shikun Feng, Hui Zhong, Wenjin Wang, Yu Sun paper

    • SMD: Unified message passing (UniMP) model that propagates both node features and label embeddings with a graph transformer; a fraction of the training labels is masked and predicted during training to avoid label leakage.

  10. Attention-based Graph Neural Network for Semi-supervised Learning Kiran K. Thekumparampil, Chong Wang, Sewoong Oh, Li-Jia Li paper

    • SMD: Removes the fully-parameterized intermediate layers of a GCN and replaces propagation with a simple attention mechanism based on cosine similarity between neighboring node features, using far fewer parameters.


Datasets & Benchmarks

  1. Text Classification
  2. Text Classification on 20NEWS
  3. Text Classification on R8
  4. Text Classification on MR
  5. Text Classification on R52
  6. Text Classification on Ohsumed

Papers

The following papers are from the GNN field only. Note, however, that on some of these datasets GNNs are outperformed by other algorithms; combining GNNs with those algorithms can yield stronger results.

  1. Graph Convolutional Networks for Text Classification Liang Yao, Chengsheng Mao, Yuan Luo. paper (TextGCN)

    • SMD: GCN where the adjacency-matrix entries are predefined with TF-IDF scores (document-word edges) and PPMI scores (word-word edges). No solution is provided for iterative updates with newly incoming data. Transductive approach.
    • Future Work:
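The TextGCN edge weighting above can be sketched on a tiny corpus: TF-IDF weights for document-word edges and PPMI for word-word edges (NumPy sketch; the corpus is made up, and whole documents stand in for the paper's sliding windows):

```python
import numpy as np

docs = [["graph", "neural", "network"],
        ["neural", "network", "text"],
        ["graph", "text"]]
vocab = sorted({w for d in docs for w in d})
n_docs, n_words = len(docs), len(vocab)
idx = {w: k for k, w in enumerate(vocab)}

# Document-word edges: TF-IDF.
tf = np.zeros((n_docs, n_words))
for d, doc in enumerate(docs):
    for w in doc:
        tf[d, idx[w]] += 1 / len(doc)
df = (tf > 0).sum(axis=0)              # document frequency per word
tfidf = tf * np.log(n_docs / df)

# Word-word edges: PPMI from co-occurrence counts.
co = np.zeros((n_words, n_words))
for doc in docs:
    for w1 in doc:
        for w2 in doc:
            if w1 != w2:
                co[idx[w1], idx[w2]] += 1
p_ij = co / co.sum()
p_i = co.sum(axis=1) / co.sum()
with np.errstate(divide="ignore"):
    pmi = np.log(p_ij / np.outer(p_i, p_i))
ppmi = np.maximum(pmi, 0.0)            # PPMI: clip negative PMI to zero
```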

  2. BertGCN: Transductive Text Classification by Combining GCN and BERT Yuxiao Lin, Yuxian Meng, Xiaofei Sun, Qinghong Han, Kun Kuang, Jiwei Li, Fei Wu. paper (BertGCN,RobertaGCN)

    • SMD: BertGCN constructs a heterogeneous graph over the dataset and initializes document nodes with BERT representations. It jointly trains the BERT and GCN modules, learning representations for both the training data and the unlabeled test data. Finally, the BERT and GCN predictions are interpolated with a trainable trade-off coefficient to obtain the final representation. Transductive approach.
    • Future Work: In this work, they only use document statistics to build the graph, which might be sub-optimal compared to models that are able to automatically construct edges between nodes; the authors leave this to future work.
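The interpolation step can be sketched as Z = λ·Z_GCN + (1−λ)·Z_BERT over the two modules' class probabilities (NumPy sketch; the predictions and λ below are made-up numbers, not real model outputs):

```python
import numpy as np

# Class probabilities from each module for 2 documents, 3 classes.
z_gcn = np.array([[0.7, 0.2, 0.1],
                  [0.1, 0.8, 0.1]])
z_bert = np.array([[0.5, 0.3, 0.2],
                   [0.2, 0.6, 0.2]])

lam = 0.7                        # trade-off coefficient (trainable in BertGCN)
z = lam * z_gcn + (1 - lam) * z_bert

# Rows remain valid probability distributions (up to float rounding),
# since z is a convex combination of two distributions.
```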

  3. Every Document Owns Its Structure: Inductive Text Classification via Graph Neural Networks Yufeng Zhang, Xueli Yu, Zeyu Cui, Shu Wu, Zhongzhen Wen, Liang Wang. paper (TextING)

    • SMD: Builds an individual graph for every document from word co-occurrences within a sliding window, then updates word nodes with a gated GNN and pools them into a document representation. Because each document owns its graph, new documents (and new words) can be handled inductively.
    • Future Work:

  4. Simple Spectral Graph Convolution Hao Zhu, Piotr Koniusz. paper (SSGC)

    • SMD: Spectral variant of SGC that averages the signals propagated over 1..K steps (a Markov diffusion kernel) instead of using only the K-th power, balancing local and distant neighborhood information with no extra trainable propagation parameters.
    • Future Work:
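The averaged propagation can be sketched as (1/K) Σ_{k=1..K} S^k X, where S is the normalized adjacency with self-loops (NumPy sketch; the toy graph is illustrative, and the paper's α/residual term is omitted):

```python
import numpy as np

A = np.array([[0, 1, 0],
              [1, 0, 1],
              [0, 1, 0]], dtype=float)
X = np.eye(3)                           # identity features for clarity

A_hat = A + np.eye(3)                   # self-loops
D_inv_sqrt = np.diag(A_hat.sum(axis=1) ** -0.5)
S = D_inv_sqrt @ A_hat @ D_inv_sqrt     # normalized adjacency

# SSGC-style propagation: average the K propagated signals
# instead of keeping only the K-th power as SGC does.
K = 3
Sk_X = X.copy()
acc = np.zeros_like(X)
for _ in range(K):
    Sk_X = S @ Sk_X                     # S^k X, computed incrementally
    acc += Sk_X
X_ssgc = acc / K
```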

