Minimizing the Bag-of-Ngrams Difference for Non-Autoregressive Neural Machine Translation

PyTorch implementation of the models described in the paper Minimizing the Bag-of-Ngrams Difference for Non-Autoregressive Neural Machine Translation.

Dependencies

Python

  • Python 3.6
  • PyTorch >= 0.4
  • Numpy
  • NLTK
  • torchtext 0.2.1
  • torchvision
  • revtok
  • multiset
  • ipdb

Related code

See dl4mt-nonauto (source of the preprocessed corpora used below) and RSI-NAT (usage of the Reinforce-NAT implementation described below).

Downloading Datasets

The original translation corpora can be downloaded from the official sites (IWSLT'16 En-De, WMT'16 En-Ro, WMT'14 En-De). We recommend downloading the preprocessed corpora released in dl4mt-nonauto. Set the correct data path in the data_path() function in data.py before running the code.
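For illustration, a hypothetical sketch of the kind of edit this requires is shown below. The real data_path() in data.py may take different arguments and return different values, and the dataset name used here is an assumption; adapt it to the actual function.

# Hypothetical illustration only -- the real data_path() in data.py may differ.
# The point is that the returned location must point at the downloaded corpora.
def data_path(dataset="wmt16-ende"):
    root = "/path/to/preprocessed/corpora"   # e.g. the dl4mt-nonauto release
    return "{}/{}/".format(root, dataset)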

BoN-Joint

Combine the BoN objective and the cross-entropy loss to train NAT from scratch. This process usually takes about 5 days.

$ sh joint_wmt.sh
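For intuition, the sketch below shows one way to combine a bag-of-2-grams objective with cross-entropy in PyTorch. It is a minimal illustration rather than the code behind joint_wmt.sh: logits, ref, and alpha are assumed names, a single sentence with equal output and reference lengths is assumed, and no normalization is applied.

from collections import Counter

import torch
import torch.nn.functional as F

def bon_l1_loss(probs, ref):
    """L1 distance between the expected bag-of-2-grams of the output
    distribution `probs` (T x V) and the reference bag-of-2-grams of `ref`."""
    T = probs.size(0)
    ref_bigrams = Counter((ref[i].item(), ref[i + 1].item()) for i in range(len(ref) - 1))
    match = probs.new_zeros(())
    for (a, b), c_ref in ref_bigrams.items():
        # Expected count of bigram (a, b): sum_t P(y_t = a) * P(y_{t+1} = b)
        c_model = (probs[:-1, a] * probs[1:, b]).sum()
        match = match + torch.clamp(c_model, max=float(c_ref))
    # Each bag holds (length - 1) bigrams in total, so the L1 distance reduces
    # to the two totals minus twice the matched mass.
    return (T - 1) + (len(ref) - 1) - 2 * match

def joint_loss(logits, ref, alpha=0.1):
    """Interpolate cross-entropy with the BoN objective (alpha is a hypothetical weight)."""
    probs = F.softmax(logits, dim=-1)
    return (1 - alpha) * F.cross_entropy(logits, ref) + alpha * bon_l1_loss(probs, ref)

Only bigrams that occur in the reference need to be enumerated, because min(C_model(g), C_ref(g)) is zero for any bigram absent from the reference; its full expected count then enters the L1 distance through the totals term.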

Take a checkpoint and train the length prediction model. This process usually takes about 1 day.

$ sh tune_wmt.sh

Decode the test set. This process usually takes about 20 seconds.

$ sh decode_wmt.sh

BoN-FT

First, train a NAT model using the cross-entropy loss. This process usually takes about 5 days.

$ sh mle_wmt.sh

Then, take a pre-trained checkpoint and finetune the NAT model using the BoN objective. This process usually takes about 3 hours.

$ sh bontune_wmt.sh

Take a finetuned checkpoint and train the length prediction model. This process usually takes about 1 day.

$ sh tune_wmt.sh

Decode the test set. This process usually takes about 20 seconds.

$ sh decode_wmt.sh

Reinforce-NAT

We also implement Reinforce-NAT (lines 1294-1390), which is described in the paper Retrieving Sequential Information for Non-Autoregressive Neural Machine Translation. See RSI-NAT for usage.
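As a rough orientation, the sketch below shows a generic REINFORCE-style sequence-level loss for a NAT output distribution, using sentence BLEU from NLTK as the reward. It is not the step-wise Reinforce-NAT estimator from the paper, and probs, ref_ids, and baseline are assumed names; see RSI-NAT for the actual implementation.

import torch
from nltk.translate.bleu_score import sentence_bleu, SmoothingFunction

def reinforce_loss(probs, ref_ids, baseline=0.0):
    """probs: (T, V) per-position output distribution; ref_ids: list of reference token ids."""
    dist = torch.distributions.Categorical(probs=probs)  # one categorical per position
    sample = dist.sample()                               # sampled output ids, shape (T,)
    reward = sentence_bleu([ref_ids], sample.tolist(),
                           smoothing_function=SmoothingFunction().method1)
    # Policy-gradient surrogate: -(reward - baseline) * sum_t log P(sampled id at position t)
    return -(reward - baseline) * dist.log_prob(sample).sum()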

Citation

If you find the resources in this repository useful, please consider citing:

@article{Shao:19,
  author  = {Chenze Shao and Jinchao Zhang and Yang Feng and Fandong Meng and Jie Zhou},
  title   = {Minimizing the Bag-of-Ngrams Difference for Non-Autoregressive Neural Machine Translation},
  journal = {arXiv preprint arXiv:1911.09320},
  year    = {2019},
}
