Bidirectional Encoder Representations from Transformers (BERT) has been a popular technique in NLP since Google open-sourced it in 2018. With minimal task-specific fine-tuning, researchers have surpassed multiple benchmarks by leveraging pre-trained models that can be applied with little extra engineering to produce state-of-the-art results.
For a detailed read, check out the article.
This is a simple example implementation of bert-as-service for sentence similarity.
- Install the required server and client. Note that the server requires Python >= 3.5 and TensorFlow >= 1.10, while the client can run independently of the server.
pip install bert-serving-server # server
pip install bert-serving-client # client, independent of `bert-serving-server`
- Start the BERT service. You will need to point `-model_dir` at a downloaded pre-trained BERT model; adjust the path and model name below to match your setup (a download sketch follows this step).
bert-serving-start -model_dir /tmp/english_L-12_H-768_A-12/ -num_worker=4
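If you do not already have a checkpoint on disk, here is a minimal sketch for fetching one in Python, assuming the uncased BERT-Base archive URL published in the google-research/bert repository; the URL, paths, and folder names are illustrative and should be checked against your environment.

```python
# Sketch: download and unzip a pre-trained BERT checkpoint.
# MODEL_URL is the uncased BERT-Base archive listed in the google-research/bert
# README; verify it is still live before relying on it.
import urllib.request
import zipfile

MODEL_URL = "https://storage.googleapis.com/bert_models/2018_10_18/uncased_L-12_H-768_A-12.zip"
ARCHIVE = "/tmp/uncased_L-12_H-768_A-12.zip"

urllib.request.urlretrieve(MODEL_URL, ARCHIVE)  # fetch the checkpoint archive
with zipfile.ZipFile(ARCHIVE) as zf:
    zf.extractall("/tmp/")  # extracts to /tmp/uncased_L-12_H-768_A-12/
```

Point `-model_dir` at whatever directory you actually extract to; the folder name in the start command above is just a placeholder.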
- Run your test script (a sketch of what `BERT_test.py` might contain follows below)
python BERT_test.py
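The walkthrough does not show `BERT_test.py` itself, so below is a minimal sketch of what it might contain, assuming the standard `bert_serving.client` API; the example sentences and the cosine-similarity helper are illustrative.

```python
# Sketch of a possible BERT_test.py: encode two sentences via the running
# bert-as-service server and compare them with cosine similarity.
import numpy as np
from bert_serving.client import BertClient

def cosine_similarity(a, b):
    """Cosine similarity between two 1-D vectors."""
    return float(np.dot(a, b) / (np.linalg.norm(a) * np.linalg.norm(b)))

bc = BertClient()  # connects to the server started above (localhost by default)

sentences = [
    "The weather is lovely today.",
    "It is sunny and pleasant outside.",
]
vecs = bc.encode(sentences)  # one fixed-length vector (768-d for BERT-Base) per sentence

print("similarity:", cosine_similarity(vecs[0], vecs[1]))
```

Because the service returns fixed-length sentence embeddings, any vector-space metric (cosine, Euclidean, dot product) can be used to score similarity; cosine is the usual choice because it ignores vector magnitude.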