- `python3`
- `pip3 install -r requirements.txt`
- Edit `tags.txt` according to your dataset.
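The exact contents of `tags.txt` depend on your dataset. Assuming one tag per line in a BIO scheme, a file consistent with the example output below (plus the `[CLS]`/`[SEP]` special labels BERT token-classification scripts commonly expect) might look like:

```
O
B-PER
I-PER
B-ORG
I-ORG
B-LOC
I-LOC
[CLS]
[SEP]
```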
- `docker build -t bert_ner .`
- `NV_GPU=1 nvidia-docker run -itd --rm --shm-size=32g --ulimit memlock=-1 -v ${PWD}:/app bert_ner bash`
- Train and evaluate:

```bash
# train, evaluating on the test set
python run_ner.py --data_dir=data/ --bert_model=bert-base-cased-pt-br --task_name=ner --output_dir=out_base --max_seq_length=128 --do_train --num_train_epochs 5 --do_eval --warmup_proportion=0.1 --eval_on=test

# same, evaluating on the default split
python run_ner.py --data_dir=data/ --bert_model=bert-base-cased-pt-br --task_name=ner --output_dir=out_base --max_seq_length=128 --do_train --num_train_epochs 5 --do_eval --warmup_proportion=0.1

# same, with an explicit train batch size of 8
python run_ner.py --data_dir=data/ --bert_model=bert-base-cased-pt-br --train_batch_size=8 --task_name=ner --output_dir=out_base --max_seq_length=128 --do_train --num_train_epochs 5 --do_eval --warmup_proportion=0.1
```
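The layout of the files in `--data_dir` is not shown here; assuming the usual CoNLL-style format for NER (one token and tag per line, blank line between sentences), a minimal parser sketch:

```python
def read_conll(lines):
    """Parse CoNLL-style lines into sentences of (word, tag) pairs."""
    sentences, current = [], []
    for line in lines:
        line = line.strip()
        if not line:                 # a blank line ends the current sentence
            if current:
                sentences.append(current)
                current = []
            continue
        parts = line.split()
        current.append((parts[0], parts[-1]))  # token, last column = tag
    if current:                      # flush a trailing sentence with no blank line
        sentences.append(current)
    return sentences

sample = ["Steve B-PER", "went O", "to O", "Paris B-LOC", ""]
print(read_conll(sample))
# [[('Steve', 'B-PER'), ('went', 'O'), ('to', 'O'), ('Paris', 'B-LOC')]]
```

This is only a sketch of the assumed format; check the loader in `run_ner.py` for the exact columns your copy expects.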
Download the BERT-BASE pretrained model from here.
**Note:** to use PyTorch in a notebook: `conda install pytorch torchvision cuda90 -c pytorch`
```python
from bert import Ner

model = Ner("out_base/")

output = model.predict("Steve went to Paris")

print(output)
```

Output:

```
[
    {
        "confidence": 0.9981840252876282,
        "tag": "B-PER",
        "word": "Steve"
    },
    {
        "confidence": 0.9998939037322998,
        "tag": "O",
        "word": "went"
    },
    {
        "confidence": 0.999891996383667,
        "tag": "O",
        "word": "to"
    },
    {
        "confidence": 0.9991968274116516,
        "tag": "B-LOC",
        "word": "Paris"
    }
]
```
Download the pretrained and converted bert-base model from here.
Download libtorch from here
- Install `cmake`; tested with cmake version 3.10.2.
- Unzip the downloaded model and `libtorch` in `BERT-NER`.
- Compile the C++ app:

```bash
cd cpp-app/
cmake -DCMAKE_PREFIX_PATH=../libtorch
make
```
- Run the app:

```bash
./app ../base
```
NB: The Bert-Base C++ model is split into two parts:
- a BERT feature extractor and an NER classifier.
- This is done because `jit trace` doesn't support input-dependent `for` loops or `if` conditions inside the `forward` function of the model.
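Why tracing forces this split can be seen with a toy illustration (plain Python, not TorchScript): a trace records the ops executed for one example input, not the control flow itself, so a loop whose length depends on the input gets frozen at the example's length.

```python
def forward(xs):
    # input-dependent loop: iterates once per element of xs
    total = 0
    for x in xs:
        total += x * 2
    return total

def toy_trace(fn, example):
    # "Tracing" runs fn once on the example input; only the ops executed
    # for that input are recorded, so the example's loop count is baked in.
    n_recorded = len(example)
    def traced(xs):
        # replays exactly n_recorded iterations, whatever xs actually is
        return sum(x * 2 for x in xs[:n_recorded])
    return traced

traced = toy_trace(forward, [1, 2, 3])   # traced with a length-3 example
print(forward([1, 2, 3, 4]))             # 20 -- eager run sees all 4 elements
print(traced([1, 2, 3, 4]))              # 12 -- 4th element silently ignored
```

In TorchScript, `torch.jit.script` can compile data-dependent control flow; splitting the model as above instead keeps each exported part free of such control flow so it can be traced.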
The BERT NER model can be deployed as a REST API:

```bash
python api.py
```

The API will be live at `0.0.0.0:8000`, endpoint `/predict`:

```bash
curl -X POST http://0.0.0.0:8000/predict -H 'Content-Type: application/json' -d '{ "text": "Steve went to Paris" }'
```
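The same request can be made from Python. A minimal sketch with the standard library, assuming the API started by `python api.py` is running on `0.0.0.0:8000`:

```python
import json
import urllib.request

# Build the same request the curl command sends.
payload = json.dumps({"text": "Steve went to Paris"}).encode("utf-8")
request = urllib.request.Request(
    "http://0.0.0.0:8000/predict",
    data=payload,
    headers={"Content-Type": "application/json"},
)

# Uncomment once the server is up:
# with urllib.request.urlopen(request) as response:
#     print(json.load(response)["result"])
```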
Output:

```json
{
    "result": [
        {
            "confidence": 0.9981840252876282,
            "tag": "B-PER",
            "word": "Steve"
        },
        {
            "confidence": 0.9998939037322998,
            "tag": "O",
            "word": "went"
        },
        {
            "confidence": 0.999891996383667,
            "tag": "O",
            "word": "to"
        },
        {
            "confidence": 0.9991968274116516,
            "tag": "B-LOC",
            "word": "Paris"
        }
    ]
}
```