DeepSpark/DeepSparkInference

加入 Gitee
与超过 1200万 开发者一起发现、参与优秀开源项目,私有仓库也完全免费 :)
免费加入
文件
克隆/下载
贡献代码
同步代码
取消
提示: 由于 Git 不支持空文件夾,创建文件夹后会生成空的 .keep 文件
Loading...
README

# BERT Base NER (IGIE)

## Model Description

BERT is designed to pre-train deep bidirectional representations from unlabeled text by jointly conditioning on both left and right context in all layers. As a result, the pre-trained BERT model can be fine-tuned with just one additional output layer to create state-of-the-art models for a wide range of tasks, such as question answering and language inference, without substantial task-specific architecture modifications.
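The "one additional output layer" for NER is just a per-token linear classifier over the encoder's hidden states. A minimal NumPy sketch of that idea, assuming BERT-base's hidden size of 768 and a hypothetical 7-label BIO tag set (not taken from this repo):

```python
import numpy as np

# Hidden size 768 matches bert-base-chinese; 7 labels (B-/I- for PER, LOC, ORG
# plus O) is an assumption about the people-daily tag set, not this repo's config.
rng = np.random.default_rng(0)
hidden_states = rng.standard_normal((8, 256, 768))  # batch 8, seq length 256 (the benchmark shape)

W = rng.standard_normal((768, 7)) * 0.02  # classifier weight
b = np.zeros(7)                           # classifier bias

logits = hidden_states @ W + b            # per-token tag logits
print(logits.shape)  # (8, 256, 7)
```

Fine-tuning only has to learn `W` and `b` on top of the pre-trained encoder, which is why no task-specific architecture changes are needed.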

## Supported Environments

| Iluvatar GPU | IXUCA SDK |
|--------------|-----------|
| MR-V100      | 4.2.0     |

## Model Preparation

### Prepare Resources

Pretrained model: https://huggingface.co/bert-base-chinese

Dataset: http://s3.bmio.net/kashgari/china-people-daily-ner-corpus.tar.gz
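The people-daily corpus is commonly distributed as one character and one BIO tag per line, with blank lines separating sentences (an assumed format; check the extracted files). A minimal parser sketch:

```python
def load_bio(lines):
    """Parse whitespace-separated char/tag pairs; blank lines end a sentence (assumed format)."""
    sentences, chars, tags = [], [], []
    for line in lines:
        line = line.rstrip("\n")
        if not line:  # sentence boundary
            if chars:
                sentences.append((chars, tags))
                chars, tags = [], []
            continue
        ch, tag = line.split()
        chars.append(ch)
        tags.append(tag)
    if chars:  # flush a trailing sentence with no final blank line
        sentences.append((chars, tags))
    return sentences

sample = ["海 B-LOC", "钓 O", "", "张 B-PER", "三 I-PER"]
print(load_bio(sample))  # [(['海', '钓'], ['B-LOC', 'O']), (['张', '三'], ['B-PER', 'I-PER'])]
```

In practice you would pass `open(path, encoding="utf-8")` instead of the in-memory sample.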

### Install Dependencies

```bash
pip3 install -r requirements.txt
```

### Model Conversion

```bash
export DATASETS_DIR=/Path/to/china-people-daily-ner-corpus/

# Get the PyTorch weights
python3 get_weights.py

# Run QAT for the INT8 test; this will take a long time
cd Int8QAT/
python3 run_qat.py --model_dir ../test/ --datasets_dir ${DATASETS_DIR}
python3 export_hdf5.py --model quant_base/pytorch_model.bin
cd ../
```
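Quantization-aware training (QAT) works by inserting fake-quantization ops (quantize to INT8, immediately dequantize) into the forward pass, so the network learns to tolerate the rounding error before export. A simplified symmetric per-tensor sketch of that core op; the repo's actual QAT implementation may differ:

```python
import numpy as np

def fake_quant(x, num_bits=8):
    """Symmetric per-tensor fake quantization: round to the INT8 grid, then
    dequantize back to float. This is a sketch of the op QAT trains through,
    not the repo's implementation."""
    qmax = 2 ** (num_bits - 1) - 1                      # 127 for int8
    scale = np.max(np.abs(x)) / qmax or 1.0             # guard against all-zero input
    q = np.clip(np.round(x / scale), -qmax - 1, qmax)   # integer grid in [-128, 127]
    return q * scale                                    # dequantized values

w = np.array([0.5, -1.27, 0.003])
print(fake_quant(w))  # values snapped to multiples of 1.27/127 = 0.01
```

Values far below the quantization step (here `0.003`) collapse to zero, which is exactly the error QAT lets the network adapt to.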

## Model Inference

### INT8

```bash
# Accuracy
bash scripts/infer_bert_base_ner_int8_accuracy.sh
# Performance
bash scripts/infer_bert_base_ner_int8_performance.sh
```
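NER accuracy is typically scored at the entity level: a predicted entity counts as correct only if both its type and its exact span match the gold annotation. A self-contained sketch of that metric (the accuracy script's exact scoring may differ):

```python
def bio_entities(tags):
    """Extract (type, start, end) spans from a BIO sequence; stray I- tags are ignored."""
    spans, start, etype = [], None, None
    for i, t in enumerate(list(tags) + ["O"]):  # sentinel "O" closes a trailing entity
        if start is not None and (t == "O" or t.startswith("B-") or t[2:] != etype):
            spans.append((etype, start, i))
            start, etype = None, None
        if t.startswith("B-"):
            start, etype = i, t[2:]
    return set(spans)

def entity_f1(gold, pred):
    """F1 over exact-match entity spans for one tag-sequence pair."""
    g, p = bio_entities(gold), bio_entities(pred)
    tp = len(g & p)
    prec = tp / len(p) if p else 0.0
    rec = tp / len(g) if g else 0.0
    return 2 * prec * rec / (prec + rec) if prec + rec else 0.0

gold = ["B-PER", "I-PER", "O", "B-LOC"]
pred = ["B-PER", "I-PER", "O", "O"]
print(round(entity_f1(gold, pred), 3))  # 0.667
```

A real evaluation accumulates true positives and span counts over the whole test set before computing precision and recall, rather than averaging per-sentence F1.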

## Model Results

| Model         | BatchSize | SeqLength | Precision | FPS      | F1 Score |
|---------------|-----------|-----------|-----------|----------|----------|
| BERT Base NER | 8         | 256       | INT8      | 2067.252 | 96.2     |
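Assuming FPS counts sequences per second (an assumption; the performance script may report it differently), the throughput above converts to per-batch latency as:

```python
fps = 2067.252        # sequences per second, from the results table
batch_size = 8        # matches the BatchSize column

latency_ms = batch_size / fps * 1000.0  # time to process one batch of 8 sequences
print(f"{latency_ms:.2f} ms per batch")  # 3.87 ms per batch
```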