English | 简体中文

Export model with ONNX format

After training a model with PaddleSeg, you can also export it in ONNX format. This tutorial walks through an example.

For the complete workflow of exporting ONNX format models, please refer to Paddle2ONNX.

1. Export the inference model

Refer to the model export document to export the inference model, and save it to the output folder, which contains the following files.

./output
  ├── deploy.yaml            # deployment-related profile
  ├── model.pdmodel          # topology file of inference model
  ├── model.pdiparams        # weight file of inference model
  └── model.pdiparams.info   # additional information; you can usually ignore this file

2. Export ONNX format model

Install Paddle2ONNX (version 0.6 or higher).

pip install paddle2onnx

Run the following command to convert the inference model in the output folder to an ONNX format model with Paddle2ONNX.

paddle2onnx --model_dir output \
            --model_filename model.pdmodel \
            --params_filename model.pdiparams \
            --opset_version 11 \
            --save_file output.onnx

The exported ONNX format model is saved as the output.onnx file.
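After exporting, it is a good idea to verify that the ONNX model loads and runs. The sketch below is a minimal, hypothetical check using ONNX Runtime (`pip install onnxruntime`); the dummy input shape (512x512 RGB) and the [0, 1] normalization are assumptions for illustration only, so query `sess.get_inputs()` and follow your model's deploy.yaml preprocessing for real data.

```python
import numpy as np


def preprocess(img):
    """Convert an HWC uint8 image to an NCHW float32 tensor in [0, 1].

    This is a simplified placeholder; real preprocessing should follow
    the transforms listed in the exported deploy.yaml.
    """
    x = img.astype("float32") / 255.0
    x = np.transpose(x, (2, 0, 1))   # HWC -> CHW
    return x[np.newaxis, ...]        # add batch dimension -> NCHW


if __name__ == "__main__":
    import onnxruntime as ort        # pip install onnxruntime

    # Load the model exported by paddle2onnx above.
    sess = ort.InferenceSession("output.onnx")
    input_name = sess.get_inputs()[0].name

    # Run a dummy 512x512 RGB image through the network (shape assumed).
    dummy = np.random.randint(0, 256, (512, 512, 3), dtype=np.uint8)
    outputs = sess.run(None, {input_name: preprocess(dummy)})
    print("output shape:", outputs[0].shape)
```

If the session builds and the run completes without errors, the exported model is structurally valid and ready for deployment with an ONNX-compatible runtime.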
