English | 简体中文
After training a model with PaddleSeg, you can also export it in ONNX format. This tutorial walks through an example.
For the complete guide to exporting ONNX format models, please refer to Paddle2ONNX.
Refer to the model export document to export the inference model, and save it to the output folder, as follows:
./output
├── deploy.yaml # deployment-related configuration file
├── model.pdmodel # topology file of the inference model
├── model.pdiparams # weight file of the inference model
└── model.pdiparams.info # additional information; this file can generally be ignored
Install Paddle2ONNX (version 0.6 or higher).
pip install paddle2onnx
Run the following command to convert the inference model in the output folder to an ONNX model with Paddle2ONNX.
paddle2onnx --model_dir output \
            --model_filename model.pdmodel \
            --params_filename model.pdiparams \
            --opset_version 11 \
            --save_file output.onnx
The exported ONNX model is saved to the file output.onnx.