
MindSpore Inference with C++ Deployment Guide

Overview

This tutorial covers inference deployment in edge scenarios, based on the MindIR model file exported by MindSpore.

Deployment is supported on Ascend 910, Ascend 310, Ascend 310P, NVIDIA GPU, and CPU.

Note: Since MindSpore 2.0, C++ inference must use MindSpore Lite. If MindSpore Lite cannot be used, please install MindSpore < 2.0.

Installation Guide

Note: MindSpore Lite requires Python 3.7. Please prepare a Python 3.7 environment before installing.

Install MindSpore

Please install MindSpore by referring to MindSpore Install.

Install MindSpore Lite

Refer to Lite install

  1. Download the supporting tar.gz and whl packages according to the environment.

  2. Extract the tar.gz package and install the corresponding version of the whl package.

tar -zxvf mindspore-lite-*.tar.gz
pip install mindspore_lite-*.whl

Configure environment variables

When running on Ascend, please set the Ascend environment variables. ASCEND_PATH is the Ascend installation path, which may be /usr/local/Ascend/toolkit or /usr/local/Ascend/latest.

source $ASCEND_PATH/bin/setenv.bash

LITE_HOME is the path of the folder extracted from the tar.gz package; an absolute path is recommended.

source $ASCEND_PATH/bin/setenv.bash
export LITE_HOME=/path/to/mindspore-lite-{version}-{os}-{platform}
export LD_LIBRARY_PATH=$LITE_HOME/runtime/lib:$LITE_HOME/tools/converter/lib:$LD_LIBRARY_PATH
export PATH=$LITE_HOME/tools/converter/converter:$LITE_HOME/tools/benchmark:$PATH

Note: MindSpore and MindSpore Lite must be the same version.

Inference process

A typical inference process includes:

  • Export MindIR
  • Data pre-processing (optional)
  • Inference model compilation and execution
  • Inference result post-processing

Please refer to resnet (data pre-processing in C++) and DBNet (data pre-processing in Python, saving to bin files).

Export MindIR

MindSpore provides a unified Intermediate Representation (IR) for cloud side (training) and end side (inference). Models can be saved as MindIR directly by using the export interface.

import mindspore as ms
from src.model_utils.config import config
from src.model_utils.env import init_env
# Environment initialization
init_env(config)
# Build the inference network (Net stands for the model definition of the corresponding model)
net = Net()
# Load model
ms.load_checkpoint("xxx.ckpt", net)
# Construct the input, only need to set the shape and type of the input
inp = ms.ops.ones((1, 3, 224, 224), ms.float32)
# Export model, file_format supports 'MINDIR', 'ONNX' and 'AIR'
ms.export(net, inp, file_name=config.file_name, file_format=config.file_format)
# When using multiple inputs
# inputs = [inp1, inp2, inp3]
# ms.export(net, *inputs, file_name=config.file_name, file_format=config.file_format)

Data pre-processing (optional)

Some data processing is difficult to implement in C++; in such cases, the pre-processed data can be saved as bin files first.

import os
from src.dataset import create_dataset
from src.model_utils.config import config

dataset = create_dataset(config, is_train=False)
it = dataset.create_dict_iterator(output_numpy=True)
input_dir = config.input_dir
for i, data in enumerate(it):
    input_name = "eval_input_" + str(i+1) + ".bin"
    input_path = os.path.join(input_dir, input_name)
    data['img'].tofile(input_path)

The input bin files are generated in the directory specified by config.input_dir.
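
On the C++ side, these bin files can be read back into a raw buffer before being copied into the model inputs. A minimal sketch, assuming the tensors were saved as float32 and their shape is known from the export step (ReadBinFile is a hypothetical helper, not part of this repository):

#include <fstream>
#include <string>
#include <vector>

// Hypothetical helper: read a pre-processed bin file (raw float32 data)
// back into memory. The file carries no shape information, so the caller
// must know the expected element count, e.g. 1 * 3 * 224 * 224.
std::vector<float> ReadBinFile(const std::string &path, size_t element_count) {
  std::vector<float> buffer(element_count);
  std::ifstream ifs(path, std::ios::binary);
  ifs.read(reinterpret_cast<char *>(buffer.data()),
           static_cast<std::streamsize>(element_count * sizeof(float)));
  return buffer;
}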

Inference model development

This involves the creation and compilation process of the C++ project. Generally, the directory structure of a C++ inference project is as follows:

└─cpp_infer
    ├─build.sh                # C++ compilation script
    ├─CMakeLists.txt          # CMake configuration file
    ├─main.cc                 # Model execution script
    └─common_inc              # Common header file

Generally, there is no need to modify build.sh, CMakeLists.txt, or common_inc; general versions are provided under the 'example' directory.

When developing a new model, you need to copy these files to the execution directory and write the 'main.cc' for the corresponding model.

Some models do not have 'common_inc' under the execution directory; in that case, copy it into the same directory as 'main.cc' before compiling.

main.cc generally includes (see the sketch below):

  • Model loading and construction
  • Dataset construction / bin file loading
  • Model Inference
  • Inference result saving

Please refer to MindSpore 310 infer and to models with inference already implemented, for example resnet C++ inference.
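
For reference, the following is a minimal sketch of such a main.cc based on the MindSpore Lite C++ API; the model path net.mindir is a placeholder, and a real main.cc would also fill the input tensors (e.g. from the bin files above) and save the outputs.

#include <iostream>
#include <memory>
#include <vector>
#include "include/api/context.h"
#include "include/api/model.h"
#include "include/api/status.h"
#include "include/api/types.h"

int main() {
  // Build the execution context; replace CPUDeviceInfo with
  // AscendDeviceInfo or GPUDeviceInfo for other backends.
  auto context = std::make_shared<mindspore::Context>();
  context->MutableDeviceInfo().push_back(
      std::make_shared<mindspore::CPUDeviceInfo>());

  // Load and compile the exported MindIR model.
  mindspore::Model model;
  if (model.Build("net.mindir", mindspore::ModelType::kMindIR, context) !=
      mindspore::kSuccess) {
    std::cerr << "Build model failed." << std::endl;
    return 1;
  }

  // Fill the model inputs here (e.g. with data read from bin files),
  // then run inference.
  std::vector<mindspore::MSTensor> inputs = model.GetInputs();
  std::vector<mindspore::MSTensor> outputs;
  if (model.Predict(inputs, &outputs) != mindspore::kSuccess) {
    std::cerr << "Predict failed." << std::endl;
    return 1;
  }

  // Post-process and save the output tensors here.
  std::cout << "Output tensor count: " << outputs.size() << std::endl;
  return 0;
}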

Inference model compilation and execution

Generally, we need to provide a run_infer_cpp.sh script that connects the whole inference process. For details, please refer to resnet.
