import com.mindspore.config.RunnerConfig;
ModelParallelRunner defines the MindSpore Lite concurrent inference interface.
function |
---|
`long getModelParallelRunnerPtr()` |
`boolean init(String modelPath, RunnerConfig runnerConfig)` |
`boolean predict(List<MSTensor> inputs, List<MSTensor> outputs)` |
`List<MSTensor> getInputs()` |
`List<MSTensor> getOutputs()` |
`void free()` |
public long getModelParallelRunnerPtr()
Get the underlying concurrent inference class pointer.
Returns
Low-level concurrent inference class pointer.
public boolean init(String modelPath, RunnerConfig runnerConfig)
Reads and loads the model from the given path, generates one or more model instances, and compiles them all to a state that can run on the device.
Parameters
modelPath
: model file path.
runnerConfig
: A RunnerConfig structure. Defines configuration parameters for the concurrent inference model.
Returns
Whether the initialization is successful.
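A minimal sketch of initializing a `ModelParallelRunner` with a `RunnerConfig`. The package paths, the `MSContext`/`DeviceType` helper classes, the `setWorkersNum` setter, and the `model.ms` file path are assumptions based on the MindSpore Lite Java API and may need adjustment for your version:

```java
import com.mindspore.ModelParallelRunner;
import com.mindspore.config.DeviceType;
import com.mindspore.config.MSContext;
import com.mindspore.config.RunnerConfig;

// Build a context that runs on CPU without FP16.
MSContext context = new MSContext();
context.init();
context.addDeviceInfo(DeviceType.DT_CPU, false);

// Wrap the context in a RunnerConfig and set the worker pool size.
RunnerConfig config = new RunnerConfig();
config.init(context);
config.setWorkersNum(2); // number of parallel model workers (assumed setter)

// Load and compile the model for concurrent inference.
ModelParallelRunner runner = new ModelParallelRunner();
if (!runner.init("model.ms", config)) {
    System.err.println("ModelParallelRunner init failed");
}
```

The `RunnerConfig` overload is preferable when you need to control the worker count or device placement; the single-argument `init(String modelPath)` below uses default settings.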
public boolean init(String modelPath)
Reads and loads the model from the given path, generates one or more model instances, and compiles them all to a state that can run on the device, using default configuration.
Parameters
modelPath
: model file path.
Returns
Whether the initialization is successful.
public boolean predict(List<MSTensor> inputs, List<MSTensor> outputs)
Runs inference on the model; multiple callers may invoke this concurrently.
Parameters
inputs
: model input.
outputs
: model output.
Returns
Whether the inference is successful.
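A sketch of a single inference call, assuming `runner` was initialized as above. The way input data is filled (`setData` with a `ByteBuffer`) is an assumption about the `MSTensor` API and may differ across versions:

```java
import java.nio.ByteBuffer;
import java.nio.ByteOrder;
import java.util.ArrayList;
import java.util.List;
import com.mindspore.MSTensor;

// Fetch the model's input tensors and fill them with data.
List<MSTensor> inputs = runner.getInputs();
for (MSTensor input : inputs) {
    ByteBuffer buf = ByteBuffer.allocateDirect((int) input.size())
                               .order(ByteOrder.nativeOrder());
    // ... write preprocessed input bytes into buf ...
    input.setData(buf); // assumed MSTensor setter
}

// predict() fills the output list on success.
List<MSTensor> outputs = new ArrayList<>();
if (!runner.predict(inputs, outputs)) {
    System.err.println("predict failed");
}
```

Because the runner manages a pool of model workers internally, each `predict` call can be issued from a separate thread without external locking.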
public List<MSTensor> getInputs()
Get all input tensors of the model.
Returns
A list of input tensors for the model.
public List<MSTensor> getOutputs()
Get all output tensors of the model.
Returns
A list of output tensors for the model.
public void free()
Free concurrent inference class memory.
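Putting the pieces together, a hedged sketch of the full lifecycle: initialize, infer, then release native memory. Calling `free()` on the tensors as well as the runner is an assumption based on `MSTensor` also wrapping native memory:

```java
import java.util.ArrayList;
import java.util.List;
import com.mindspore.MSTensor;
import com.mindspore.ModelParallelRunner;

ModelParallelRunner runner = new ModelParallelRunner();
try {
    if (!runner.init("model.ms")) {
        throw new IllegalStateException("init failed");
    }
    List<MSTensor> inputs = runner.getInputs();
    // ... fill input tensors ...
    List<MSTensor> outputs = new ArrayList<>();
    if (runner.predict(inputs, outputs)) {
        // ... read results from outputs ...
    }
    for (MSTensor t : outputs) {
        t.free(); // assumed: release per-tensor native memory
    }
} finally {
    runner.free(); // always release the runner's native resources
}
```

Since the class wraps native memory, `free()` must be called explicitly; relying on garbage collection alone will leak the underlying C++ objects.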