TFCC is a C++ deep learning inference framework.
TFCC provides the following toolkits to facilitate the development and deployment of your trained DL models:
Project | Source | Description |
---|---|---|
TFCC | ./tfcc | The core deep learning inference library. It provides friendly interfaces for model deployment, as well as implementations of diverse operations in both MKL and CUDA environments. |
TFCC Code Generator | ./tfcc_code_generator | An automatic generator that optimizes the structure of high-level models (TensorFlow, PyTorch, etc.) and generates the TFCC model. |
TFCC Runtime | ./tfcc_runtime | A runtime that loads a TFCC model and runs inference. |
## Run

```shell
./build.sh ${INSTALL_PREFIX_PATH}
```
## Convert Model

The script `generator.py` can convert an ONNX model or a TensorFlow model to a TFCC model. The docs *Convert ONNX Model* and *Convert TF Model* describe the details.
## Load Model

A model can be loaded with the following code:

```cpp
// Load the TFCC model file into a string.
std::string modelData = load_data_from_file(path);
tfcc::runtime::Model model(modelData);
```
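The helper `load_data_from_file` is left to the reader; a minimal sketch (this function is not part of TFCC — it simply reads the whole file in binary mode so the model bytes are not altered) could be:

```cpp
#include <fstream>
#include <sstream>
#include <stdexcept>
#include <string>

// Hypothetical helper, not a TFCC API: read an entire file into a
// std::string. Binary mode prevents newline translation on Windows.
std::string load_data_from_file(const std::string& path) {
    std::ifstream file(path, std::ios::binary);
    if (!file) {
        throw std::runtime_error("cannot open model file: " + path);
    }
    std::ostringstream buffer;
    buffer << file.rdbuf();
    return buffer.str();
}
```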
## Inference

Finally, run the model:

```cpp
tfcc::runtime::data::Inputs inputs;
tfcc::runtime::data::Outputs outputs;
// Set the inputs.
auto item = inputs.add_items();
item->set_name("The input name");
item->set_dtype(tfcc::runtime::common::FLOAT);
std::vector<float> data = {1.0f, 2.0f};
item->set_data(data.data(), data.size() * sizeof(float));
model.run(inputs, outputs);
```
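Note that `set_data` packs the floats as raw bytes. Going the other way — turning a raw byte buffer back into a float vector — is a generic operation, sketched below (this helper is illustrative and not tied to the TFCC output API):

```cpp
#include <cstring>
#include <string>
#include <vector>

// Generic sketch, not a TFCC API: reinterpret a raw byte buffer
// (the same layout produced by item->set_data above) as floats.
std::vector<float> unpack_floats(const std::string& raw) {
    std::vector<float> values(raw.size() / sizeof(float));
    // memcpy avoids aliasing/alignment issues of a pointer cast.
    std::memcpy(values.data(), raw.data(), values.size() * sizeof(float));
    return values;
}
```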