2.2.2. Importing A Model Using A Parser In C++

sisiy

« on: September 04, 2019, 03:03:46 pm »
To import a model using the C++ Parser API, you need to perform the following high-level steps:
1. Create the TensorRT builder and network.
Code: [Select]
IBuilder* builder = createInferBuilder(gLogger);
nvinfer1::INetworkDefinition* network = builder->createNetwork();

For an example of how to create the logger, see Instantiating TensorRT Objects in C++: https://docs.nvidia.com/deeplearning/sdk/tensorrt-developer-guide/index.html#initialize_library
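
The two lines in step 1 assume a global logger object named gLogger. A minimal sketch of such a logger, derived from nvinfer1::ILogger, is shown below; this is illustrative only, and the exact log() signature (for example, whether it is declared noexcept) differs between TensorRT versions.
Code: [Select]
#include <iostream>
#include "NvInfer.h"

// Minimal logger: print TensorRT warnings and errors, suppress info-level messages.
class Logger : public nvinfer1::ILogger
{
    void log(Severity severity, const char* msg) override
    {
        if (severity != Severity::kINFO)
            std::cout << msg << std::endl;
    }
} gLogger;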

2. Create the TensorRT parser for the specific format.
Code: [Select]
ONNX
auto parser = nvonnxparser::createParser(*network, gLogger);
UFF
auto parser = nvuffparser::createUffParser();
Caffe
auto parser = nvcaffeparser1::createCaffeParser();
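
The three snippets above assume the corresponding parser headers have been included. As a reference, in recent TensorRT releases the factory functions are declared in the following headers:
Code: [Select]
#include "NvOnnxParser.h"   // nvonnxparser::createParser
#include "NvUffParser.h"    // nvuffparser::createUffParser
#include "NvCaffeParser.h"  // nvcaffeparser1::createCaffeParser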


3. Use the parser to parse the imported model and populate the network.
Code: [Select]
parser->parse(args);

The specific args depend on which format parser is used. For more information, refer to the parsers documented in the TensorRT API.
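
As an illustration only, the sketch below shows what the parse call can look like for each of the three parsers created in step 2. The file names, input/output tensor names, and dimensions are placeholders, and the exact signatures should be verified against the TensorRT API documentation for the version in use.
Code: [Select]
// ONNX: parse the model directly from a file, reporting messages at WARNING level and above.
parser->parseFromFile("model.onnx", static_cast<int>(nvinfer1::ILogger::Severity::kWARNING));

// UFF: register the network inputs and outputs before parsing.
parser->registerInput("input_0", nvinfer1::Dims3(3, 224, 224), nvuffparser::UffInputOrder::kNCHW);
parser->registerOutput("output_0");
parser->parse("model.uff", *network, nvinfer1::DataType::kFLOAT);

// Caffe: parse deploy/model files; the returned mapping is used to mark output tensors.
const nvcaffeparser1::IBlobNameToTensor* blobNameToTensor =
    parser->parse("deploy.prototxt", "model.caffemodel", *network, nvinfer1::DataType::kFLOAT);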