Playing with OpenVINO, Part 2: Trying Out mask_rcnn_demo

Debugging open_model_zoo/mask_rcnn_demo

Following on from the previous post, where the demos were built, let's keep playing with OpenVINO and try out a model from open_model_zoo.

Suppose you want to debug one of the models in open_model_zoo (note: in the commands below, use whichever model you actually want; I usually convert several at once and then test them at random).

The first step is the Model Optimizer work: convert the model to obtain the IR files.

The command is as follows (it is a single command, wrapped across lines here only for readability):

python mo_tf.py
    --input_model E:/mask_rcnn_resnet50_atrous_coco_2018_01_28/frozen_inference_graph.pb
    --tensorflow_use_custom_operations_config extensions/front/tf/mask_rcnn_support.json
    --tensorflow_object_detection_api_pipeline_config E:/mask_rcnn_inception_resnet_v2_atrous_coco_2018_01_28/pipeline.config

If you like debugging in VS Code, you can use a launch.json like the one below:

{
    // Use IntelliSense to learn about possible attributes.
    // Hover to view descriptions of existing attributes.
    // For more information, visit: https://go.microsoft.com/fwlink/?linkid=830387
    "version": "0.2.0",
    "configurations": [
        {
            "name": "Python: 當前文件",
            "type": "python",
            "request": "launch",
            "program": "${file}",
            "console": "integratedTerminal",
            "justMyCode": false,
            "args": [
                "--input_model","E:\\mask_rcnn_resnet50_atrous_coco_2018_01_28\\frozen_inference_graph.pb", 
                "--tensorflow_use_custom_operations_config","D:/devOpenVino/openvino_2020.3.194/deployment_tools/model_optimizer/extensions/front/tf/mask_rcnn_support.json",
                "--tensorflow_object_detection_api_pipeline_config","E:/mask_rcnn_inception_resnet_v2_atrous_coco_2018_01_28/pipeline.config"
            ]
        }
    ]
}

If, like me, you didn't specify an output file name, every conversion produces a frozen_inference_graph.bin and a frozen_inference_graph.xml. Be sure to rename them after the corresponding model, otherwise they get mixed up once you have several.
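Incidentally, if I remember correctly the Model Optimizer also accepts --model_name and --output_dir options, so you can give the IR files a proper name right away instead of renaming them afterwards. A rough sketch along these lines (E:/ir_models is just a made-up output folder):

python mo_tf.py
    --input_model E:/mask_rcnn_resnet50_atrous_coco_2018_01_28/frozen_inference_graph.pb
    --tensorflow_use_custom_operations_config extensions/front/tf/mask_rcnn_support.json
    --tensorflow_object_detection_api_pipeline_config E:/mask_rcnn_inception_resnet_v2_atrous_coco_2018_01_28/pipeline.config
    --model_name mask_rcnn_resnet50_atrous_coco
    --output_dir E:/ir_models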

Next, let's test these models with the C++ demos in VS2019. We already built these demos in the previous post, "Playing with OpenVINO: building the cpp samples", so now we can put them to use.

Adding paths, part one

If you are using the Debug build, add the following to the PATH environment variable:

C:\IntelSWTools\openvino_2020.3.194\deployment_tools\inference_engine\bin\intel64\Debug

Also, copy opencv_world430d.dll into that folder (if you'd rather not copy it, adding its own folder to PATH works too; the point is simply that the program must be able to find this DLL).

If you are using the Release build, add

C:\IntelSWTools\openvino_2020.3.194\deployment_tools\inference_engine\bin\intel64\Release

and likewise copy opencv_world430.dll into that folder.

Generally speaking, these folders hold quite a few DLLs that the Intel Inference Engine needs.

Adding paths, part two

A few more paths must be added as well:

C:\IntelSWTools\openvino_2020.3.194\deployment_tools\inference_engine\external\tbb\bin

C:\IntelSWTools\openvino_2020.3.194\deployment_tools\ngraph\lib
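If you don't want to touch the system-wide environment variables, one option is to add these folders only for the current command prompt session before launching the demo. A rough sketch (Debug paths shown; use the Release folder for a Release build):

rem add the Inference Engine, TBB and nGraph DLL folders to PATH for this cmd session only
set PATH=C:\IntelSWTools\openvino_2020.3.194\deployment_tools\inference_engine\bin\intel64\Debug;%PATH%
set PATH=C:\IntelSWTools\openvino_2020.3.194\deployment_tools\inference_engine\external\tbb\bin;%PATH%
set PATH=C:\IntelSWTools\openvino_2020.3.194\deployment_tools\ngraph\lib;%PATH%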

Debugging and running

The project to run is named mask_rcnn_demo.

For details, see: https://docs.openvinotoolkit.org/latest/_demos_mask_rcnn_demo_README.html

I've excerpted part of that documentation below (note: it shows the Linux form; later on I use the Windows form, so the commands differ slightly):

./mask_rcnn_demo -h
InferenceEngine:
    API version ............ <version>
    Build .................. <number>
mask_rcnn_demo [OPTION]
Options:
    -h                                Print a usage message.
    -i "<path>"                       Required. Path to a .bmp image.
    -m "<path>"                       Required. Path to an .xml file with a trained model.
      -l "<absolute_path>"            Required for CPU custom layers. Absolute path to a shared library with the kernels implementations.
          Or
      -c "<absolute_path>"            Required for GPU custom kernels. Absolute path to the .xml file with the kernels descriptions.
    -d "<device>"                     Optional. Specify the target device to infer on (the list of available devices is shown below). Use "-d HETERO:<comma-separated_devices_list>" format to specify HETERO plugin. The demo will look for a suitable plugin for a specified device (CPU by default)
    -detection_output_name "<string>" Optional. The name of detection output layer. Default value is "reshape_do_2d"
    -masks_name "<string>"            Optional. The name of masks layer. Default value is "masks"

To view the help text: mask_rcnn_demo --h

C:\IntelSWTools\openvino_2020.3.194\deployment_tools\open_model_zoo\demos\dev\intel64\Debug>mask_rcnn_demo --h
InferenceEngine: 00007FFCC7C49BC8

mask_rcnn_demo [OPTION]
Options:

    -h                                Print a usage message.
    -i "<path>"                       Required. Path to a .bmp image.
    -m "<path>"                       Required. Path to an .xml file with a trained model.
      -l "<absolute_path>"            Required for CPU custom layers. Absolute path to a shared library with the kernels implementations.
          Or
      -c "<absolute_path>"            Required for GPU custom kernels. Absolute path to the .xml file with the kernels descriptions.
    -d "<device>"                     Optional. Specify the target device to infer on (the list of available devices is shown below). Use "-d HETERO:<comma-separated_devices_list>" format to specify HETERO plugin. The demo will look for a suitable plugin for a specified device (CPU by default)
    -detection_output_name "<string>" Optional. The name of detection output layer. Default value is "reshape_do_2d"
    -masks_name "<string>"            Optional. The name of masks layer. Default value is "masks"

Available target devices:  CPU  GNA

Note that the image here must be in BMP format.
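If your test image is a JPEG or PNG, a quick way to get a BMP is to convert it with OpenCV (the same OpenCV whose DLL we copied above). Below is a minimal sketch I would expect to work; convert_to_bmp and the file names are of course just placeholders:

// convert_to_bmp.cpp - minimal sketch: load an image with OpenCV and re-save it as BMP
#include <opencv2/opencv.hpp>
#include <iostream>

int main(int argc, char* argv[]) {
    if (argc != 3) {
        std::cout << "Usage: convert_to_bmp <input_image> <output.bmp>" << std::endl;
        return 1;
    }
    cv::Mat img = cv::imread(argv[1]);   // decode the input image (jpg/png/...)
    if (img.empty()) {
        std::cerr << "Failed to read " << argv[1] << std::endl;
        return 1;
    }
    cv::imwrite(argv[2], img);           // the .bmp extension selects the BMP encoder
    return 0;
}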

So how do you pass in the image path? The official command is as follows:

./mask_rcnn_demo -i <path_to_image>/inputImage.bmp -m <path_to_model>/mask_rcnn_inception_resnet_v2_atrous_coco.xml

In fact, command-line input like this is handled in OpenVINO by a file called args_helper.hpp; one section of its code is shown below:

/**
* @brief This function find -i/--images key in input args
*        It's necessary to process multiple values for single key
* @return files updated vector of verified input files
*/
inline void parseInputFilesArguments(std::vector<std::string> &files) {
    std::vector<std::string> args = gflags::GetArgvs();
    bool readArguments = false;
    for (size_t i = 0; i < args.size(); i++) {
        if (args.at(i) == "-i" || args.at(i) == "--images") {
            readArguments = true;
            continue;
        }
        if (!readArguments) {
            continue;
        }
        if (args.at(i).c_str()[0] == '-') {
            break;
        }
        readInputFilesArguments(files, args.at(i));
    }
}

In other words, either of the following two forms can be used to specify the input image:

-i xyz.bmp   or   --images <I haven't studied this closely; I'm not sure whether it expects a folder here or just xyz.bmp>
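Judging from the loop above (it keeps collecting arguments after -i/--images until it hits the next token starting with '-'), it also looks like you can list several images after a single -i. Something like the following; I haven't verified how the demo batches them, so treat this as a guess:

mask_rcnn_demo -i img1.bmp img2.bmp img3.bmp -m E:\mask_rcnn_resnet50_atrous_coco_2018_01_28\frozen_inference_graph.xml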

If you are debugging in VS2019, simply set the project's debug command arguments to the format above, for example:

-i J:\BigData\default.bmp -m E:\mask_rcnn_resnet50_atrous_coco_2018_01_28\frozen_inference_graph.xml
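Equivalently, you can run it straight from a command prompt in the demos' Debug output folder (the image and model paths are just my own examples; -d CPU is optional since CPU is the default):

cd C:\IntelSWTools\openvino_2020.3.194\deployment_tools\open_model_zoo\demos\dev\intel64\Debug
mask_rcnn_demo -i J:\BigData\default.bmp -m E:\mask_rcnn_resnet50_atrous_coco_2018_01_28\frozen_inference_graph.xml -d CPU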

I tried the Debug build on pure CPU: extremely slow! In Release mode, with a random image, it still took several seconds, and I couldn't really tell where the speedup was.

Of course, there is still plenty left to figure out, but I won't get into those details here. Let's get it running first.

