On the embedded side we use C++ rather than Python as the development language. The overall flow is: cross-compile the executables on a PC, then deploy and run them directly on the Jetson Nano. We do not build on the Jetson Nano itself; cross-compiling better matches the usual embedded development workflow (even though both the Jetson Nano and the Raspberry Pi ship with a full OS and could download and compile the code locally).
1. Cross-compile the tflite static library on the PC
The first step is to pull the TensorFlow source code from GitHub on the PC and use a cross-compilation toolchain to build the TensorFlow Lite static libraries from it.
For details, see the official guide: https://tensorflow.google.cn/lite/guide/build_arm64
# Install the cross-compilation toolchain
sudo apt-get install crossbuild-essential-arm64
# Clone the TensorFlow source code
git clone https://github.com/tensorflow/tensorflow.git
# Enter the lite make directory
cd tensorflow/tensorflow/lite/tools/make
# Run the script that fetches the dependencies tflite needs
./download_dependencies.sh
# Build
./build_aarch64_lib.sh
This produces the TensorFlow Lite static libraries (benchmark-lib.a and libtensorflow-lite.a) in the tensorflow/lite/tools/make/gen/linux_aarch64/lib directory.
2. Build the label_image example
In the previous step we built the tflite static library locally; in this step we build the label_image example that ships with the TensorFlow source. The example lives in the
tensorflow/lite/examples/label_image directory.
When the tflite static library was built, the minimal example under tensorflow/lite/examples/ was compiled, but label_image was not, so we need to modify the
tensorflow/lite/tools/make/Makefile file. The changes are shown below (the label_image build rules were added by mirroring the existing minimal rules):
diff --git a/tensorflow/lite/tools/make/Makefile b/tensorflow/lite/tools/make/Makefile
index ad3832f996..c32091594f 100644
--- a/tensorflow/lite/tools/make/Makefile
+++ b/tensorflow/lite/tools/make/Makefile
@@ -97,6 +97,12 @@ BENCHMARK_PERF_OPTIONS_BINARY_NAME := benchmark_model_performance_options
MINIMAL_SRCS := \
tensorflow/lite/examples/minimal/minimal.cc
+# galaxyzwj
+LABEL_IMAGE_SRCS := \
+ tensorflow/lite/examples/label_image/label_image.cc \
+ tensorflow/lite/examples/label_image/bitmap_helpers.cc \
+ tensorflow/lite/tools/evaluation/utils.cc
+
# What sources we want to compile, must be kept in sync with the main Bazel
# build files.
@@ -161,7 +167,8 @@ $(wildcard tensorflow/lite/*/*/*/*/*/*tool.cc) \
$(wildcard tensorflow/lite/kernels/*test_main.cc) \
$(wildcard tensorflow/lite/kernels/*test_util*.cc) \
tensorflow/lite/tflite_with_xnnpack.cc \
-$(MINIMAL_SRCS)
+$(MINIMAL_SRCS) \
+$(LABEL_IMAGE_SRCS)
BUILD_WITH_MMAP ?= true
ifeq ($(BUILD_TYPE),micro)
@@ -257,7 +264,8 @@ ALL_SRCS := \
$(PROFILER_SUMMARIZER_SRCS) \
$(TF_LITE_CC_SRCS) \
$(BENCHMARK_LIB_SRCS) \
- $(CMD_LINE_TOOLS_SRCS)
+ $(CMD_LINE_TOOLS_SRCS) \
+ $(LABEL_IMAGE_SRCS)
# Where compiled objects are stored.
TARGET_OUT_DIR ?= $(TARGET)_$(TARGET_ARCH)
@@ -271,6 +279,7 @@ BENCHMARK_LIB := $(LIBDIR)$(BENCHMARK_LIB_NAME)
BENCHMARK_BINARY := $(BINDIR)$(BENCHMARK_BINARY_NAME)
BENCHMARK_PERF_OPTIONS_BINARY := $(BINDIR)$(BENCHMARK_PERF_OPTIONS_BINARY_NAME)
MINIMAL_BINARY := $(BINDIR)minimal
+LABEL_IMAGE_BINARY := $(BINDIR)label_image
CXX := $(CC_PREFIX)${TARGET_TOOLCHAIN_PREFIX}g++
CC := $(CC_PREFIX)${TARGET_TOOLCHAIN_PREFIX}gcc
@@ -279,6 +288,9 @@ AR := $(CC_PREFIX)${TARGET_TOOLCHAIN_PREFIX}ar
MINIMAL_OBJS := $(addprefix $(OBJDIR), \
$(patsubst %.cc,%.o,$(patsubst %.c,%.o,$(MINIMAL_SRCS))))
+LABEL_IMAGE_OBJS := $(addprefix $(OBJDIR), \
+$(patsubst %.cc,%.o,$(patsubst %.c,%.o,$(LABEL_IMAGE_SRCS))))
+
LIB_OBJS := $(addprefix $(OBJDIR), \
$(patsubst %.cc,%.o,$(patsubst %.c,%.o,$(patsubst %.cpp,%.o,$(TF_LITE_CC_SRCS)))))
@@ -309,7 +321,7 @@ $(OBJDIR)%.o: %.cpp
$(CXX) $(CXXFLAGS) $(INCLUDES) -c $< -o $@
# The target that's compiled if there's no command-line arguments.
-all: $(LIB_PATH) $(MINIMAL_BINARY) $(BENCHMARK_BINARY) $(BENCHMARK_PERF_OPTIONS_BINARY)
+all: $(LIB_PATH) $(MINIMAL_BINARY) $(BENCHMARK_BINARY) $(BENCHMARK_PERF_OPTIONS_BINARY) $(LABEL_IMAGE_BINARY)
# The target that's compiled for micro-controllers
micro: $(LIB_PATH)
@@ -353,6 +365,14 @@ $(BENCHMARK_PERF_OPTIONS_BINARY) : $(BENCHMARK_PERF_OPTIONS_OBJ) $(BENCHMARK_LIB
benchmark: $(BENCHMARK_BINARY) $(BENCHMARK_PERF_OPTIONS_BINARY)
+$(LABEL_IMAGE_BINARY): $(LABEL_IMAGE_OBJS) $(LIB_PATH)
+ @mkdir -p $(dir $@)
+ $(CXX) $(CXXFLAGS) $(INCLUDES) \
+ -o $(LABEL_IMAGE_BINARY) $(LABEL_IMAGE_OBJS) \
+ $(LIBFLAGS) $(LIB_PATH) $(LDFLAGS) $(LIBS)
+
+label_image: $(LABEL_IMAGE_BINARY)
+
libdir:
@echo $(LIBDIR)
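One way to apply the change and rebuild (a sketch; it assumes the diff above is saved as label_image.patch in the root of the tensorflow checkout from step 1, and does nothing if the patch or checkout is absent):

```shell
# Apply the Makefile patch, then rerun the aarch64 build script so the
# new label_image target is linked alongside minimal and the benchmarks.
if [ -d tensorflow/lite ] && git apply --check label_image.patch 2>/dev/null; then
  git apply label_image.patch
  ./tensorflow/lite/tools/make/build_aarch64_lib.sh
fi
```

`git apply --check` dry-runs the patch first, so a stale or already-applied diff fails cleanly instead of half-applying.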
This finally produces the label_image executable in tensorflow/lite/tools/make/gen/linux_aarch64/bin.
Copy (scp) this binary to the Jetson Nano, along with the tensorflow/lite/examples/label_image/testdata/grace_hopper.bmp test image.
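The copy step might look like the following sketch. The device address is hypothetical; replace it with your Jetson Nano's user and IP (in USB device mode the Nano often appears at 192.168.55.1). The copy is skipped if the binary has not been built:

```shell
NANO=nano@192.168.55.1   # hypothetical user@host; adjust for your setup
BIN=tensorflow/lite/tools/make/gen/linux_aarch64/bin/label_image
BMP=tensorflow/lite/examples/label_image/testdata/grace_hopper.bmp
if [ -f "$BIN" ]; then
  # Copy the cross-compiled binary and the test image to the device's home dir.
  scp "$BIN" "$BMP" "$NANO":~/
fi
```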
3. Run the tflite model on the Jetson Nano
First, download the tflite model file from the web. Download URL:
Unzip it, then copy the mobilenet_quant_v1_224.tflite model file and the labels.txt label file to the Jetson Nano.
So the files we need on the Jetson Nano are: label_image, mobilenet_quant_v1_224.tflite, labels.txt, and grace_hopper.bmp.
Run the following command to recognize the objects in grace_hopper.bmp:
./label_image -v 1 -m ./mobilenet_quant_v1_224.tflite -i ./grace_hopper.bmp -l ./labels.txt
The output is shown below; it prints the structure of the model in use, the objects finally recognized, and the exact time taken.