Building Library from Source Code

Instead of running inference with the default libraries, users can build the libraries with customized configurations in the Application SDK. Follow these steps to get GstInference running on your platform:

Install Application SDK

All package installation should be done inside the Application SDK. After installing the SDK and completing the required environment setup, you can start building GstInference with Tensorflow-Lite backend support.

Build Tensorflow-Lite with ExternalDelegate

R2Inference TensorFlow Lite backend depends on the C/C++ TensorFlow API. The installation process consists of downloading the source code, building and installing it.

Important

We only provide an installation demonstration for Tensorflow v2.6.1. Users who want to build the Demo App against a newer version of the Tensorflow source code must resolve any dependency problems among the required packages, R2Inference, GstInference, and Tensorflow by themselves.

  1. Download Tensorflow source code and the dependencies:

git clone -b v2.6.1 https://github.com/tensorflow/tensorflow
cd /PATH/TENSORFLOW/SRC/tensorflow/lite/tools/make     # Change /PATH/TENSORFLOW/SRC/ to your tensorflow path

# Download dependencies:
./download_dependencies.sh
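Before editing the CMake files, it can be worth confirming that the dependencies the later steps reference were actually downloaded. A small sketch (the directory names match the sources added in step 3; the helper function itself is ours, not part of TensorFlow):

```shell
# check_deps: verify that download_dependencies.sh populated a downloads/
# directory with the packages the later CMake edits rely on.
check_deps() {
    dir="$1"
    missing=0
    for dep in flatbuffers fft2d farmhash; do
        if [ -d "$dir/$dep" ]; then
            echo "ok: $dep"
        else
            echo "missing: $dep"
            missing=1
        fi
    done
    return $missing
}

# Example:
# check_deps /PATH/TENSORFLOW/SRC/tensorflow/lite/tools/make/downloads
```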

  2. Enable External Delegate on TensorflowLite v2.6.1. The default configuration in CMakeLists.txt does not include the ExternalDelegate sources; add the following code snippet to enable it.

# Enter the tflite directory
cd /PATH/TENSORFLOW/SRC/tensorflow/lite/

# Add the following snippet to CMakeLists.txt to collect the external-delegate sources
populate_tflite_source_vars("delegates/external"
  TFLITE_DELEGATES_EXTERNAL_SRCS
  FILTER ".*(_test|_tester)\\.(cc|h)"
)

# Then append ${TFLITE_DELEGATES_EXTERNAL_SRCS} to the source list of the
# existing add_library(tensorflow-lite ...) call
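The FILTER argument is an exclusion pattern: any file whose path matches the regular expression is dropped from the collected source list. Its effect can be checked with grep, which accepts the same extended-regex syntax (the file names below are hypothetical):

```shell
# Files matching the FILTER regex are excluded from the build.
# grep -Ev inverts the match, so only the files that would survive
# the filter are printed.
printf '%s\n' \
    external_delegate.cc \
    external_delegate_test.cc \
    external_delegate_tester.h \
| grep -Ev '.*(_test|_tester)\.(cc|h)'
# prints: external_delegate.cc
```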

  3. Configure the dependency packages. Some dependencies required by TensorflowLite v2.6.1 are missing from CMakeLists.txt; add and adjust the snippets below.

# Append the following sources to the source list of the existing
# add_library(tensorflow-lite ...) call to pull in the required dependencies
${TFLITE_SOURCE_DIR}/tools/make/downloads/flatbuffers/src/util.cpp
${TFLITE_SOURCE_DIR}/tools/make/downloads/fft2d/fftsg.c
${TFLITE_SOURCE_DIR}/tools/make/downloads/fft2d/fftsg2d.c
${TFLITE_SOURCE_DIR}/tools/make/downloads/farmhash/src/farmhash.cc

set(TFLITE_GOOGLETEST_DIR "${TFLITE_SOURCE_DIR}/build/googletest/googletest/include")
set(TFLITE_GMOCK_DIR "${TFLITE_SOURCE_DIR}/build/googletest/googlemock/include/")

# Modify the following snippet to include the path variables defined above
set(TFLITE_INCLUDE_DIRS
  "${TENSORFLOW_SOURCE_DIR}"
  "${TFLITE_FLATBUFFERS_SCHEMA_DIR}"
  "${TFLITE_GOOGLETEST_DIR}"                                                   # include googletest
  "${TFLITE_GMOCK_DIR}"                                                        # include gmock
)

# Modify the following code snippet to correct the source paths
populate_tflite_source_vars("experimental/ruy"                                 # Change to "build/ruy"
  TFLITE_EXPERIMENTAL_RUY_SRCS
  FILTER
  ".*(test(_fast|_slow|_special_specs|_overflow_dst_zero_point))\\.(cc|h)$"    # bypass some test scripts
  ".*(benchmark|tune_tool|example)\\.(cc|h)$"
)
populate_tflite_source_vars("experimental/ruy/profiler"                        # Change to "build/ruy/profiler"
  TFLITE_EXPERIMENTAL_RUY_PROFILER_SRCS
  FILTER ".*(test|test_instrumented_library)\\.(cc|h)$"
)

  4. Cross-compile with CMake. Refer to the Application SDK : Cross-compiling with CMake section to create a toolchain file. After creating the toolchain file, run the following commands to build Tensorflow-Lite. A static library libtensorflow-lite.a will be generated in the build directory.

cd /PATH/TENSORFLOW/SRC/tensorflow/lite/
mkdir build
cd build
cmake -DCMAKE_TOOLCHAIN_FILE=../cmake/toolchain-yocto-mtk.cmake -DCMAKE_BUILD_TYPE=Debug -DTFLITE_ENABLE_XNNPACK=OFF ..
cmake --build . -j"$(nproc)"                       # build the static library
cp libtensorflow-lite.a $SDKTARGETSYSROOT/lib64    # for compiling R²Inference
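For reference, a minimal toolchain file might look like the sketch below. The compiler names and target processor are illustrative assumptions (they depend on your board and SDK); the Application SDK : Cross-compiling with CMake section remains the authoritative guide.

```cmake
# Illustrative toolchain file sketch -- values depend on your SDK installation.
set(CMAKE_SYSTEM_NAME Linux)
set(CMAKE_SYSTEM_PROCESSOR aarch64)

# Cross-compilers provided by the Yocto SDK environment (names are examples).
set(CMAKE_C_COMPILER   aarch64-poky-linux-gcc)
set(CMAKE_CXX_COMPILER aarch64-poky-linux-g++)

# Point CMake at the target sysroot so headers and libraries resolve there.
set(CMAKE_SYSROOT $ENV{SDKTARGETSYSROOT})
set(CMAKE_FIND_ROOT_PATH_MODE_PROGRAM NEVER)
set(CMAKE_FIND_ROOT_PATH_MODE_LIBRARY ONLY)
set(CMAKE_FIND_ROOT_PATH_MODE_INCLUDE ONLY)
```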

Build R²Inference

  1. Download R²Inference source code

git clone https://gitlab.com/mediatek/aiot/rity/r2inference.git
  2. Modify the r2i-tflite build configuration to enable the Tensorflow-lite backend

vim r2inference/r2i/tflite/meson.build
# Revise the following code snippet
include_directories : [configinc]

# to
include_directories : [configinc, '/PATH/TENSORFLOW/SRC', '/PATH/TENSORFLOW/SRC/tensorflow/lite/build/flatbuffers/include']
  3. Compile R²Inference

# Under r2inference/ directory
meson build -Denable-tflite=true -Denable-tests=disabled -Denable-docs=disabled
mkdir r2library
ninja -C build # Compile the project
DESTDIR=r2library ninja -C build install # Install the library
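In the install command above, DESTDIR stages the files under r2library/ instead of writing into the sysroot directly: meson's default prefix /usr/local is appended beneath it, which is why the next step copies from r2library/usr/local/.... A self-contained sketch of the mechanism (placeholder file name, temporary directory):

```shell
# Simulate a DESTDIR-style staged install: the install prefix (/usr/local
# by default for meson) is appended under DESTDIR, so nothing is written
# outside the staging tree.
DESTDIR=$(mktemp -d)
prefix=/usr/local

# "Install" a placeholder library the way `DESTDIR=... ninja install` would.
mkdir -p "$DESTDIR$prefix/lib"
touch "$DESTDIR$prefix/lib/libr2inference-0.0.so"

# The staged tree mirrors the final layout under the prefix:
find "$DESTDIR" -name 'libr2inference*'
```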

Note

R²Inference can also be compiled against a tensorflow-lite shared library. Modify the configuration in r2inference/meson.build to reference libtensorflowlite.so:

tensorflow_lite = cpp.find_library('tensorflow-lite', required: true)   # Change tensorflow-lite to YOUR_SHARED_LIBRARY_NAME, i.e. libtensorflowlite

After the modification, rerun the commands in step 3 to build and install against the correct library.

  4. Move the installed libraries and headers to the system library folders so that GstInference can build smoothly

# Under r2inference/ directory
cp r2library/usr/local/lib/libr2inference*  $SDKTARGETSYSROOT/usr/lib64
cp r2library/usr/local/lib/pkgconfig/r2inference-0.0.pc $SDKTARGETSYSROOT/usr/share/pkgconfig
mkdir -p $SDKTARGETSYSROOT/usr/local/include
cp -r r2library/usr/local/include/r2inference-0.0  $SDKTARGETSYSROOT/usr/local/include
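Before moving on to GstInference, it may help to confirm the files landed where expected. A small helper sketch (the paths follow the copy commands above; the function itself is ours, adjust if your SDK layout differs):

```shell
# check_r2i: sanity-check that the R2Inference artifacts GstInference's
# build will look for are present under the given sysroot.
check_r2i() {
    sysroot="$1"
    status=0
    for f in \
        "$sysroot/usr/lib64/libr2inference-0.0.so" \
        "$sysroot/usr/share/pkgconfig/r2inference-0.0.pc" \
        "$sysroot/usr/local/include/r2inference-0.0"; do
        if [ -e "$f" ]; then
            echo "found: $f"
        else
            echo "MISSING: $f"
            status=1
        fi
    done
    return $status
}

# Example: check_r2i "$SDKTARGETSYSROOT"
```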

Build GstInference

  1. Download GstInference source code:

git clone https://gitlab.com/mediatek/aiot/rity/gst-inference.git
  2. Disable the gtkdoc documentation, which is not supported in the Application SDK

vim gst-inference/docs/plugins/meson.build
# Revise the following code snippet

inference-plugin-1.0',
  main_sgml : '@0@/gst-inference-plugin-docs.sgml'.format(meson.current_build_dir()),
  src_dir : ['@0@/ext/'.format(meson.source_root()), meson.current_build_dir()],
  gobject_typesfile : 'gst-inference-plugin.types',
  #content_files : [version_entities],
  dependencies : [plugin_deps],
  install : true)   #Change to false
  3. Compile GstInference

# Under gst-inference/ directory
meson build
mkdir gstlibrary
ninja -C build # Compile the project
DESTDIR=gstlibrary ninja -C build install # Install the libraries in gstlibrary

Install on Target Board

  1. Copy the necessary libraries generated while compiling R²Inference and GstInference to the library directory on the target board

# In this case, we use a USB flash drive to transfer files between the Host and the Target (i.e. Genio 350-EVK, Genio 1200-DEMO)
# Run the following commands on the Host to package the required libraries
# (the !(pattern) glob needs bash extended globbing: shopt -s extglob)
mkdir -p necessary_lib/
cp -r r2inference/r2library/usr/local/lib/!(pkgconfig) necessary_lib/     # pkgconfig files are unnecessary on the target machine
cp -r gst-inference/gstlibrary/usr/local/lib/!(pkgconfig) necessary_lib/
tar -czpvf necessary_lib.tgz necessary_lib/
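Before carrying the archive to the board, its contents can be listed with tar -t to confirm everything was captured. A quick round-trip sketch on placeholder files (the file names are illustrative):

```shell
# Package a placeholder necessary_lib/ tree and list the archive contents
# without extracting (-t prints member names).
workdir=$(mktemp -d)
cd "$workdir"
mkdir -p necessary_lib/gstreamer-1.0
touch necessary_lib/libr2inference-0.0.so \
      necessary_lib/gstreamer-1.0/libgstinference.so

tar -czpf necessary_lib.tgz necessary_lib/
tar -tzf necessary_lib.tgz    # verify the member list before transferring
```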

Here is the folder structure of necessary_lib:

necessary_lib
├── gstreamer-1.0
│   ├── libgstinference.so
│   ├── libgstinferenceoverlayplugin.so
│   └── libgstinferenceutils.so
├── libgstinference-1.0.so
├── libgstinference-1.0.so.0
├── libgstinferencebaseoverlay-1.0.so.0
├── libr2inference-0.0.a
├── libr2inference-0.0.so
├── libr2inference-0.0.so.0
└── libr2inference-0.0.so.0.11.0

  2. (Optional) Copy files from the Docker container to the local machine. If you installed the SDK in a Docker image, the following steps show how to transfer files from the container to the local machine

# In Local Machine
docker ps -a # to get the docker container ID, in my case it is 870615e417fe
docker cp 870615e417fe:/PATH_TO_NECESSARY_LIB/necessary_lib.tgz .

# plug in USB flash drive
lsblk # to get the device name, in my case it is sda
mount /dev/sda1 /mnt
cp necessary_lib.tgz /mnt
# umount before you unplug USB
umount /dev/sda1
  3. Install the libraries on the target machine (i.e. Genio 350-EVK, Genio 1200-DEMO)

# enter the lib directory
cd /usr/lib64

# plug in USB flash drive
mount /dev/sda1 /mnt
cp /mnt/necessary_lib.tgz .
tar -zxvf necessary_lib.tgz
mv necessary_lib/lib* .
mv necessary_lib/gstreamer-1.0/* ./gstreamer-1.0
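The on-target steps above can be rehearsed off-target in a scratch directory first, using placeholder files (everything below is illustrative; on the board, the archive comes from the USB drive and the destination is /usr/lib64):

```shell
# Rehearse the on-target install sequence in scratch directories.
dest=$(mktemp -d)          # stands in for /usr/lib64 on the board
mkdir -p "$dest/gstreamer-1.0"

src=$(mktemp -d)           # build a placeholder archive
mkdir -p "$src/necessary_lib/gstreamer-1.0"
touch "$src/necessary_lib/libr2inference-0.0.so" \
      "$src/necessary_lib/gstreamer-1.0/libgstinference.so"
tar -C "$src" -czf "$src/necessary_lib.tgz" necessary_lib/

# Same commands as the install step, against the scratch destination.
cd "$dest"
tar -zxf "$src/necessary_lib.tgz"
mv necessary_lib/lib* .
mv necessary_lib/gstreamer-1.0/* ./gstreamer-1.0/
ls "$dest"
```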