AI Demo App
IoT Yocto has integrated two open-source GStreamer streaming pipeline frameworks for complex neural network applications: GstInference and NNStreamer.
GstInference is an open-source project from RidgeRun Engineering that provides a framework for integrating deep learning inference into GStreamer. You can either use one of the included elements to run out-of-the-box inference with the most popular deep learning architectures, or leverage the base classes and utilities to support your own custom architecture.
GstInference uses R²Inference, a C/C++ abstraction layer for a variety of machine learning frameworks. With R²Inference, a single C/C++ application can work with models from different frameworks. This is useful for executing inference while taking advantage of different hardware resources such as the CPU, GPU, or AI-optimized accelerators.
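As an illustration, the sketch below shows the shape of a GstInference detection pipeline. The element name (tinyyolov2), the model/bypass pad names, and the model file are taken from RidgeRun's published examples and are placeholders here; exact names and properties vary by GstInference version and model:

```shell
# Hypothetical sketch: run TinyYOLOv2 detection on a camera feed with GstInference.
# The tee splits the stream: one branch is scaled and fed to the model pad,
# the other bypasses inference and is used for the on-screen overlay.
gst-launch-1.0 v4l2src device=/dev/video0 ! videoconvert ! tee name=t \
  t. ! videoscale ! queue ! net.sink_model \
  t. ! queue ! net.sink_bypass \
  tinyyolov2 name=net model-location=graph_tinyyolov2.pb backend=tensorflow \
  net.src_bypass ! inferenceoverlay ! videoconvert ! autovideosink
```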
NNStreamer is an open-source project that was initially developed by Samsung and later transferred to the LF AI Foundation as an incubation project. It provides a set of GStreamer plugins that allow GStreamer developers to adopt neural network models easily and efficiently, and neural network developers to manage neural network pipelines and their filters easily and efficiently.
NNStreamer provides a new GStreamer stream data type and a set of GStreamer elements (plugins) to construct media stream pipelines with neural network models. It is well documented on its online documentation site and supports well-known neural network frameworks, including TensorFlow, TensorFlow Lite, Caffe2, PyTorch, OpenVINO, and Arm NN.
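A minimal sketch of an NNStreamer pipeline, following the pattern from the NNStreamer documentation: raw video is converted to the tensor stream type by tensor_converter, run through a model with tensor_filter, and the results delivered to tensor_sink. The .tflite file name is a placeholder, and the 224x224 RGB caps assume a MobileNet-style input:

```shell
# Hypothetical sketch: classify frames with a TensorFlow Lite model via NNStreamer.
# videotestsrc stands in for a real camera; the caps must match the model's input.
gst-launch-1.0 videotestsrc ! videoconvert ! videoscale ! \
  video/x-raw,width=224,height=224,format=RGB ! \
  tensor_converter ! \
  tensor_filter framework=tensorflow-lite model=mobilenet_v1_1.0_224.tflite ! \
  tensor_sink
```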
IoT Yocto provides a new tensor_filter subplugin for the Neuron SDK. Users can use tensor_filter_neuronsdk to create GStreamer media stream pipelines that leverage the Genio platform's powerful AI hardware accelerators, such as the MDLA. You can find the implementation of tensor_filter_neuronsdk in the IoT Yocto NNStreamer source.
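Offloading to the accelerator then only changes the tensor_filter stage. The sketch below assumes framework=neuronsdk selects the Neuron SDK subplugin and that the model has been precompiled for the accelerator into a .dla file; the file names are placeholders:

```shell
# Hypothetical sketch: run inference on the MDLA through the Neuron SDK subplugin.
# mobilenet_v1.dla stands for a model precompiled for the Genio accelerator.
gst-launch-1.0 filesrc location=input.jpg ! jpegdec ! videoconvert ! videoscale ! \
  video/x-raw,width=224,height=224,format=RGB ! \
  tensor_converter ! \
  tensor_filter framework=neuronsdk model=mobilenet_v1.dla ! \
  tensor_sink
```

Because the subplugin sits behind the standard tensor_filter interface, the rest of the pipeline stays identical to a CPU-only NNStreamer pipeline.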
For a detailed introduction and usage guide for these frameworks, please refer to: