Camera
This chapter describes common information and instructions for the camera on IoT Yocto, such as setting up the camera hardware and software, launching the camera pipeline, and so on. The camera on different platforms may have platform-specific instructions or test results; for example, camera settings differ between platforms. Please refer to the platform-specific sections for more details.
Note
All command operations presented in this chapter are based on IoT Yocto v22.1 and the Genio 350-EVK. You might get different results depending on the platform you use.
Video4Linux2 Utility - v4l2-ctl
v4l2-ctl is a useful tool to dump the information of V4L2 devices. You can obtain the supported formats, resolutions, and controls. For more details, use the command v4l2-ctl -h.
To list all available devices on the board:
v4l2-ctl --list-devices
...
mtk-camsys-3.0 (platform:15040000.seninf):
    /dev/media1

mtk-camsv-isp30 (platform:15050000.camsv):
    /dev/video3

mtk-camsv-isp30 (platform:15050800.camsv):
    /dev/video4

USB2.0 Camera: USB2.0 Camera (usb-11200000.xhci-2):
    /dev/video5
    /dev/video6
    /dev/media2
...
To obtain the formats and resolutions supported by a video device:
v4l2-ctl -d /dev/video3 --all
...
Video input : 0 (1a051000.camsv video stream: ok)
Format Video Capture Multiplanar:
    Width/Height      : 2316/1746
    Pixel Format      : 'UYVY'
    Field             : None
    Number of planes  : 1
    Flags             :
    Colorspace        : sRGB
    Transfer Function : Default
    YCbCr/HSV Encoding: Default
    Quantization      : Default
    Plane 0           :
       Bytes per Line : 4632
       Size Image     : 8087472
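The reported Bytes per Line and Size Image follow directly from the UYVY layout: a single plane with 2 bytes per pixel. A quick arithmetic check, which is also handy later whenever the raw frame size is needed:

```shell
# UYVY is a packed 4:2:2 format: 2 bytes per pixel, one plane.
WIDTH=2316
HEIGHT=1746
BYTES_PER_LINE=$((WIDTH * 2))            # matches "Bytes per Line": 4632
SIZE_IMAGE=$((BYTES_PER_LINE * HEIGHT))  # matches "Size Image": 8087472
echo "bytes/line=$BYTES_PER_LINE image=$SIZE_IMAGE"
```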
GStreamer Pipeline Example
v4l2src
The camera implementation follows the V4L2 standard. Therefore, you can operate the camera through the GStreamer element v4l2src. For more details about GStreamer, please refer to GStreamer.
In this section, the following scenarios are demonstrated:
Show camera images on the screen
Store camera images in a file
Encode audio and video to an MP4 file
Show camera images on the screen
First, you need to find out which device node corresponds to the camera you want. The video device node that points to seninf is the camera. In this example, the camera is /dev/video3.
ls -l /sys/class/video4linux/ | grep seninf
total 0
...
lrwxrwxrwx 1 root root 0 Sep 20 2020 video3 -> ../../devices/platform/soc/15040000.seninf/video4linux/video3
...
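Instead of eyeballing the ls -l output, the lookup can be scripted. The following sketch (find_seninf_node is a hypothetical helper, not an IoT Yocto tool) resolves each node's sysfs symlink and picks the first one whose target contains seninf:

```shell
# Hypothetical helper: print the /dev node whose sysfs entry resolves to
# a path containing "seninf" (the camera on MediaTek camsys platforms).
find_seninf_node() {
    dir="$1"    # video4linux sysfs directory, e.g. /sys/class/video4linux
    for node in "$dir"/video*; do
        [ -e "$node" ] || continue
        case "$(readlink -f "$node")" in
            *seninf*) echo "/dev/$(basename "$node")"; return 0 ;;
        esac
    done
    return 1
}

find_seninf_node /sys/class/video4linux || echo "no seninf node found"
```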
Then you can show the full-size camera image on the screen through waylandsink.
gst-launch-1.0 v4l2src device=/dev/video3 ! video/x-raw,width=2316,height=1746,format=UYVY ! videoconvert ! waylandsink sync=false
The image may be too large to fit on the screen. In this case, you can use the GStreamer element v4l2convert, which uses the hardware converter, MDP, to resize the image.
gst-launch-1.0 v4l2src device=/dev/video3 ! video/x-raw,width=2316,height=1746,format=UYVY ! v4l2convert output-io-mode=dmabuf-import ! video/x-raw,width=400,height=300 ! waylandsink sync=false
Store camera images in a file
To store camera images, you can use filesink as the output. With the following command, the camera images will be saved in /home/root/out.yuv:
gst-launch-1.0 v4l2src device=/dev/video3 num-buffers=1 ! video/x-raw,width=2316,height=1746,format=UYVY ! filesink location=/home/root/out.yuv
You can use filesrc to show the saved images.
gst-launch-1.0 filesrc location=/home/root/out.yuv blocksize=8087472 ! videoparse width=2316 height=1746 format=uyvy framerate=1 ! videoconvert ! waylandsink
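Because every UYVY frame in the dump has the same fixed size (8087472 bytes, as reported by v4l2-ctl above), individual frames can be cut out of a multi-frame capture with plain dd. A sketch, where extract_frame is a hypothetical helper:

```shell
# Hypothetical helper: copy one fixed-size raw frame out of a UYVY dump.
extract_frame() {
    in="$1"; out="$2"; index="$3"   # index is 0-based
    frame_size=8087472              # 4632 bytes/line * 1746 lines
    dd if="$in" of="$out" bs="$frame_size" skip="$index" count=1 2>/dev/null
}

# Pull the first frame out of the capture, if it exists.
[ -f /home/root/out.yuv ] && extract_frame /home/root/out.yuv /home/root/frame0.yuv 0
```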
Encode audio and video to an MP4 file
To encode audio and video to an MP4 file, you can use the following plugins:
Input:
alsasrc and v4l2src
Output:
filesink
Converter:
v4l2convert and audioconvert
Encoder:
v4l2h264enc and avenc_aac
Muxer:
mp4mux
With the following command, a 20-second 720p MP4 file will be saved in /home/root/out.mp4:
gst-launch-1.0 -e -v v4l2src device=/dev/video3 ! video/x-raw,width=2316,height=1746,format=UYVY,framerate=30/1 ! \
capssetter replace=true caps="video/x-raw, width=2316, height=1746, framerate=(fraction)30/1, multiview-mode=(string)mono, interlace-mode=(string)progressive, format=(string)UYVY,colorimetry=(string)bt709" ! \
v4l2convert output-io-mode=5 ! video/x-raw,width=1280,height=720,framerate=30/1 ! \
v4l2h264enc extra-controls="cid,video_gop_size=30" capture-io-mode=mmap ! h264parse ! queue ! mux.video_0 \
alsasrc device=dmic ! audio/x-raw,rate=48000,channels=2,format=S16LE ! audioconvert ! avenc_aac ! aacparse ! queue ! mux.audio_0 \
mp4mux name=mux ! filesink location=/home/root/out.mp4 & \
pid=$!; sleep 20 && kill -INT "$pid"
Note
GStreamer utilizes PTS (Presentation Timestamp) as a reference when encoding files. However, the camera or audio data may experience latency during startup, resulting in a dummy period at the beginning of the encoded file. For instance, if the camera takes 2 seconds to fully start up, the first 2 seconds of the encoded file will be dummy content. This situation is known to occur on the Genio 1200-EVK and the Genio 700-EVK with the Onsemi AP1302 ISP and AR0830 sensor, where the launch time is approximately 3 seconds. For a detailed analysis of the launch time, please refer to Sensor Launch Time.
libcamerasrc
On IoT Yocto, you can also use the GStreamer element libcamerasrc to demonstrate the camera pipeline.
First, you need to determine which camera you want to use. The libcamera utility cam can help.
cam -l
Available cameras:
1: Internal front camera (/base/soc/i2c@11009000/camera@3d)
2: Internal front camera (/base/soc/i2c@1100f000/camera@3d)
3: 'USB2.0 Camera: USB2.0 Camera' (/base/soc/usb@11201000/xhci@11200000-2:1.0-1e4e:0102)
Second, select the camera you want and record its name. For example, the name of the first camera above is /base/soc/i2c@11009000/camera@3d.
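Picking the name by hand does not scale well in scripts. The camera path can also be parsed out of the cam -l listing; nth_camera below is a hypothetical helper that extracts the parenthesized path of the Nth entry:

```shell
# Hypothetical helper: print the camera path of the Nth entry
# (1-based) from `cam -l` style output read on stdin.
nth_camera() {
    sed -n "s/^$1: .*(\(.*\))$/\1/p"
}

cam -l | nth_camera 1
```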
Third, use the GStreamer command with a specified camera name to show the camera images on the screen.
gst-launch-1.0 libcamerasrc camera-name="/base/soc/i2c@11009000/camera@3d" ! video/x-raw,format=RGB ! v4l2convert output-io-mode=dmabuf-import ! video/x-raw,width=400,height=300 ! waylandsink sync=false
For more details about the GStreamer element libcamerasrc, you can use the gst-inspect-1.0 command to list its details, pad templates, and properties.
gst-inspect-1.0 libcamerasrc
...
Plugin Details:
  Name                     libcamera
  Description              libcamera capture plugin
  Filename                 /usr/lib64/gstreamer-1.0/libgstlibcamera.so
...
Pad Templates:
  SRC template: 'src'
    Availability: Always
    Capabilities:
      video/x-raw
      image/jpeg
    Type: GstLibcameraPad
    Pad Properties:
      stream-role       : The selected stream role
        flags: readable, writable, changeable only in NULL or READY state
        Enum "GstLibcameraStreamRole" Default: 2, "video-recording"
          (1): still-capture   - libcamera::StillCapture
          (2): video-recording - libcamera::VideoRecording
          (3): view-finder     - libcamera::Viewfinder
...
Element Properties:
  camera-name       : Select by name which camera to use.
    flags: readable, writable, changeable only in NULL or READY state
    String. Default: null
  name              : The name of the object
    flags: readable, writable
    String. Default: "libcamerasrc0"
  parent            : The parent of the object
    flags: readable, writable
    Object of type "GstObject"
...
USB Camera
IoT Yocto supports USB Video Class (UVC). You can use a USB webcam as a V4L2 video device and operate it through GStreamer. To identify the USB camera, you can use either of the following two methods:
For the v4l2 device node:
ls -l /sys/class/video4linux
...
lrwxrwxrwx 1 root root 0 Oct 8 01:29 video5 -> ../../devices/platform/soc/11201000.usb/11200000.xhci/usb1/1-1/1-1.3/1-1.3:1.0/video4linux/video5
...
For the libcamera name:
cam -l
Available cameras:
1: Internal front camera (/base/soc/i2c@11009000/camera@3d)
2: Internal front camera (/base/soc/i2c@1100f000/camera@3d)
3: 'USB2.0 Camera: USB2.0 Camera' (/base/soc/usb@11201000/xhci@11200000-1.3:1.0-1e4e:0102)
In this example, the video device node of the USB camera is /dev/video5, and the camera name is /base/soc/usb@11201000/xhci@11200000-1.3:1.0-1e4e:0102.
Next, you can operate your camera through GStreamer, given either the device node or the libcamera name.
To use v4l2src:
gst-launch-1.0 v4l2src device=/dev/video5 io-mode=mmap ! videoconvert ! waylandsink sync=false
Note
The UVC driver uses the CPU to compose the frame buffer from several USB packets, so the memory mode should be mmap. If the memory mode is dmabuf instead, the consumer of the UVC buffer won't flush the CPU cache, leading to dirty-image issues.
To use libcamerasrc:
gst-launch-1.0 libcamerasrc camera-name="/base/soc/usb@11201000/xhci@11200000-1.3:1.0-1e4e:0102" ! videoconvert ! waylandsink sync=false