.. include:: /keyword.rst

.. _feature-g350-cam-v4l2-yuv-sensor:

========================
YUV Sensor (V4L2 Sensor)
========================

.. important::

   All the bash commands shown here are based on |G350-EVK|.

.. contents:: Sections
   :local:
   :depth: 2

.. note::

   All command operations presented in this chapter are based on IoT Yocto v24.1, the |G350-EVK-REF-BOARD|, and the Onsemi AP1302 ISP with the AR0430 sensor.
   You might get different results depending on the platform you use.

This chapter shows how to receive YUV sensor data and dump it directly to DRAM under the **V4L2 Sensor** architecture on |G350-EVK|.

Camera Daughter Board
---------------------

The YUV camera DTB for |G350-EVK| is ``AIoT CAM DTB D1V2``. It contains an Onsemi AP1302 ISP and an Onsemi AR0430 sensor.
The camera DTB connects to the |G350-EVK| through a miniSAS cable.

For more details about the ISP and the sensor, please refer to:

- `Onsemi AR0430 `_
- `Onsemi AP1302 `_

Connect The Camera to the EVK
-----------------------------

There are two MIPI-CSI ports on the EVK, which means you can connect up to two cameras.

.. figure:: /_asset/sw_rity_app-dev_camera_mipicsi-port.png
   :align: center
   :scale: 50%

   Two MIPI CSI camera ports on the EVK

The hardware connection for YUV mode is ``AR0430 ---> AP1302 ---> MIPI-CSI On EVK``.

The camera daughter board can be configured in CAM+ISP mode or CAM mode.
In CAM+ISP mode, the images from the AR0430 camera module are passed to the AP1302 ISP before being received by the EVK.
In CAM mode, the images from the AR0430 camera module are transferred to the EVK directly.
By default, |IOT-YOCTO| supports both CAM+ISP and CAM modes. In this section, we capture YUV frames using the CAM+ISP mode.

First, configure the camera DTB in CAM+ISP mode. You will need 6 jumpers and 1 camera sensor configured as shown in the figure below.
Make sure you have the correct jumper settings and that the camera sensor is properly connected to the CAM+ISP slot.

.. figure:: /_asset/sw_rity_app-dev_camera_cam-isp-mode.png
   :align: center

   CAM+ISP mode

Second, connect the camera DTB with a miniSAS cable.

.. figure:: /_asset/sw_rity_app-dev_camera_miniSAS.jpeg
   :align: center

   Connect miniSAS cable

After finishing the above steps, the hardware connection is complete.

.. warning::

   Please make sure all the above settings, including the jumpers, camera slot, and miniSAS cable, are correct.
   Otherwise, the I2C connection between the sensor, ISP, and SoC will fail.

Select Camera Device Tree Blob Overlay
--------------------------------------

The camera is inactive by default. The kernel has to load a specific device tree blob overlay to enable the camera.
Please refer to the :ref:`bl33 (u-boot) section ` for more details.

.. csv-filter:: YUV Camera DTBO for |G350-EVK|
   :header-rows: 1
   :file: ../../../../_asset/tables/camera-platform-sensor-dtbo.csv
   :include: {0: 'Genio 350-EVK'}
   :exclude: {5: '[^1.* ^3.*]', 6: '[^1.* ^3.*]'}
   :included_cols: 0,1,2,4,7

.. prompt:: bash > auto

   > genio-flash --list-dtbo
   List of available DTBO:
   - camera-ap1302-ar0430-dual.dtbo
   - camera-ap1302-ar0430-single-csi0.dtbo
   - camera-ap1302-ar0430-single-csi1.dtbo
   - camera-ar0430-dual.dtbo
   - camera-ar0430-single-csi0.dtbo
   - camera-ar0430-single-csi1.dtbo

   > genio-flash --load-dtbo camera-ap1302-ar0430-dual.dtbo          # Dual YUV camera on MIPI-CSI0 and MIPI-CSI1
   > genio-flash --load-dtbo camera-ap1302-ar0430-single-csi0.dtbo   # Single YUV camera on MIPI-CSI0
   > genio-flash --load-dtbo camera-ap1302-ar0430-single-csi1.dtbo   # Single YUV camera on MIPI-CSI1

.. warning::

   Please select the DTBO according to the CSI slot to which the camera sensor is connected.
   For example, if the camera is connected to the ``CSI0`` slot, please load the ``camera-ap1302-ar0430-single-csi0.dtbo`` overlay.
   Otherwise, the camera initialization will fail.
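After flashing the image with the selected overlay and booting the board, a quick way to confirm that the overlay took effect is to look for the ISP and sensor drivers in the kernel log and to list the created device nodes. This is only a suggested sanity check; the exact log messages depend on the kernel and driver versions.

.. prompt:: bash # auto

   # dmesg | grep -iE "ap1302|ar0430"
   # ls /dev/video* /dev/media*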
Supported Formats and Sizes
---------------------------

.. csv-table::
   :header: "Platform", "Sensor", "Stream Type", "Size", "Frame Rate (fps)", "Format", "MIPI Lanes"
   :widths: 10, 20, 5, 5, 5, 5, 5
   :align: left

   "|G350-EVK|", "Onsemi AP1302 & AR0430", "Preview", "2316x1746", "30", "UYVY", "4"

.. note::

   The supported format, resolution, and frame rate depend on the capabilities of the sensor and the SoC.

Set Camera Properties through `media-ctl`
------------------------------------------

Before using the camera, you should set the resolution and the pixel format through ``media-ctl``.
All of the examples in this section use the dual-camera configuration.
The Onsemi AP1302 ISP with the Onsemi AR0430 sensor captures images in the ``2316x1746, UYVY`` format.
Therefore, you have to configure this format for both the MIPI-CSI0 and MIPI-CSI1 cameras.

The media device is the key to setting up the camera. You should find the media device with the module name ``mtk-camsys-3.0``.
Through this media device you can set the format and resolution and enable the camera pipeline.
The index of the media device may differ depending on the probing order of the drivers and the number of devices.
The command ``v4l2-ctl --list-devices`` shows the device information.

.. prompt:: bash # auto

   # v4l2-ctl --list-devices
   mtk-camsys-3.0 (platform:15040000.seninf):
           /dev/media0

   mtk-camsv-isp30 (platform:15050000.camsv):
           /dev/video0

   mtk-camsv-isp30 (platform:15050800.camsv):
           /dev/video1

   Microsoft® LifeCam Cinema(TM): (usb-11200000.xhci-2):
           /dev/video5
           /dev/video6
           /dev/media2

In this example, the media device is ``/dev/media0``. Please replace ``/dev/media`` with the actual node on your platform.

To set the format of the MIPI-CSI0 camera:

.. prompt:: bash # auto

   # media-ctl -d /dev/media -V "'ap1302.2-003d':2 [fmt:UYVY8_1X16/2316x1746]"
   # media-ctl -d /dev/media -V "'15040000.seninf':4 [fmt:UYVY8_1X16/2316x1746]"
   # media-ctl -d /dev/media -V "'15050000.camsv':1 [fmt:UYVY8_1X16/2316x1746]"

To set the format of the MIPI-CSI1 camera:

.. prompt:: bash # auto

   # media-ctl -d /dev/media -V "'ap1302.3-003d':2 [fmt:UYVY8_1X16/2316x1746]"
   # media-ctl -d /dev/media -V "'15040000.seninf':5 [fmt:UYVY8_1X16/2316x1746]"
   # media-ctl -d /dev/media -V "'15050800.camsv':1 [fmt:UYVY8_1X16/2316x1746]"

.. important::

   Before launching the camera, you have to use ``media-ctl`` to set the format and the resolution of the camera pipeline.
   Otherwise, the camera pipeline may fail due to an unknown format.
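If both cameras have to be reconfigured after every boot, the commands above can be wrapped into a short shell session or script. The following is only a convenience sketch based on the commands in this section; ``MEDIA_DEV`` is a placeholder, and the entity names must match the actual topology on your board.

.. prompt:: bash # auto

   # MEDIA_DEV=/dev/media0                 # replace with the actual media node
   # FMT="fmt:UYVY8_1X16/2316x1746"
   # # MIPI-CSI0 camera
   # media-ctl -d $MEDIA_DEV -V "'ap1302.2-003d':2 [$FMT]"
   # media-ctl -d $MEDIA_DEV -V "'15040000.seninf':4 [$FMT]"
   # media-ctl -d $MEDIA_DEV -V "'15050000.camsv':1 [$FMT]"
   # # MIPI-CSI1 camera
   # media-ctl -d $MEDIA_DEV -V "'ap1302.3-003d':2 [$FMT]"
   # media-ctl -d $MEDIA_DEV -V "'15040000.seninf':5 [$FMT]"
   # media-ctl -d $MEDIA_DEV -V "'15050800.camsv':1 [$FMT]"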
You can use the ``media-ctl`` tool to dump the current state of the device topology.
The names and numbers in the above commands correspond to the entities and pads.
For more details, you can use ``media-ctl --help``.

.. prompt:: bash # auto

   # media-ctl -d /dev/media -p
   Media controller API version 5.10.104

   Media device information
   ------------------------
   driver          mtk-seninf
   model           mtk-camsys-3.0
   serial
   bus info        platform:15040000.seninf
   hw revision     0x0
   driver version  5.10.104

   Device topology
   - entity 1: 15040000.seninf (8 pads, 4 links)
               type V4L2 subdev subtype Unknown flags 0
               device node name /dev/v4l-subdev0
           pad0: Sink
                   [fmt:UYVY8_1X16/2316x1746 field:none colorspace:srgb]
                   <- "ap1302.2-003d":2 [ENABLED,IMMUTABLE]
           pad1: Sink
                   [fmt:UYVY8_1X16/2316x1746 field:none colorspace:srgb]
                   <- "ap1302.3-003d":2 [ENABLED,IMMUTABLE]
           pad4: Source
                   [fmt:UYVY8_1X16/2316x1746 field:none]
                   -> "15050000.camsv":0 [ENABLED,IMMUTABLE]
           pad5: Source
                   [fmt:UYVY8_1X16/2316x1746 field:none]
                   -> "15050800.camsv":0 [ENABLED,IMMUTABLE]

   - entity 10: 15050000.camsv (2 pads, 2 links)
                type V4L2 subdev subtype Unknown flags 0
                device node name /dev/v4l-subdev1
           pad0: Sink
                   [fmt:UYVY8_1X16/2316x1746 field:none colorspace:srgb]
                   <- "15040000.seninf":4 [ENABLED,IMMUTABLE]
           pad1: Source
                   [fmt:UYVY8_1X16/2316x1746 field:none colorspace:srgb]
                   -> "15050000.camsv video stream":0 [ENABLED,IMMUTABLE]

   - entity 13: 15050000.camsv video stream (1 pad, 1 link)
                type Node subtype V4L flags 0
                device node name /dev/video0
           pad0: Sink
                   <- "15050000.camsv":1 [ENABLED,IMMUTABLE]

   - entity 21: 15050800.camsv (2 pads, 2 links)
                type V4L2 subdev subtype Unknown flags 0
                device node name /dev/v4l-subdev2
           pad0: Sink
                   [fmt:UYVY8_1X16/2316x1746 field:none colorspace:srgb]
                   <- "15040000.seninf":5 [ENABLED,IMMUTABLE]
           pad1: Source
                   [fmt:UYVY8_1X16/2316x1746 field:none colorspace:srgb]
                   -> "15050800.camsv video stream":0 [ENABLED,IMMUTABLE]

   - entity 24: 15050800.camsv video stream (1 pad, 1 link)
                type Node subtype V4L flags 0
                device node name /dev/video1
           pad0: Sink
                   <- "15050800.camsv":1 [ENABLED,IMMUTABLE]

   - entity 32: ap1302.2-003d (3 pads, 2 links)
                type V4L2 subdev subtype Unknown flags 0
                device node name /dev/v4l-subdev4
           pad0: Sink
                   [fmt:SGRBG12_1X12/2316x1746 field:none colorspace:srgb
                    crop.bounds:(0,0)/2316x1746
                    crop:(0,0)/2316x1746]
                   <- "ar0430 0":0 [ENABLED,IMMUTABLE]
           pad2: Source
                   [fmt:UYVY8_1X16/2316x1746 field:none colorspace:srgb
                    crop.bounds:(0,0)/2316x1746
                    crop:(0,0)/2316x1746]
                   -> "15040000.seninf":0 [ENABLED,IMMUTABLE]

   - entity 36: ar0430 0 (1 pad, 1 link)
                type V4L2 subdev subtype Sensor flags 0
                device node name /dev/v4l-subdev3
           pad0: Source
                   [fmt:SGRBG12_1X12/2316x1746 field:none colorspace:srgb]
                   -> "ap1302.2-003d":0 [ENABLED,IMMUTABLE]

   - entity 42: ap1302.3-003d (3 pads, 2 links)
                type V4L2 subdev subtype Unknown flags 0
                device node name /dev/v4l-subdev6
           pad0: Sink
                   [fmt:SGRBG12_1X12/2316x1746 field:none colorspace:srgb
                    crop.bounds:(0,0)/2316x1746
                    crop:(0,0)/2316x1746]
                   <- "ar0430 0":0 [ENABLED,IMMUTABLE]
           pad2: Source
                   [fmt:UYVY8_1X16/2316x1746 field:none colorspace:srgb
                    crop.bounds:(0,0)/2316x1746
                    crop:(0,0)/2316x1746]
                   -> "15040000.seninf":1 [ENABLED,IMMUTABLE]

   - entity 46: ar0430 0 (1 pad, 1 link)
                type V4L2 subdev subtype Sensor flags 0
                device node name /dev/v4l-subdev5
           pad0: Source
                   [fmt:SGRBG12_1X12/2316x1746 field:none colorspace:srgb]
                   -> "ap1302.3-003d":0 [ENABLED,IMMUTABLE]

You can also dump the media graph for a better view with ``media-ctl --print-dot``.
The graph shows the connection between the camera subdevices.
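For example, the graph can be written to a ``.dot`` file on the board and rendered to an image with Graphviz. The ``dot`` tool is usually not installed in the device image, so the rendering step below is assumed to run on a host PC with Graphviz available.

.. prompt:: bash # auto

   # media-ctl -d /dev/media0 --print-dot > topology.dot

Copy ``topology.dot`` to the host, then render it:

.. prompt:: bash > auto

   > dot -Tpng topology.dot -o topology.png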
As the figure below shows, there are two AR0430 sensors connected to two AP1302 ISPs.
``seninf`` receives the data from the two ISPs and sends it to the two ``camsv`` devices respectively.

.. figure:: /_asset/sw_rity_app-dev_camera_media-device.svg
   :align: center

   The media graph of the camera subsystem

.. _i350-evk-multi-cam-support:

Multi-Camera Support
--------------------

If you connect two camera daughter boards and one USB camera to the |G350-EVK|, you can launch three cameras simultaneously.
After setting the camera formats and finding the video device nodes, use the following commands to launch the cameras.

For example, according to the information obtained by ``v4l2-ctl --list-devices``, the nodes of the two MIPI-CSI cameras are ``/dev/video0`` and ``/dev/video1``, and the node of the USB camera is ``/dev/video5``.
Please replace ``/dev/video`` with the actual nodes on your platform.

.. prompt:: bash # auto

   # # MIPI-CSI 0
   # gst-launch-1.0 v4l2src device=/dev/video ! video/x-raw,width=2316,height=1746,format=UYVY ! v4l2convert output-io-mode=dmabuf-import ! video/x-raw,width=400,height=300 ! fpsdisplaysink video-sink=waylandsink sync=false &

   # # MIPI-CSI 1
   # gst-launch-1.0 v4l2src device=/dev/video ! video/x-raw,width=2316,height=1746,format=UYVY ! v4l2convert output-io-mode=dmabuf-import ! video/x-raw,width=400,height=300 ! fpsdisplaysink video-sink=waylandsink sync=false &

   # # USB Camera
   # gst-launch-1.0 v4l2src device=/dev/video io-mode=mmap ! v4l2convert ! video/x-raw,width=400,height=300 ! fpsdisplaysink video-sink=waylandsink sync=false &

You will see three cameras shown on the Weston display.

.. figure:: /_asset/sw_rity_app-dev_camera_multi-camera.gif
   :align: center

   Show multi-camera through GStreamer

Troubleshooting
---------------

Low Frame Rate
^^^^^^^^^^^^^^

If you find that some frames are dropped and the frame rate is very low when using GStreamer to open the camera, you can use the GStreamer element ``fpsdisplaysink`` to calculate the frame rate.
For more details, please refer to `fpsdisplaysink `_. Here are some examples:

- To calculate the frame rate from ``v4l2src``:

  .. prompt:: bash # auto

     # gst-launch-1.0 -v v4l2src device=/dev/video0 ! video/x-raw,width=2316,height=1746,format=UYVY ! fpsdisplaysink video-sink=fakesink sync=false

- To calculate the frame rate from ``v4l2convert``:

  .. prompt:: bash # auto

     # gst-launch-1.0 -v v4l2src device=/dev/video0 ! video/x-raw,width=2316,height=1746,format=UYVY ! v4l2convert output-io-mode=dmabuf-import ! video/x-raw,width=400,height=300 ! fpsdisplaysink video-sink=fakesink sync=false

- To calculate the frame rate from ``waylandsink`` and show it on the screen:

  .. prompt:: bash # auto

     # gst-launch-1.0 -v v4l2src device=/dev/video0 ! video/x-raw,width=2316,height=1746,format=UYVY ! v4l2convert output-io-mode=dmabuf-import ! video/x-raw,width=400,height=300 ! fpsdisplaysink video-sink=waylandsink sync=false

Some possible causes of a low frame rate:

1. ``videoconvert`` is too slow to transform images in time.

   **Solution:** Replace ``videoconvert`` with ``v4l2convert``, which uses the hardware converter (``MDP``).

2. ``waylandsink`` tries to synchronize the timestamps with the clock.

   **Solution:** Add the ``sync=false`` property to ``waylandsink``.

3. ``v4l2convert`` is slow when using the ``mmap`` memory method.

   **Solution:** Add the ``output-io-mode=dmabuf-import`` property to ``v4l2convert``.

Wrong Image Texture
^^^^^^^^^^^^^^^^^^^

If you find that the images from the pipeline are wrong (e.g. noise, strange colors), you can dump the images from each pipeline element and check whether the output is correct.
The GStreamer element ``filesink`` writes incoming data to a file in the local file system.
For more details, please refer to `filesink `_. Here are some examples:

- To dump the images from ``v4l2src``:

  .. prompt:: bash # auto

     # gst-launch-1.0 v4l2src device=/dev/video0 ! video/x-raw,width=2316,height=1746,format=UYVY ! filesink location=test.bin

- To dump the images from ``v4l2convert``:

  .. prompt:: bash # auto

     # gst-launch-1.0 v4l2src device=/dev/video0 ! video/x-raw,width=2316,height=1746,format=UYVY ! v4l2convert output-io-mode=dmabuf-import ! video/x-raw,width=400,height=300 ! filesink location=test.bin

By dumping the images, you can find out which element behaves incorrectly.

GStreamer Error
^^^^^^^^^^^^^^^

For other GStreamer pipeline issues, you can refer to `GStreamer Debugging Tools `_.
That page describes several useful debugging methods provided by the GStreamer framework.

Camera Probe Failure
^^^^^^^^^^^^^^^^^^^^

You may find that both the video and media devices are created, but setting properties with the ``media-ctl`` command fails:

.. prompt:: bash # auto

   # ls -l /sys/bus/media/devices/ | grep seninf
   lrwxrwxrwx 1 root root 0 Sep 20 10:43 media0 -> ../../../devices/platform/soc/15040000.seninf/media0
   # ls -l /sys/class/video4linux/ | grep seninf
   lrwxrwxrwx 1 root root 0 Sep 20 10:43 video0 -> ../../devices/platform/soc/15040000.seninf/video4linux/video0
   lrwxrwxrwx 1 root root 0 Sep 20 10:43 video1 -> ../../devices/platform/soc/15040000.seninf/video4linux/video1
   # media-ctl -d /dev/media0 -V "'ap1302.2-003d':2 [fmt:UYVY8_1X16/2316x1746]"
   Unable to setup formats: No such file or directory (2)

Moreover, ``media-ctl -d /dev/media0 -p`` doesn't show any format information. For example:

.. prompt:: bash # auto

   # media-ctl -d /dev/media0 -p
   Media controller API version 5.10.104

   Media device information
   ------------------------
   driver          mtk-seninf
   model           mtk-camsys-3.0
   serial
   bus info        platform:15040000.seninf
   hw revision     0x0
   driver version  5.10.104

   Device topology
   - entity 1: 15040000.seninf (8 pads, 3 links)
               type V4L2 subdev subtype Unknown flags 0
           pad0: Sink
                   <- "ap1302.2-003d":2 [ENABLED,IMMUTABLE]
           pad4: Source
                   -> "15050000.camsv":0 [ENABLED,IMMUTABLE]
           pad5: Source
                   -> "15050800.camsv":0 [ENABLED,IMMUTABLE]
   ...

This is usually caused by a misconfiguration of the hardware settings or the device tree blob overlay.
Please refer to the sections `Connect The Camera to the EVK`_ and `Select Camera Device Tree Blob Overlay`_ and make sure you have the correct settings.
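As an additional check, you can probe the I2C bus to see whether the ISP responds at all. The bus numbers below are only inferred from the entity names ``ap1302.2-003d`` and ``ap1302.3-003d`` shown earlier (I2C bus 2 and bus 3, address ``0x3d``) and may differ on your platform; ``i2cdetect`` is provided by the ``i2c-tools`` package, and an address already claimed by a driver is reported as ``UU``.

.. prompt:: bash # auto

   # i2cdetect -y 2
   # i2cdetect -y 3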