
Nick Flaherty examines the developments driving the emergence of more powerful and better quality video systems for uncrewed vehicles

The technology of vision systems in the uncrewed sector is facing several competing demands. In some cases, the need is for low-latency, responsive video streams from multiple cameras to allow a remote operator to drive the platform safely, whether in the middle of the ocean or on the surface of Mars. In others, machine learning (ML) inference has to identify elements within a video frame, either to send to an operator or to feed the decision-making algorithms on the platform. Yet other applications require ultra-high-resolution video feeds from ever-smaller gimbals, demanding the latest video encoding technologies.

Maturing silicon technology is enabling more compact video processing systems, with the latest encoders integrated into hardware to reduce size, weight and power. A new generation of graphics processing unit (GPU) is also enabling more AI processing with existing ML frameworks for handling the video, while new architectures for ML, both digital and analogue, are reducing the power consumption of chips to the point where it is practical to run large frameworks in the air.

UAV video systems are seeing demand for up to eight standard definition (SD) image sensors, or five if there is a mix of high definition (HD) and SD. For example, one HD sensor could be used to zoom into images, alongside multiple night vision feeds. These can be implemented on a 50 x 25 mm motherboard for small gimbals, with two daughterboards that can be swapped in and out for different types of camera. This provides either two 1080p60 HD encoding streams or one HD encoding channel with four SD channels, with outputs over composite video, HDMI and low-voltage differential signalling (LVDS).

The processor uses four Cortex-A53 microprocessor cores running at 1.5 GHz with 1 Mbyte of L2 cache. That allows the chip to run operating systems such as Linux with the video codecs needed for 4Kp60 or 1080p60 H.265 decode, as well as older codecs such as VP9, VP8 and H.264. There is also a smaller 266 MHz Cortex-M4 core for more deterministic real-time operation.

The CSI-2 format from the MIPI Alliance is becoming more popular as the camera interface (see sidebar: Interface standards). This is a variant of a serialiser/deserialiser (SerDes) interface that takes a parallel interface and combines the signals onto up to four high-speed serial lanes. That makes the serial channels simpler to route, for example across a gimbal interface.

Cameras with MIPI CSI-2 can dispense with the other interfaces, reducing the size of the video board, particularly in UAVs, and allowing smaller gimbal designs. It also avoids issues with the LVDS interface, which has multiple variants, meaning boards need to be tweaked for individual cameras; that is not an issue when using CSI-2.

Remote operation

For remote control of an uncrewed system, low latency is vital. The operator needs to see the video with as small a delay as possible to ensure that control is accurate.

The latency in the video system consists of several elements: the time taken to encode the video, the time taken to send it over the network – which could be highly variable if the internet is used – and the decoding time in the video player the operator uses. These are often tackled separately, although system developers also look at the entire signal chain to minimise the latency, using forward error correction to help maintain a steady flow of data.
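As a rough illustration of that signal chain, the sketch below launches a minimal capture-encode-stream pipeline with GStreamer's gst-launch-1.0 tool. It is a sketch only: the device path, ground-station address, port, software H.264 encoder and tuning values are all assumptions for a generic Linux camera, not details from the article.

```python
# Minimal low-latency video streaming sketch using GStreamer's gst-launch-1.0 CLI.
# Assumptions: a V4L2 camera at /dev/video0, software H.264 encoding (x264enc)
# and an operator station listening for RTP on UDP port 5600. On a real platform
# the chip's hardware encoder would normally replace x264enc.
import shlex
import subprocess

OPERATOR_IP = "192.168.1.10"   # hypothetical ground-station address
PORT = 5600

pipeline = (
    "gst-launch-1.0 -e "
    "v4l2src device=/dev/video0 ! "
    "video/x-raw,width=1920,height=1080,framerate=60/1 ! "
    "videoconvert ! "
    # zerolatency tuning trades compression efficiency for low encode delay
    "x264enc tune=zerolatency speed-preset=ultrafast bitrate=4000 key-int-max=60 ! "
    "rtph264pay config-interval=1 pt=96 ! "
    f"udpsink host={OPERATOR_IP} port={PORT} sync=false"
)

# Run the pipeline; Ctrl-C stops it, and -e sends EOS so the stream closes cleanly.
subprocess.run(shlex.split(pipeline), check=True)
```

Forward error correction and a hardware H.265 encoder would be layered on top of this basic chain in a flight system; the point of the sketch is simply to show where the encode, packetise and network stages sit.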
The forward error correction consumes only a little bandwidth, adding redundant information that can be used to repair problems in the stream.

The capture typically takes 15 ms for a 60 Hz signal from the image processor, with encoding taking 5-10 ms. That can vary depending on the complexity of the images, which does not help with providing a deterministic data flow.

The Hantro codec, which is present on a number of popular chips, is a key piece of the signal chain. It is a stateless accelerator that does not need firmware to operate, making it more robust.

A robot system with low-latency remote video being tested on Mt Etna for space missions (Courtesy of Skypersonic)
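To put the capture and encoding figures above into context, the short sketch below adds up a hypothetical glass-to-glass latency budget. Only the capture and encode values come from the article; the network, decode and display numbers are illustrative assumptions that vary widely in practice.

```python
# Rough glass-to-glass latency budget for a 60 Hz remote-operation video link.
# Capture and encode figures are from the article; the remaining values are
# illustrative assumptions, not measurements.
budget_ms = {
    "capture (60 Hz image processor)": 15.0,   # per the article
    "encode (hardware codec)": 10.0,           # article quotes 5-10 ms; worst case
    "network (radio link, assumed)": 20.0,     # highly variable, e.g. over the internet
    "decode (operator player, assumed)": 8.0,
    "display refresh (60 Hz, assumed)": 16.7,
}

total = sum(budget_ms.values())
for stage, ms in budget_ms.items():
    print(f"{stage:36s} {ms:6.1f} ms")
print(f"{'total (worst case)':36s} {total:6.1f} ms")
```

Even with optimistic assumptions the encode stage is only one part of the total, which is why developers look at the whole chain rather than the codec alone.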
