
Micro-UAV visual navigation

Navigation and collision avoidance inside buildings and other GNSS-denied environments are hard for micro-UAVs, which is why a Chinese patent awarded in late September last year for a UAV complete with sensor suite and navigation system is intriguing – particularly as its inventors expect the quadcopter to weigh less than 250 g in total, and potentially under 50 g (writes Peter Donaldson).

Established GNSS-denied navigation techniques such as simultaneous localisation and mapping (SLAM) rely on sensors that are too bulky and heavy for the smallest UAVs, and run complex software that needs a lot of computing power, adding more weight. The dominant techniques combine visual inertial odometry (VIO) or laser visual inertial odometry (LVIO) with SLAM.

VIO estimates the UAV's movement in both translation and rotation with respect to a frame of reference by interpreting a sequence of images from monocular, binocular or RGB-D cameras. The latter are digital cameras that augment their images with information on the distance of individual pixels from the sensor. In LVIO, additional distance information is provided by lidar. VIO/SLAM software relies on computationally expensive Kalman filter and/or particle filter algorithms.

The Chinese design features a vision guidance computer that processes video streams from a pair of Omnivision camera modules, one a forward-looking OV2640 and the other a downward-looking OV5640. The computer itself is described as a "Haisi 3559 chip" in the patent, which probably means a HiSilicon Hi3559A system-on-chip for high-speed video processing.

The system also relies on a scene memory module based on a 64 Gbyte TransFlash card that stores images of the environment, and a target template to compare with live camera imagery. It features a comms module as well, for image and data comms with the ground control system via wi-fi or Bluetooth.
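To give a flavour of the predict/update cycle those Kalman filters run – at far higher dimension in a real VIO/SLAM pipeline – here is a minimal one-dimensional sketch. Everything in it (the noise variances, the simulated flight, the function names) is an illustrative assumption, not drawn from the patent.

```python
import random

def kalman_step(x, p, u, z, q=0.05, r=0.4):
    """One predict/update cycle of a 1-D Kalman filter.
    x, p : current position estimate and its variance
    u    : IMU-integrated displacement since the last step
    z    : position fix derived from camera imagery
    q, r : process and measurement noise variances (assumed values)
    """
    # Predict: dead-reckon with the inertial input, inflate uncertainty.
    x_pred = x + u
    p_pred = p + q
    # Update: blend in the visual fix, weighted by the Kalman gain.
    k = p_pred / (p_pred + r)
    x_new = x_pred + k * (z - x_pred)
    p_new = (1.0 - k) * p_pred
    return x_new, p_new

# Simulated corridor flight: constant 0.1 m steps, drifting IMU
# increments and noisy camera-derived position fixes.
random.seed(0)
x, p = 0.0, 1.0
true_pos = 0.0
for _ in range(50):
    true_pos += 0.1
    u = 0.1 + random.gauss(0, 0.05)      # noisy inertial increment
    z = true_pos + random.gauss(0, 0.4)  # noisy visual fix
    x, p = kalman_step(x, p, u, z)
print(x, p)
```

Even this toy version shows why the technique is attractive: the variance p settles well below that of either sensor alone, because the prediction and the measurement each correct the other's weaknesses.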
The vision guidance computer's output is a set of instructions for the navigation and flight control system (FCS) that commands the UAV's attitude, speed, position and direction. The FCS has its own set of sensors, including obstacle detectors to cover sectors ahead, above, below and to the vehicle's left and right. Forward-looking sensor options include ultrasonic or infrared devices, or a VL53L1X laser-ranging proximity sensor from STMicroelectronics. The other sectors could be covered by more of the latter, with the forward- and downward-looking sensors being provided with supplementary illumination by a 5050 LED chip.

There is also a Bosch Sensortec BMI088 inertial sensor that measures the UAV's orientation and detects motion along three orthogonal axes, as well as a Pimoroni PMW3901 optical flow chip. Outputs from these sensors are processed by a flight control computer with integrated task memory, constructed from STMicroelectronics' STM32F722 microcontrollers, which are based on the ARM Cortex-M7 RISC core.

The enclosures for the vision guidance and flight control computers are envisaged as forming the main structural members of the quadcopter's airframe, to which the sensors and the outriggers that support the electric motors and propellers are attached.

The patent, CN211554750U, could herald a generation of micro-UAVs that can find their way safely around in environments that satellite navigation signals can't reach, perhaps helping to save lives in the process.

December/January 2021 | Unmanned Systems Technology
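How might five single-point range readings – one per sector, as a VL53L1X-class time-of-flight sensor could supply – feed a simple avoidance decision? The sketch below is our own illustration of the sector idea, not logic taken from the patent; the sector names, the 0.5 m clearance threshold and the command strings are all assumptions.

```python
SAFE_DISTANCE_M = 0.5   # assumed minimum clearance

def avoidance_command(ranges_m):
    """Map five sector ranges (metres) to a simple steering command.
    ranges_m: dict with keys 'ahead', 'above', 'below', 'left', 'right'
    """
    blocked = {s for s, d in ranges_m.items() if d < SAFE_DISTANCE_M}
    if not blocked:
        return "continue"
    if "ahead" in blocked:
        # Prefer a sideways dodge into the clearer lateral sector.
        side = "left" if ranges_m["left"] >= ranges_m["right"] else "right"
        if side not in blocked:
            return f"move_{side}"
        # Both sides blocked: climb or descend if possible, else stop.
        if "above" not in blocked:
            return "climb"
        if "below" not in blocked:
            return "descend"
        return "hold_position"
    # Path ahead is clear: nudge away from whichever sector is tightest.
    nearest = min(blocked, key=ranges_m.get)
    opposite = {"left": "right", "right": "left",
                "above": "below", "below": "above"}[nearest]
    return f"move_{opposite}"

cmd = avoidance_command(
    {"ahead": 0.3, "above": 2.0, "below": 1.2, "left": 1.5, "right": 0.4})
```

With an obstruction ahead and another close on the right, this returns a dodge to the clearer left sector; a real FCS would of course fold such a decision into its attitude and speed commands rather than act on it directly.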
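The PMW3901 reports motion as accumulated flow counts, which only become a velocity over ground once scaled by height – typically from a downward rangefinder. The conversion below is a hedged sketch: the counts-per-radian scale factor and sample values are assumptions for illustration, and a real implementation would also subtract the rotation-induced flow measured by the gyro.

```python
COUNTS_PER_RADIAN = 500.0   # assumed sensor scale, not a datasheet figure

def ground_velocity(dx_counts, dy_counts, height_m, dt_s):
    """Estimate velocity over ground (m/s) from one optical-flow sample.
    dx_counts, dy_counts : accumulated flow counts since the last read
    height_m             : height above ground (e.g. from a downward rangefinder)
    dt_s                 : sample interval in seconds
    """
    # Flow counts -> angular rate (rad/s) -> linear speed at this height.
    vx = (dx_counts / COUNTS_PER_RADIAN) / dt_s * height_m
    vy = (dy_counts / COUNTS_PER_RADIAN) / dt_s * height_m
    return vx, vy

vx, vy = ground_velocity(dx_counts=25, dy_counts=-10, height_m=1.5, dt_s=0.1)
```

The height dependence is the key point: the same flow reading at 3 m altitude implies twice the ground speed it does at 1.5 m, which is why the downward-looking rangefinder and the flow chip work as a pair.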

RkJQdWJsaXNoZXIy MjI2Mzk4