
…has moved a long way from its starting point, but the accelerometer will always sense gravity when the UAV is stationary, and you can use that information to correct the drift.” (A sketch of this gravity-based drift correction appears at the end of this article.)

Event camera potential

Dr Scaramuzza and his team continue to develop and refine autonomous visual navigation for UAVs, with their current work concentrating on exploiting standard as well as developmental ‘event’ cameras.

“With a standard camera you get frames at constant time intervals,” he says. “By contrast, an event camera does not output frames but has smart pixels. Every pixel monitors the environment individually and only outputs information whenever it detects motion.”

Because they don’t generate a signal unless they detect change, event cameras are much more economical with bandwidth and computing power than standard frame cameras. And because they don’t accumulate photons over time, they are immune to motion blur, he explains. (A sketch of how such an event stream can be processed also appears at the end of this article.)

Eliminating motion blur will enable small UAVs to use vision-based navigation at much higher speeds, making them more robust to failures in the propulsion system, for example. It can even allow them to dodge objects thrown at them, he says. These last two benefits have been demonstrated experimentally by Dr Scaramuzza’s team.

In the race against the champions, the Swift quadcopter used a conventional frame camera (an Intel RealSense D450) and an IMU, with a neural network running on an Nvidia Jetson TX2 GPU-based AI computer processing their inputs in real time and issuing the manoeuvre commands. The race took place on a course consisting of a series of gates through which the UAVs had to fly in the correct order.

Training the AI autopilot

In the Swift, the neural network takes the place of the perception, planning and control architecture that has dominated robotics for 40 years. In most UAVs that use vision-based navigation, the perception module runs the V-SLAM software.

“The problem is that it is a bit too slow, and it is very fragile to imperfect perception, especially when you have motion blur, high dynamic range and unmodelled turbulence, which you get at high speeds,” he says. “So we took a neural network and tried to train it to do the same job as perception, planning and control. The challenge there is that you need to train a neural network.”

Doing that in the real world with a team of expert pilots was deemed impractical because of the downtime for battery changes and crash repairs. So, using a reinforcement learning algorithm, the Swift was trained in virtual environments developed by the games industry.

“Reinforcement learning works by trial and error,” he says. “We simulated 100 UAVs in parallel that are trying to learn to fly through the gates as fast as possible so that they reach the finish line in the minimum time. Training took hundreds of thousands of iterations and converged in 50 minutes. A year has passed since the race, and we can now do it in 10 minutes.” (Sketches of such an end-to-end policy and of a parallel trial-and-error loop follow at the end of this article.)

Although the AI proved much faster than the human champions, Dr Scaramuzza does not yet claim it is better than human pilots, as there is still a lot to learn. For example, the AI cannot…

[Photo caption: Cautious navigation through a gap in a damaged concrete wall, a manoeuvre typical of many urban search & rescue tasks after disasters]
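The gravity-based drift correction Dr Scaramuzza mentions at the top of this article is the idea behind the classic complementary filter for attitude estimation. The following is a minimal sketch rather than the team's implementation; the function name, the blend gain and the axis conventions are all illustrative assumptions.

```python
import numpy as np

def complementary_filter(pitch, roll, gyro, accel, dt, alpha=0.98):
    """Fuse gyro integration (fast but drifting) with the accelerometer's
    gravity reading (noisy but drift-free) to estimate pitch and roll.

    pitch, roll : previous estimates, radians
    gyro        : body angular rates (p, q, r), rad/s
    accel       : specific force (ax, ay, az), m/s^2
    dt          : time step, seconds
    alpha       : blend factor; closer to 1 trusts the gyro more
    """
    # Propagate attitude with the gyro: accurate short-term, drifts long-term.
    pitch_gyro = pitch + gyro[1] * dt
    roll_gyro = roll + gyro[0] * dt

    # When the vehicle is near-stationary the accelerometer measures gravity,
    # giving an absolute attitude reference that never drifts.
    ax, ay, az = accel
    pitch_acc = np.arctan2(-ax, np.hypot(ay, az))
    roll_acc = np.arctan2(ay, az)

    # Blend the two: gravity slowly pulls the integrated estimate back,
    # correcting the drift exactly as described in the quote above.
    pitch = alpha * pitch_gyro + (1 - alpha) * pitch_acc
    roll = alpha * roll_gyro + (1 - alpha) * roll_acc
    return pitch, roll
```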
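An event camera emits an asynchronous stream of per-pixel events rather than frames, which is why a static scene costs almost no bandwidth. One common way to interface such a stream with conventional algorithms is to accumulate events into a frame-like grid over a short window. A minimal sketch, assuming events arrive as (x, y, timestamp, polarity) tuples; the resolution and window length are made up for illustration.

```python
import numpy as np

def events_to_frame(events, width, height, t_start, window):
    """Accumulate an asynchronous event stream into a signed event-count
    image over a short time window.

    events : iterable of (x, y, t, polarity) tuples, polarity in {+1, -1};
             each event means one pixel saw a brightness change at time t.
    """
    frame = np.zeros((height, width), dtype=np.int32)
    for x, y, t, polarity in events:
        if t_start <= t < t_start + window:
            frame[y, x] += polarity  # only pixels that saw motion contribute
    return frame

# A static scene produces (almost) no events, hence the bandwidth economy:
# here only three pixels fired during the 5 ms window.
stream = [(10, 5, 0.001, +1), (10, 6, 0.0012, -1), (11, 5, 0.003, +1)]
img = events_to_frame(stream, width=640, height=480, t_start=0.0, window=0.005)
```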
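The article describes the Swift's network as consuming camera and IMU data and emitting manoeuvre commands directly, in place of separate perception, planning and control modules. The interview does not give the architecture, so the PyTorch sketch below is purely illustrative of the end-to-end idea: the layer sizes, the 64-dimensional visual feature input and the thrust-plus-body-rates output convention are all assumptions.

```python
import torch
import torch.nn as nn

class RacingPolicy(nn.Module):
    """Toy end-to-end policy: a single network standing in for the
    separate perception, planning and control modules."""

    def __init__(self, vision_dim=64, imu_dim=6, hidden=128):
        super().__init__()
        self.net = nn.Sequential(
            nn.Linear(vision_dim + imu_dim, hidden),
            nn.ReLU(),
            nn.Linear(hidden, hidden),
            nn.ReLU(),
            # Four outputs: collective thrust plus body rates (p, q, r),
            # a common low-level command set for racing quadcopters.
            nn.Linear(hidden, 4),
        )

    def forward(self, vision_features, imu):
        # Camera-derived features and the raw IMU reading go in together;
        # a manoeuvre command comes straight out.
        return self.net(torch.cat([vision_features, imu], dim=-1))

policy = RacingPolicy()
command = policy(torch.randn(1, 64), torch.randn(1, 6))  # one control step
```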
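The training recipe he outlines, with 100 simulated UAVs learning by trial and error to reach the finish line in minimum time, is massively parallel reinforcement learning. The toy below is runnable but deliberately simplistic: a one-dimensional "close the distance to the gate" task with a bare REINFORCE update standing in for whatever algorithm the team actually used, which the interview does not name.

```python
import torch
import torch.nn as nn

NUM_ENVS = 100  # "We simulated 100 UAVs in parallel"

# Toy stand-in task: each "UAV" must close its distance to the next gate,
# and its reward on every step is the progress made toward that gate.
policy = nn.Sequential(nn.Linear(1, 32), nn.Tanh(), nn.Linear(32, 1))
log_std = torch.zeros(1, requires_grad=True)   # exploration noise level
optimiser = torch.optim.Adam(list(policy.parameters()) + [log_std], lr=1e-2)

dist_to_gate = torch.rand(NUM_ENVS, 1) * 10.0  # one distance per UAV

for iteration in range(1000):                  # real training: far longer
    mean = policy(dist_to_gate)                # commanded step, per UAV
    action_dist = torch.distributions.Normal(mean, log_std.exp())
    step = action_dist.sample()                # trial: noisy exploration
    new_dist = (dist_to_gate - step).clamp(min=0.0)
    reward = (dist_to_gate - new_dist).squeeze(-1)  # progress made
    # Error: reinforce each action in proportion to the reward it earned
    # (a bare policy-gradient update; real systems use PPO or similar).
    loss = -(action_dist.log_prob(step).squeeze(-1) * reward).mean()
    optimiser.zero_grad()
    loss.backward()
    optimiser.step()
    dist_to_gate = new_dist.detach()
```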
