Uncrewed Systems Technology 052 l Keybotic Keyper l Video encoding l Dufour Aero2 l Subsea SeaCAT l Space vehicles l CUAV 2023 report l SkyPower SP engine l Cable harnesses l Paris Air Show 2023 report l Nauticus Aquanaut

in the RoboCup soccer tournament for robots. He recalls being surprised at the use of a camera on the ceiling of the ‘stadium’ to localise the robot footballers, which he regarded as cheating. There was also a small number of robots in their own league, each of which had an onboard camera and a mirror that gave it a panoramic view, an arrangement he looked on with similar disapproval.

“Humans don’t play like that,” he says. “What would happen if you had a camera that could only look in the direction the robot was moving? That got me interested in trying to mimic human or animal behaviour, which in turn interested me in working on vision-based navigation.”

He regards his PhD adviser, Prof Roland Siegwart, as a mentor, along with Prof Kostas Daniilidis and Prof Vijay Kumar, who were his advisers during his postdoctoral research work. One piece of advice from Prof Siegwart that has shaped his approach to his work was to start with a problem to solve – a problem that is important to industry and society.

“The research questions will come from trying to solve a difficult problem,” he says. “This is different from researchers who start instead with a theoretical problem. They are both valid, but I like the pragmatic approach more.”

V-SLAM pioneer

One such problem was enabling UAVs to fly autonomously without GPS, using sensors that are small, light and frugal enough with energy for vehicles with small battery capacities. In 2009, the year after completing his PhD, Dr Scaramuzza worked with Prof Siegwart and a team of PhD students at ETH Zurich on the first demonstration of a small UAV able to fly by itself using an onboard camera, an inertial measurement unit and an early visual simultaneous localisation and mapping (V-SLAM) algorithm. The UAV flew for just 10 m in a straight line.
It was tethered to a fishing rod to comply with safety rules and was linked by a 20 m USB cable to a laptop running the V-SLAM, but it impressed the EU enough for it to fund what became the sFLY project, with Dr Scaramuzza as scientific coordinator. Running from 2009 to 2011, sFLY developed the algorithm further to enable multi-UAV operations.

“The goal was to have a swarm of mini-UAVs that would fly themselves to explore an unknown environment, for example in mapping a building after an earthquake for search & rescue.”

V-SLAM is now an established technique for UAV navigation in GPS-denied areas – NASA’s Ingenuity helicopter used it to explore Mars, for example – and the sFLY project was its first practical demonstration, he says.

The challenges centred on the fact that V-SLAM requires the algorithm to build a 3D map using images from the camera while working out the UAV’s position within it. “These images are very rich in information, with a lot of pixels, and you cannot process all the pixels on board,” Dr Scaramuzza says. “So we worked on a means of extracting only salient features from the images – specific points of interest. We would then triangulate those points to build a 3D map of the environment as the UAV moved through it.”

Meeting the two high-level requirements of a SLAM task – building the map and working out the UAV’s position within it – is what he calls a chicken-and-egg problem. “You cannot localise if you haven’t built up a map in time, so you have to do both at the same time.”

In V-SLAM, the overall task is divided into four component tasks – visual feature extraction and tracking, triangulation-based mapping, motion estimation and then map optimisation. The purpose of map optimisation is to correct the errors that build up while the UAV is navigating, he says.
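The triangulation step he describes – recovering a landmark’s 3D position from its pixel coordinates in two views – can be sketched in a few lines. This is an illustrative linear (DLT) triangulation under idealised assumptions (known camera poses, unit intrinsics, noise-free observations), not the sFLY implementation:

```python
import numpy as np

def triangulate_point(P1, P2, x1, x2):
    """Linear (DLT) triangulation of one landmark seen in two views.

    P1, P2: 3x4 camera projection matrices.
    x1, x2: normalised pixel coordinates (u, v) of the landmark.
    Returns the landmark's 3D position in world coordinates.
    """
    # Each observation gives two linear constraints on the homogeneous point X
    A = np.array([
        x1[0] * P1[2] - P1[0],
        x1[1] * P1[2] - P1[1],
        x2[0] * P2[2] - P2[0],
        x2[1] * P2[2] - P2[1],
    ])
    # The solution is the null vector of A, found via SVD
    _, _, Vt = np.linalg.svd(A)
    X = Vt[-1]
    return X[:3] / X[3]  # homogeneous -> Euclidean

# Two cameras with identity intrinsics; the second is shifted 1 m along x
P1 = np.hstack([np.eye(3), np.zeros((3, 1))])
P2 = np.hstack([np.eye(3), np.array([[-1.0], [0.0], [0.0]])])

X_true = np.array([0.5, 0.2, 4.0])               # a landmark 4 m ahead
x1 = X_true[:2] / X_true[2]                      # projection in camera 1
x2 = (X_true[:2] + np.array([-1.0, 0.0])) / X_true[2]  # projection in camera 2
X_est = triangulate_point(P1, P2, x1, x2)
```

As each new camera pose is estimated, newly tracked features are triangulated this way and appended to the map, which is why drift in the pose estimate feeds into the map and later has to be corrected by map optimisation.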
“We are using information from the cameras and the inertial sensors – a gyroscope and an accelerometer, which provide angular velocities in radians per second and accelerations in metres per second squared,” he explains. “This is very useful information, because a map built by cameras alone can drift over time, especially when the UAV

[Caption: The Swift finds its way through woodland using a camera, inertial sensors and a neural network-based AI autopilot trained in simulation]
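As an illustration of how those inertial measurements are used, the sketch below dead-reckons orientation, velocity and position between camera frames by integrating gyroscope (rad/s) and accelerometer (m/s²) readings. It is a hypothetical minimal propagation step with invented function names, not the actual estimator, and ignores sensor biases and noise – exactly the effects that fusing with camera data helps correct:

```python
import numpy as np

def skew(w):
    """3x3 skew-symmetric matrix such that skew(w) @ v == cross(w, v)."""
    return np.array([[0.0, -w[2], w[1]],
                     [w[2], 0.0, -w[0]],
                     [-w[1], w[0], 0.0]])

def exp_so3(w):
    """Rodrigues formula: rotation matrix for rotation vector w."""
    theta = np.linalg.norm(w)
    if theta < 1e-9:
        return np.eye(3) + skew(w)
    K = skew(w / theta)
    return np.eye(3) + np.sin(theta) * K + (1.0 - np.cos(theta)) * (K @ K)

def propagate(R, v, p, gyro, accel, dt, g=np.array([0.0, 0.0, -9.81])):
    """One Euler step of IMU dead reckoning.

    R: body-to-world rotation, v: world velocity (m/s), p: position (m).
    gyro: angular velocity in body frame (rad/s).
    accel: specific force measured in body frame (m/s^2).
    """
    R_new = R @ exp_so3(gyro * dt)            # integrate angular velocity
    a_world = R @ accel + g                   # remove gravity in world frame
    v_new = v + a_world * dt                  # integrate acceleration
    p_new = p + v * dt + 0.5 * a_world * dt**2
    return R_new, v_new, p_new

# A stationary, level vehicle: the accelerometer reads only gravity's reaction
R, v, p = np.eye(3), np.zeros(3), np.zeros(3)
R, v, p = propagate(R, v, p, np.zeros(3), np.array([0.0, 0.0, 9.81]), 0.01)
```

Because small gyro and accelerometer errors accumulate at every step, pure inertial dead reckoning also drifts; the strength of visual-inertial fusion is that each sensor constrains the other’s error growth.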
