Issue 54 | Uncrewed Systems Technology | Feb/Mar 2024
uWare uOne UUV | Radio and telemetry | Rheinmetall Canada medevacs | UUVs insight | DeltaHawk engine | IMU focus | Skygauge in operation | CES 2024 report | Blueflite | Hypersonic flight

…overall performance improvements in low-light environments.

“Like the acoustic module, we toned down from commercial solutions to something lighter-duty and more cost-effective, so we don’t need a stereo-vision module running at 300 FPS [frames per second]. It’s just unnecessary when you’re working in the slow underwater world – 10-20 FPS is plenty for the uOne,” Arteaga says.

“We then built a custom stereo camera by connecting two cameras to a computer to process and synchronise the two vision feeds. The first version performed as needed in tests, generating point clouds for localisation. Integrated into the uOne, it runs Meidi’s object-detection algorithms perfectly: it recognises fish around the robot, as well as humans, rocks, pipes and other objects.”

The two cameras in the stereo module are spaced 8 cm apart, each featuring 1080p resolution, a horizontal field of view (FoV) of 62.2° (48.8° vertical), and running at 5-10 FPS.

Smart navigation

Future work will aim to have the AI recognise further actionable details of the objects that the uOne views, enabling smarter and more precise navigation around obstacles, or closer photography of cracks in infrastructure.

Garcia explains: “Computer vision has been of interest to us since starting the company, with the initial aim of getting an AUV to follow a diver underwater. But it would’ve taken a highly customised implementation of computer-vision algorithms to detect and track a diver, and I think we were running everything on a Raspberry Pi 2 or 3 at the time.

“We started with traditional computer-vision algorithms, like tracking pipes by detecting contours and trying to recreate the shape of the pipe. Those can work, but they’re difficult to generalise, which you need to do if your AUV is going to successfully inspect all kinds of pipes everywhere.
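The figures the article quotes for the stereo module (8 cm baseline, 1080p sensors, 62.2° horizontal FoV) are enough to sketch the classic depth-from-disparity relation that a stereo camera like this relies on for point clouds. The function names and the 40-pixel disparity example below are illustrative assumptions, not uWare's code:

```python
import math

# Published uOne stereo-module figures (from the article):
# 8 cm baseline, 1080p (1920 px wide), 62.2 deg horizontal FoV.
BASELINE_M = 0.08
IMAGE_WIDTH_PX = 1920
HFOV_DEG = 62.2

def focal_length_px(width_px: int, hfov_deg: float) -> float:
    """Pinhole-model focal length in pixels, derived from horizontal FoV."""
    return (width_px / 2) / math.tan(math.radians(hfov_deg / 2))

def depth_from_disparity(disparity_px: float) -> float:
    """Classic stereo range equation: Z = f * B / d."""
    f = focal_length_px(IMAGE_WIDTH_PX, HFOV_DEG)
    return f * BASELINE_M / disparity_px

# A feature matched 40 pixels apart between the two views sits
# roughly 3.2 m from the camera pair with this geometry.
print(round(depth_from_disparity(40.0), 2))  # 3.18
```

The narrow 8 cm baseline trades long-range accuracy for compactness, which suits the short visibility ranges typical of underwater work.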
“Luckily, the longer the company existed, the more underwater imagery we could collect, both ourselves and from partner organisations, until eventually we were able to transition to machine-learning models for recognising not just pipes but other things, like different species of seagrass, which is important for some prospective customers, such as environmental groups,” Garcia adds.

Visual-inertial fusion

Combining the stereo vision with inertial and DVL navigation enables the uOne to localise itself much as a human walking through a large field would: approximating its position and avoiding collisions via visual references of its surroundings and inertial references of its trajectory through them. The visual navigation also helps to correct IMU drift, as a global navigation satellite system (GNSS) would.

“Visual-inertial fusion also enables it to hunt intelligently for visual tags, such as identification markers unique to one specific ship parked among many others, or physical changes in a pipe’s support structures, compared with what the uOne viewed in or around each support structure the last time it inspected them,” Garcia notes.

“If it views those sorts of changes, it can stop and take hundreds or even thousands of close-up pictures of them at varying angles to produce a better 3D reconstruction of that particular spot. That could be a crack in a pipe support that has grown over time, or a piece of coral in a reef that has bleached or been damaged.
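The drift-correction idea described above can be sketched in one dimension: a dead-reckoned position from inertial data accumulates error, and an occasional visual fix (say, a recognised tag at a known location) pulls the estimate back toward truth. The blend below is a generic complementary-filter sketch with an assumed gain, not uWare's actual fusion algorithm:

```python
def fuse(inertial_estimate: float, visual_fix: float, gain: float = 0.3) -> float:
    """Complementary-filter-style blend: nudge the drifted inertial
    position estimate toward an absolute visual fix."""
    return inertial_estimate + gain * (visual_fix - inertial_estimate)

# Simulated scenario: true position 10.0 m, the IMU dead-reckoning has
# drifted to 11.5 m, and a visual landmark observation reports 10.1 m.
true_pos = 10.0
imu_pos = 11.5
visual_pos = 10.1
fused = fuse(imu_pos, visual_pos)
print(fused)  # 11.08 -- closer to the truth than the drifted IMU value
```

Repeating this update every time a landmark is sighted bounds the drift, which is the role GNSS plays for surface vehicles but cannot play underwater.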
“Those are the sorts of things that, historically, only human divers could be trusted to do intelligently, so over time, the way operators plan missions with our robot will become like interacting with an increasingly experienced diver: giving it gradually fewer, higher-level commands and letting it make its own decisions.”

Caption: Future work will enable the uOne to recognise more actionable details of objects, allowing smarter navigation around them or closer photography of details valuable to end-users.

Caption: In September 2023, the uOne’s seagrass-mapping abilities were demonstrated in the south of France with Creocean and Seaviews, mapping around 5000 m² in two days.

Powerful thrust

In the eight-vectored-thrust configuration, all eight thrusters tend to be active across most of the uOne’s movements, which contributes to its ability to move in
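A vectored-thrust layout like the one described above is typically driven by a mixer that maps body-frame motion commands onto individual thruster outputs. The sketch below assumes a common generic arrangement (four horizontal thrusters angled at 45° for surge, sway and yaw; four vertical ones, not shown, for heave, roll and pitch); the geometry and sign conventions are illustrative, not uWare's allocation:

```python
def mix_horizontal(surge: float, sway: float, yaw: float) -> list:
    """Map normalised body-frame commands onto four 45-degree-vectored
    horizontal thrusters (front-left, front-right, rear-left, rear-right)."""
    fl = surge + sway + yaw
    fr = surge - sway - yaw
    rl = surge - sway + yaw
    rr = surge + sway - yaw
    # Clamp to the +/-1.0 range a speed controller would expect.
    return [max(-1.0, min(1.0, t)) for t in (fl, fr, rl, rr)]

# Pure forward motion drives all four thrusters equally; pure yaw drives
# them in opposing left/right pairs. Any combined manoeuvre engages every
# thruster at once, which is why most moves keep all thrusters active.
print(mix_horizontal(0.5, 0.0, 0.0))  # [0.5, 0.5, 0.5, 0.5]
print(mix_horizontal(0.0, 0.0, 0.3))  # [0.3, -0.3, 0.3, -0.3]
```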
