Issue 54 | Uncrewed Systems Technology | Feb/Mar 2024
uWare uOne UUV | Radio and telemetry | Rheinmetall Canada medevacs | UUVs insight | DeltaHawk engine | IMU focus | Skygauge in operation | CES 2024 report | Blueflite | Hypersonic flight

(LBL), as well as sonar-based simultaneous localisation and mapping (SLAM), in favour of a stereo camera-based point cloud-generation approach, which will eventually be used as a visual-inertial form of SLAM.

"To make a system that can work anywhere without any specific infrastructure, we couldn't rely on USBL or LBL, because then you have to deploy the beacons for acoustically triangulating the AUV's position, retrieve them afterwards, and somehow avoid all the kinds of multipathing issues that happen in ports and affect your mission-critical positioning accuracy," Arteaga says.

"Meidi and I do have experience in SLAM algorithms, and those can be installed in and used via our Jetson Orin SoC, although camera-based underwater SLAM is unusual compared with sonar-based subsea SLAM. But, as mentioned, using cameras over sonar was eventually a no-brainer for us to detect things like cracks in pipes at shallow depths."

While the use of cameras may raise questions over how the uOne handles poor visibility, the inertial navigation typically takes over to ensure reliability when visual input is limited. The robot can also recognise when conditions prevent visual data acquisition at the quality standards the industry expects. In such instances, it sends a message to the user that the water quality is not suitable and does not proceed to the data-acquisition waypoints. Additionally, the robot's modularity allows for future enhancements, and uWare is considering the integration of sonar systems to improve performance in low-visibility environments.

Stereo vision

Initially, Arteaga and Garcia developed a SLAM algorithm for use with a single camera, which benefits from simplicity but lacks the ability to gauge the scale of objects being viewed without complex additional motion-reference subroutines, particularly when underwater refraction can make objects appear larger than they really are.
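The scale problem with a single camera can be shown in a few lines. The sketch below (toy numbers and a simple pinhole model, not uWare's code) demonstrates that if the whole scene and the camera's motion are scaled by the same factor, every projected pixel is unchanged, so a lone camera has no way to recover absolute scale:

```python
import numpy as np

def project(points_cam: np.ndarray, f: float = 800.0) -> np.ndarray:
    """Pinhole projection of N x 3 camera-frame points to pixel coords."""
    return f * points_cam[:, :2] / points_cam[:, 2:3]

# Toy scene (metres, camera frame) and a camera translation between frames.
scene = np.array([[0.5, 0.2, 3.0], [-0.4, 0.1, 4.0], [0.0, -0.3, 5.0]])
t = np.array([0.1, 0.0, 0.0])

# Scale the whole world AND the motion by any factor s: the images are
# pixel-identical, so a lone camera cannot tell a small, near scene from
# a large, far one -- the monocular scale ambiguity.
for s in (1.0, 2.0, 10.0):
    assert np.allclose(project(scene), project(scene * s))
    assert np.allclose(project(scene - t), project(scene * s - t * s))
```

This is why mono SLAM needs an extra metric reference (here, the IMU-DVL) before its map can be used for real-world distances.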
Combining this mono-camera SLAM with the IMU-DVL provided good results in tests, but it reached a limit of usefulness when trying to localise in confined areas (and the ability to survey with extreme precision inside underwater compartments was one of uWare's key aims).

Instead, the company developed its stereo vision module, through which a navigable point cloud is created in real time. This point cloud is currently used for obstacle detection, and to avoid collisions by adapting the AUV's trajectory in real time. In future, the point cloud will be used to generate a map of the AUV's surroundings, which the vehicle will then use to locate itself by merging that information with the IMU-DVL sensor data. At that point, the system will fit the company's envisioned visual-inertial SLAM model.

After researching commercially available stereo vision systems, uWare purchased a module for testing purposes. Having validated the suitability of the approach using that commercial module, the company decided to implement its own stereo vision module to meet its specific needs: compatibility with the rest of the system, increased depth-estimation accuracy and

(Photo caption: The uOne primarily uses stereo cameras for visual navigation, although its IMU and DVL cover for them in poor-visibility conditions.)
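The pipeline from a stereo pair to a collision-avoidance range can be sketched briefly. The snippet below is an illustrative numpy version, not uWare's implementation; the focal length and camera baseline are hypothetical placeholder values, and a real system would compute the disparity map with a stereo-matching library first:

```python
import numpy as np

# Hypothetical camera parameters for illustration only; the uOne
# module's real focal length and baseline are not published.
FOCAL_PX = 800.0    # focal length, pixels
BASELINE_M = 0.12   # separation between the two cameras, metres

def disparity_to_points(disparity: np.ndarray,
                        cx: float, cy: float) -> np.ndarray:
    """Back-project a disparity map into an N x 3 point cloud in the
    camera frame. Pixels with no valid disparity are dropped."""
    v, u = np.indices(disparity.shape)
    valid = disparity > 0
    z = FOCAL_PX * BASELINE_M / disparity[valid]   # depth from disparity
    x = (u[valid] - cx) * z / FOCAL_PX
    y = (v[valid] - cy) * z / FOCAL_PX
    return np.column_stack((x, y, z))

def nearest_obstacle(points: np.ndarray, max_range_m: float = 5.0) -> float:
    """Range to the closest point within sensing range, which a planner
    could use to adapt the trajectory and avoid a collision."""
    ranges = np.linalg.norm(points, axis=1)
    ranges = ranges[ranges <= max_range_m]
    return float(ranges.min()) if ranges.size else float("inf")
```

For example, with these placeholder parameters a uniform disparity of 48 pixels corresponds to a surface about 2 m ahead, which `nearest_obstacle` would report to the trajectory planner.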