
Together, the cameras and Lidar construct a complete 3D model of the surroundings and combine it with velocity information obtained via the IMU, so that the robot simultaneously maps and localises itself within that map.

Using the map, the ANYmal can discern its location from the shapes of its surroundings, with localisation accuracy to within a few centimetres, enabling close repetition of inspection points. It also selects the best shortcuts from one waypoint to the next, including when it is outdoors, without needing GNSS. While it might be tempting to use GNSS for navigation outside, reflections of signals off metal walls and objects can make satellite signals so unreliable that there is little point in having GNSS at all.

"The 3D functionality of our Lidar and cameras is so important for that," Fankhauser says. "Often, on offshore structures such as oil rigs, you'll need to climb staircases with no walls or other points of reference around, and the ANYmal would lose itself at that point with just a 2D map and field of view.

"Our localisation stack is robust enough that if you switch a robot off, carry it somewhere else in the building and turn it on again, after a matter of seconds it can figure out where it is. No QR codes or rails are needed anywhere, and any industrial building's features will work with it."

The SLAM aspect of the robot's navigation also makes it useful for industrial environments that are changing, such as mines or tunnels. Its routine waypoints could take it down shafts to extend its pre-existing 3D model, allowing it to update its inspection routes as well as track the progress of such works.

If it encounters a previously unforeseen obstacle in its path, it can try to walk over it (if it is smaller than 30 cm), go around it (if at least 80 cm of free space appears to be available on one side), wait for the obstacle to move, or calculate a different path to its next waypoint if the obstacle stays where it is. No object classification takes place, as that is not necessary for its mission objectives.

Comms

The ANYmal carries an array of I/Os to ensure interoperability between its main network bus and its standard-issue subsystems, as well as the different kinds of sensors or other tools that end-users might want to install.

"We use cameras that generate gigabytes of data per second, and a dozen actuators that all need to deliver multiple streams of performance data with high update rates, so we need a diverse set of comms protocols to keep them all talking with each other in real time," Dr Mauerer notes.

As mentioned, EtherCAT is used for the ANYdrives. The protocol is ideal for that role, as it is a ruggedised fieldbus originally designed for short update times and precise synchronisation of distributed devices.

(Photo caption: The front and rear faces of the UGV feature 3D and wide-angle cameras, through which it perceives its surroundings and calculates where to place its next step)

(Photo caption: The ANYmal D performs SLAM to navigate and choose routes to inspection targets, with its core software running on a six-core 8th Gen i7 CPU from Intel)
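As a rough illustration of the obstacle-handling rules described earlier, the minimal Python sketch below encodes the decision sequence: step over, detour, wait, or replan. The 30 cm step-over and 80 cm side-clearance figures come from the article; everything else (the function and type names, the wait timeout, and the idea of returning an enumerated action) is a hypothetical framing, not ANYbotics' implementation.

```python
from dataclasses import dataclass
from enum import Enum, auto


class Action(Enum):
    STEP_OVER = auto()   # walk over a low obstacle
    GO_AROUND = auto()   # detour around one side of it
    WAIT = auto()        # pause and see whether the obstacle moves
    REPLAN = auto()      # compute a different path to the next waypoint


@dataclass
class Obstacle:
    height_m: float          # estimated obstacle height
    left_clearance_m: float  # free space to the left of it
    right_clearance_m: float # free space to the right of it
    seconds_static: float    # how long it has stayed in place


# Thresholds quoted in the article
MAX_STEP_OVER_HEIGHT_M = 0.30
MIN_SIDE_CLEARANCE_M = 0.80
WAIT_TIMEOUT_S = 30.0  # assumed value; the article does not give one


def choose_action(obstacle: Obstacle) -> Action:
    """Pick a response to an unforeseen obstacle on the planned route."""
    if obstacle.height_m < MAX_STEP_OVER_HEIGHT_M:
        return Action.STEP_OVER
    if max(obstacle.left_clearance_m, obstacle.right_clearance_m) >= MIN_SIDE_CLEARANCE_M:
        return Action.GO_AROUND
    if obstacle.seconds_static < WAIT_TIMEOUT_S:
        return Action.WAIT
    return Action.REPLAN


if __name__ == "__main__":
    blocked_corridor = Obstacle(height_m=0.6, left_clearance_m=0.4,
                                right_clearance_m=0.5, seconds_static=45.0)
    print(choose_action(blocked_corridor))  # Action.REPLAN
```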
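For readers unfamiliar with EtherCAT, the sketch below shows what enumerating drives on an EtherCAT fieldbus can look like from a PC-based master, using the open-source pysoem library. The library choice, the `eth0` interface name and the code itself are illustrative assumptions for a generic bench setup; they are not ANYbotics' drive firmware or tooling.

```python
import pysoem  # open-source EtherCAT master library (pip install pysoem)

IFNAME = "eth0"  # assumed network interface wired to the EtherCAT bus


def list_ethercat_slaves() -> None:
    """Scan the bus and print every slave device (e.g. servo drives) found."""
    master = pysoem.Master()
    master.open(IFNAME)
    try:
        n_slaves = master.config_init()  # enumerate slaves on the bus
        if n_slaves <= 0:
            print("No EtherCAT slaves found")
            return
        for i, slave in enumerate(master.slaves):
            print(f"slave {i}: {slave.name} (vendor 0x{slave.man:08x})")
    finally:
        master.close()


if __name__ == "__main__":
    list_ethercat_slaves()
```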
