Unmanned Systems Technology 036

tools. Also, our battery management is designed to distribute and save power as optimally as possible,” Tschudi notes. “Again, industrial inspections require complex flight patterns through pipes and machines. It’s a hassle for UAVs to make multiple trips to pick up where they left off, so a longer endurance reduces that hassle and the associated risk of collision along the way.”

Survey systems

The EO/IR gimbal is integrated at the front ‘equator’ of the hub, and has ±90° of vertical tilt for videoing or photographing objects above or below the ASIO. Its position ahead of the centre of gravity also ensures that neither the cage fasteners nor its spars obstruct the centre of the viewing frame.

The EO camera is a Sony true 4K device capable of 30 fps video streams and 12 MP images, and the thermal camera is a 160 x 120 model from FLIR (with an 8.7 fps video feed, keeping it below the 9 Hz threshold set by US export controls).

Camera data is streamed back to the GCS over a 2.4 GHz link (with live EO video resolution capped at 1080p), to ensure that operators can see what the ASIO is surveying at all times, sometimes even BVLOS. In factories, the feed can be reflected off metal objects enough times for the GCS to capture it.

“If the structure is all concrete though, such as a sewer or chimney, the signal is more likely to be absorbed or suppressed,” Tschudi explains. “In those situations we can provide an optional range extender.

“This consists of a receiver antenna that can be plugged into our GCS, and a transceiver antenna that you might place at the mouth of the chimney or sewer, to bridge the line-of-sight gap between the ASIO and its operator.”

With direct line of sight, the video transmits over distances of up to 16 km. If a constant feed is impossible, both cameras store all video recordings on board for uploading after the ASIO has been recovered, allowing them to be analysed afterwards.
An array of LEDs inside the cage allows the EO camera to be used in dark places. The lighting system is arranged equally across six panels, three each on the hub’s left and right brackets. On each bracket, one panel faces forwards, one upwards and one downwards, ensuring that the gimbal’s entire field of view is lit. The LEDs emit a maximum collective 10,000 lumens, consistent with a ‘flash’ function for photography, while 1000 lumens is the typical operating illumination.

“We’ve separated the LEDs into panels and made their power user-configurable, partly to ensure operators can use only the amount and direction of light they need, instead of illuminating a swathe or an entire scene,” Tschudi says.

Two more LEDs are used as navigational indicators – a green one on the ASIO’s right-hand side, and a red one on its left – to allow surveyors and technicians nearby to discern its heading (which is sometimes difficult owing to the ASIO’s circular shape).

GNSS-denied autonomy

To navigate through indoor areas without GNSS while minimising the risk of colliding with obstacles, the ASIO is fitted with a total of 19 sensors, including IMUs, magnetometers, barometers, IR-based distance sensors (which work in all lighting conditions) and optical flow sensors.

“The sensors for distance and optical flow are pointed evenly all around the ASIO – forwards, upwards, downwards, two oriented towards the forward left and right, and two to the rear left and right,” Tschudi explains. “That enables the flight computer to sense the UAV’s surroundings, and we fuse all that data using an algorithm to output our Obstacle Repulsion function.”

Obstacle Repulsion effectively creates a bubble around the ASIO. If an object breaches the bubble, the UAV automatically moves as best it can to avoid the object while continuing on its path.
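The repulsion behaviour described above can be pictured as summing a small “push” away from every range reading that falls inside the bubble. The sketch below is purely illustrative – the sensor layout follows the seven directions Tschudi lists, but the function name, the inverse-proportional weighting and all parameters are assumptions, not Flybotix’s actual algorithm.

```python
import math

# Seven fixed sensor directions as unit vectors in the body frame
# (forward, up, down, forward-left/right, rear-left/right), matching
# the layout described in the article. Angles are assumed at 45 deg.
_C = math.cos(math.radians(45))
_S = math.sin(math.radians(45))
SENSOR_DIRECTIONS = [
    (1.0, 0.0, 0.0),    # forward
    (0.0, 0.0, 1.0),    # up
    (0.0, 0.0, -1.0),   # down
    (_C, _S, 0.0),      # forward-left
    (_C, -_S, 0.0),     # forward-right
    (-_C, _S, 0.0),     # rear-left
    (-_C, -_S, 0.0),    # rear-right
]

def repulsion_vector(distances_m, bubble_radius_m):
    """Sum a push away from every reading inside the bubble.

    distances_m: one range reading per sensor, in metres.
    Returns an (x, y, z) correction that a flight controller could
    add to the commanded velocity; closer obstacles push harder.
    """
    push = [0.0, 0.0, 0.0]
    for (dx, dy, dz), d in zip(SENSOR_DIRECTIONS, distances_m):
        if 0.0 < d < bubble_radius_m:
            # Weight grows linearly as the obstacle closes in.
            w = (bubble_radius_m - d) / bubble_radius_m
            push[0] -= w * dx
            push[1] -= w * dy
            push[2] -= w * dz
    return tuple(push)
```

With a 1 m bubble and an obstacle 0.5 m dead ahead, the function returns a pure backwards correction; with all readings clear of the bubble it returns zero, leaving the commanded path untouched.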
The radius of the bubble is set by the operator according to preference, from a few centimetres up to the 10 m maximum that the distance sensors can detect (a 20 m diameter).

As Flybotix anticipates the ASIO being remotely operated some of the time, in areas or at angles where workers cannot clearly see its surroundings, algorithms have been integrated to assist the operator autonomously during navigation. For example, the Wall Lock feature takes data from the distance sensors to maintain an operator-set angle and distance from walls, with the flight controller adjusting the motor speeds to maintain those values.

(Image caption: Six panels of LEDs are installed to illuminate dark places to ensure useable images and videos are captured)

February/March 2021 | Unmanned Systems Technology
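A distance-hold behaviour of the Wall Lock kind can be sketched as a simple proportional correction on the standoff error, with the output clamped so a sudden large error cannot command an aggressive dash. The gain, speed limit and function name below are illustrative assumptions, not Flybotix’s implementation.

```python
def wall_lock_correction(measured_distance_m, target_distance_m,
                         kp=0.8, max_speed_mps=0.5):
    """Return a lateral speed command (m/s) relative to the wall.

    Positive = move away from the wall, negative = move toward it.
    kp and max_speed_mps are assumed tuning values for illustration.
    """
    error = target_distance_m - measured_distance_m
    # Proportional correction, clamped to a safe lateral speed.
    speed = max(-max_speed_mps, min(max_speed_mps, kp * error))
    return speed
```

If the drone drifts to 1.5 m from a wall while locked at 2 m, the controller commands a gentle 0.4 m/s move away from the wall; a 1 m overshoot in the other direction would saturate at the 0.5 m/s limit rather than scale with the error.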
