
In operation | Skyfish M6 and M4

for our cloud computing and processing the survey data to construct our digital twins,” Pilskalns says. “We might spin up as many as 50 cloud-based engines simultaneously if a customer wants a tower model turned out quickly, by which I mean in less than an hour, but it’s also the most computing-intensive and expensive option.

“It means firing up a lot of high-end processor cores, so if the customer’s need isn’t time-sensitive we can do it over a couple of days at a far lower price.”

Skymind is therefore not designed for processing but for intelligent, smooth control of the camera, gimbal and UAV all at once, including autonomous decision-making to maintain flight safety while capturing high-quality photography. In particular, Skymind autonomously aims the gimbal at, and focuses the camera on, the key infrastructure targets to be modelled, while managing the camera settings and the gimbal’s motor controllers and governing both on the basis of incoming image data. It also performs more advanced functions such as real-time geometric projection, to speed up and improve the 3D mapping of objects in the camera’s view.

In addition to optimising the data collection, Skymind handles autonomous navigation tasks, including executing the pre-programmed flight path and monitoring GNSS and inertial data to ensure precise guidance. Other onboard sensors for obstacle avoidance can also be used, depending on the UAV’s mission and configuration.

“One reason for capturing images at the top of our camera’s resolution capabilities is that we’ve worked with Sony for a long time now on how to engineer Skymind to understand all the environmental factors that could affect image quality, and to respond rapidly in flight by adjusting the UAV to compensate for those factors in the photography,” Pilskalns says.

As well as integrating ARM cores and Broadcom chipsets, the onboard computing currently comprises 19 PCBs (besides the main motherboard), including boards for the smart BMS, for interfacing with the gimbal computer, and for other subsystems. The main motherboard carries the flight control systems as well as the radios and processors for RTK-GNSS and data links.

“We constantly monitor how well Skymind handles customers’ tasks,” Pilskalns says. “We release software updates to tune performance, and although we’ll periodically make modified versions, say for customers who want to alpha-test some special capabilities, whole new versions of Skymind tend to be built around a 2-year development cycle. That’s how long it takes before we decide it’s worth investing in switching to next-generation processors and other electronics.

“Between new versions though, we still have a lot of flexibility thanks to controlling the stack. For example, a government agency came to us recently with a specific sensor they wanted on the M6, which no-one had ever integrated before, along with an automated return-to-base capability to execute in response to a single motor failure. They also needed very stable flight during that automated return, and all of that needed to be ready for a competition 4 weeks down the line.

“We were able to do it though, and win the competition and the business that followed from it.”
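Skyfish has not published Skymind’s internals, but the real-time geometric projection and autonomous gimbal aiming described above can be illustrated with a generic sketch. The Python below is only that: every camera parameter, coordinate frame and gain is a hypothetical value chosen for the example, not a Skyfish figure. It projects a known 3D target point into the image with a pinhole model and derives a proportional pan/tilt correction to keep the target centred in frame, which is the same class of calculation in greatly simplified form.

import numpy as np

def project_point(point_world, cam_pos, R_world_to_cam, fx, fy, cx, cy):
    # Pinhole projection of a world-frame point into pixel coordinates
    p = R_world_to_cam @ (point_world - cam_pos)      # world frame -> camera frame
    if p[2] <= 0:
        return None                                   # target is behind the camera
    return np.array([fx * p[0] / p[2] + cx, fy * p[1] / p[2] + cy])

def gimbal_correction(pixel, width, height, hfov_deg, vfov_deg, gain=0.5):
    # Proportional pan/tilt nudge (degrees) that drives the target back to frame centre
    err_x = (pixel[0] - width / 2) / (width / 2)      # -1..1, positive = right of centre
    err_y = (pixel[1] - height / 2) / (height / 2)    # -1..1, positive = below centre
    return gain * err_x * hfov_deg / 2, gain * err_y * vfov_deg / 2

# Hypothetical 61 MP sensor geometry (9504 x 6336 px, ~54 deg horizontal FOV)
W, H, HFOV, VFOV = 9504, 6336, 54.0, 38.0
fx = fy = W / (2 * np.tan(np.radians(HFOV / 2)))
cx, cy = W / 2, H / 2

target  = np.array([40.0, 5.0, 30.0])   # point on a tower, metres, world frame
cam_pos = np.array([0.0, 0.0, 25.0])    # UAV position, metres
# Camera looks along world +X; camera x = -world Y (right), camera y = -world Z (down)
R = np.array([[0.0, -1.0, 0.0],
              [0.0,  0.0, -1.0],
              [1.0,  0.0, 0.0]])

px = project_point(target, cam_pos, R, fx, fy, cx, cy)
if px is not None:
    pan, tilt = gimbal_correction(px, W, H, HFOV, VFOV)
    print(f"target at pixel {px.round(1)}; pan {pan:+.2f} deg, tilt {tilt:+.2f} deg")

In a real system a loop of this kind would run continuously against detections in the live image rather than a single surveyed point, but the projection step is what allows the autopilot, gimbal and camera to be coordinated from one pose estimate.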
Real-time data

Although the digital twins are not produced in real time, Skyfish notes that more and more customers want real-time feedback from their photogrammetry.

That is natural, given the need to inspect and repair industrial infrastructure nationwide, as well as the time constraints that asset owners must abide by for customers such as utilities service providers. Real-time monitoring and data processing therefore remain a critical part of the company’s UASs.

“For faster real-time processing, one thing we’ll do is install an encoder on the gimbal, which is connected to the camera via an HDMI cable, to process the camera feed and send it to the pilot so that they can always see what the M6 or M4 are seeing, even on a fully autonomous mission,” Pilskalns explains. “It’s just to make sure things are going well, and really high-resolution video is critical for them in that regard.”

This direct connection is vital to

[Image caption: IR sensors as well as Lidars and hyperspectral cameras can be integrated for additional data layers in the digital twins]
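The article does not describe the transport between the gimbal-mounted encoder and the pilot’s ground station, so the following is only a minimal sketch under the assumption that the encoder publishes an H.264 stream over RTSP at a hypothetical address. It shows the ground-side half of the arrangement Pilskalns describes: continuously pulling and displaying the camera feed so the pilot can watch an autonomous mission as it happens.

import cv2  # OpenCV built with FFmpeg support for network streams

STREAM_URL = "rtsp://192.168.144.25:8554/gimbal"  # hypothetical encoder endpoint

def monitor(url):
    cap = cv2.VideoCapture(url)
    if not cap.isOpened():
        raise RuntimeError(f"could not open stream {url}")
    try:
        while True:
            ok, frame = cap.read()
            if not ok:                 # lost frames: surface it so the pilot notices
                print("stream dropped")
                break
            cv2.imshow("gimbal feed", frame)
            if cv2.waitKey(1) & 0xFF == ord("q"):   # press q to close the monitor
                break
    finally:
        cap.release()
        cv2.destroyAllWindows()

if __name__ == "__main__":
    monitor(STREAM_URL)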

RkJQdWJsaXNoZXIy MjI2Mzk4