Uncrewed Systems Technology 048 | Kodiak Driver | 5G focus | Tiburon USV | Skypersonic Skycopter and Skyrover | CES 2023 | Limbach L 2400 DX and L 550 EFG | NXInnovation NX 100 Enviro | Solar power focus | Protegimus Protection

attention networks or recurrent networks. So we've developed one of our own in-house specifically for traffic lane detection in this instance.

"But we also have other, similar ones for object detection using purely camera imagery, ones where we fuse the camera and Lidar data, ones for radar alone, and radar fused with the other sensors. So again, the redundancy aspect is key to our AI philosophy as much as it is for our sensor hardware. We have numerous modalities producing multiple outputs that we can check against each other to ascertain the presence, location and heading of objects on the road."

In describing the software layers of Kodiak's perception architecture, collectively known as Kodiak Vision, Wendel also makes a distinction between early data fusion and late data fusion. In the earlier stages of perception, the data from the Gen4's cameras, Lidars and radars can be fused in different permutations before being fed into the various neural networks (also called 'detectors' in-house). In late fusion, the outputs from the detectors are fused together to construct a 4D image of the world around the truck. This contains recognisable objects such as cars, together with any available velocity information to intelligently inform safe avoidance trajectories where needed, as well as obstacles such as barriers that cannot be defined or categorised but must nonetheless be avoided.

Computing and control systems

The ability to take in all the perception data and communicate it intelligently to the downstream systems was vital to Kodiak selecting a multi-core, military-grade main computer that would meet its reliability requirements. It is also responsible for some calculations for the detectors using the early fusion data, and for the motion planning functions using the late fusion data.

Paramount among the downstream systems is an Nvidia Drive AGX Orin, which serves as the platform for processing the autonomous perception data.
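Kodiak's actual detectors and fusion stack are proprietary, but the late-fusion idea described above can be illustrated in rough outline: detections from independent modalities are associated by proximity, and the number of agreeing modalities gives a confidence signal. The following C++ sketch uses hypothetical names and thresholds; it is not Kodiak code.

```cpp
// Illustrative late-fusion sketch: cluster per-modality detections by
// proximity, then report how many independent modalities agree on each
// object. Gating distance and struct layout are assumptions for the example.
#include <cmath>
#include <string>
#include <vector>

struct Detection {
    std::string modality;  // "camera", "lidar" or "radar"
    double x, y;           // position in the truck frame, metres
    double vx, vy;         // velocity estimate, m/s (radar typically supplies this)
};

struct FusedObject {
    double x, y, vx, vy;   // averaged position and velocity
    int modalities;        // how many distinct sensor modalities saw it
};

std::vector<FusedObject> lateFuse(const std::vector<Detection>& dets,
                                  double gate = 2.0) {
    // Greedy nearest-cluster association within the gating distance.
    std::vector<std::vector<const Detection*>> clusters;
    for (const auto& d : dets) {
        bool placed = false;
        for (auto& c : clusters) {
            if (std::hypot(d.x - c.front()->x, d.y - c.front()->y) < gate) {
                c.push_back(&d);
                placed = true;
                break;
            }
        }
        if (!placed) clusters.push_back({&d});
    }
    // Average each cluster and count the distinct modalities that contributed.
    std::vector<FusedObject> objects;
    for (const auto& c : clusters) {
        FusedObject o{0, 0, 0, 0, 0};
        std::vector<std::string> seen;
        for (const auto* d : c) {
            o.x += d->x / c.size();
            o.y += d->y / c.size();
            o.vx += d->vx / c.size();
            o.vy += d->vy / c.size();
            bool newModality = true;
            for (const auto& m : seen)
                if (m == d->modality) newModality = false;
            if (newModality) seen.push_back(d->modality);
        }
        o.modalities = static_cast<int>(seen.size());
        objects.push_back(o);
    }
    return objects;
}
```

A downstream planner could then treat an object confirmed by all three modalities very differently from one seen by a single sensor, which is the cross-checking redundancy Wendel describes.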
While choosing the main computer was a matter of broad functionality, the Orin was chosen for many of its specifications, including its 250 TOPS of computing performance, which is key to processing high-resolution images.

The control functions derived via the motion planning software are communicated to the ACEs, which then control the steering, throttle, braking, lights and indicators. Although there are multiple ECUs throughout the vehicle, the ACEs are the most important for driving the truck. Just as the SensorPods remain the same between each truck configuration, the ACEs also provide a universal actuation interface for every truck the Kodiak Driver might be installed on.

"The main computer and Orin talk to each other over Ethernet, while the ACEs use the SAE J1939 version of CAN bus, meaning we can interface with a wide variety of trucks and other automotive systems and actuators," Wendel says. "The ACEs also connect to multiple interfaces for our safety drivers, so they can see at a glance whether they need to take control."

Both the network architecture and software algorithms were developed using C++, with C used for some of the subsystem embedded computers. And while various frameworks were used to

February/March 2023 | Uncrewed Systems Technology

[Image caption] LTE connections between the trucks enable the Sparse Maps to be consistently updated, and remote operators can step in to set new destinations or answer 'questions' from the truck during edge cases

"We bond different channels using several SIM cards from a variety of providers, and we spread our LTE modems' signals across those"
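The J1939 dialect of CAN that the ACEs speak is what makes that "wide variety of trucks" claim possible: every message carries its priority, parameter group number (PGN) and source address in a standardised 29-bit identifier, so the same actuation interface can address different OEMs' ECUs. As a small illustration (not Kodiak code), this helper unpacks those fields following the SAE J1939-21 layout:

```cpp
// Unpack the fields of a 29-bit SAE J1939 CAN identifier. The bit layout
// follows the J1939-21 standard; the struct and function names are
// illustrative, not from any particular vendor's stack.
#include <cstdint>

struct J1939Id {
    uint8_t priority;       // bits 26..28: 0 (highest) to 7 (lowest)
    uint32_t pgn;           // 18-bit parameter group number
    uint8_t sourceAddress;  // bits 0..7: address of the sending ECU
};

J1939Id parseJ1939(uint32_t canId) {
    J1939Id id;
    id.priority = (canId >> 26) & 0x7;
    id.sourceAddress = canId & 0xFF;
    uint8_t pduFormat = (canId >> 16) & 0xFF;   // PF field
    uint8_t pduSpecific = (canId >> 8) & 0xFF;  // PS field
    uint8_t dataPage = (canId >> 24) & 0x3;     // EDP and DP bits
    if (pduFormat < 240) {
        // PDU1 (point-to-point): PS is a destination address,
        // so it is not part of the PGN.
        id.pgn = (uint32_t(dataPage) << 16) | (uint32_t(pduFormat) << 8);
    } else {
        // PDU2 (broadcast): PS is the group extension and completes the PGN.
        id.pgn = (uint32_t(dataPage) << 16) | (uint32_t(pduFormat) << 8)
                 | pduSpecific;
    }
    return id;
}
```

For example, the standard Cruise Control/Vehicle Speed broadcast (PGN 65265) sent at priority 6 from source address 0 arrives with identifier 0x18FEF100, and parsing it recovers exactly those fields.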
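The article does not detail how the multi-SIM bonding mentioned in the pull-quote is implemented. One common approach to spreading traffic across several LTE modems, sketched here with entirely hypothetical names and no claim to be Kodiak's method, is a deficit scheduler that sends each chunk over whichever link is furthest behind its quality-weighted share:

```cpp
// Illustrative link-bonding scheduler: traffic is spread across modems in
// proportion to a per-link quality weight. Names and the quality metric
// are assumptions for the sketch.
#include <cstddef>
#include <vector>

struct Modem {
    double quality;        // normalised link quality, 0..1 (assumed metric)
    size_t bytesSent = 0;  // bytes dispatched over this link so far
};

// Pick the modem whose traffic share lags furthest behind its
// quality-weighted target (a deficit round-robin scheduler).
size_t pickModem(const std::vector<Modem>& modems, size_t totalBytes) {
    double totalQuality = 0.0;
    for (const auto& m : modems) totalQuality += m.quality;
    size_t best = 0;
    double bestDeficit = -1e18;
    for (size_t i = 0; i < modems.size(); ++i) {
        double target = totalBytes * modems[i].quality / totalQuality;
        double deficit = target - double(modems[i].bytesSent);
        if (deficit > bestDeficit) {
            bestDeficit = deficit;
            best = i;
        }
    }
    return best;
}
```

Over many chunks this converges on each link carrying traffic in proportion to its quality, so a weak SIM still contributes without becoming a bottleneck, and the quality weights can be refreshed as signal conditions change along the route.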
