
Platform one

Image recognition
Scene system trains itself

Researchers at Carnegie Mellon University (CMU) have developed a more efficient technique for training image recognition systems for driverless cars (writes Nick Flaherty). The technique uses unlabelled data from Lidar sensors, with error correction, for self-supervised training of neural networks.

"Our method is much more robust than previous methods, because we can train on much larger datasets," said Himangi Mittal, a researcher working with David Held, assistant professor in CMU's Robotics Institute.

Most autonomous vehicles navigate primarily using Lidar sensors that generate a 3D point cloud. Scene flow involves calculating the speed and trajectory of each 3D point; groups of points moving together are interpreted via scene flow as vehicles, pedestrians or other moving objects.

Neural network training for such a system currently uses labelled datasets, with each annotated 3D point tracked over time. Manually labelling these datasets from real-world sensor systems is laborious and expensive though, so little such data exists. As a result, scene flow training is often carried out using simulated data that can automatically label the points. This is less effective than real-world data, however, even when fine-tuned with the small amount of labelled real-world data that does exist.

Instead, the new technique uses unlabelled data straight from the sensor to perform scene flow training, coupled with error-correction algorithms. The key is the technique's ability to detect its own errors in scene flow.

At each instant, the system tries to predict where each 3D point is going and how fast it is moving. In the next instant, it measures the distance between the point's predicted location and the actual location of the point nearest that predicted location. This distance forms one type of error to be minimised.

The system then reverses the process, starting from the predicted point location and working backwards to where the point originated. It measures the distance between that backwards-predicted position and the actual origination point; this forms the second type of error. The system then works to correct both errors.

"It turns out that to eliminate both of those errors, the system actually needs to learn to do the right thing – without ever being told what the right thing is," Prof Held said.

The researchers calculated that scene flow accuracy using a training set of synthetic data from a simulator was only 25%. When the synthetic data was fine-tuned with a small amount of labelled real-world data, the accuracy rose to 31%. When they added a large amount of unlabelled data to train the system using their approach, scene flow accuracy jumped to 46%.

(Caption: Carnegie Mellon University's self-learning neural network algorithm can improve the accuracy of Lidar systems)
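CMU's own code is not reproduced in the article; purely as an illustration of the two self-supervised error terms described above – the nearest-neighbour distance in the next Lidar frame and the reverse, cycle-consistency distance back to the origin – the following Python sketch shows how such losses can be computed on point clouds. All function names, the toy data and the stand-in "network" are hypothetical.

```python
import numpy as np

def nearest_neighbour_error(predicted_points, next_frame_points):
    # First error type: for each point pushed forward by the predicted flow,
    # the distance to the closest measured point in the next Lidar frame.
    diffs = predicted_points[:, None, :] - next_frame_points[None, :, :]
    dists = np.linalg.norm(diffs, axis=-1)        # (N_pred, N_next)
    return dists.min(axis=1).mean()

def cycle_consistency_error(points_t, flow_forward, flow_backward_fn):
    # Second error type: push each point forward, predict the reverse flow,
    # and measure how far it lands from where it originated.
    predicted_next = points_t + flow_forward
    predicted_back = predicted_next + flow_backward_fn(predicted_next)
    return np.linalg.norm(predicted_back - points_t, axis=-1).mean()

# Toy usage with random point clouds standing in for two Lidar frames
rng = np.random.default_rng(0)
cloud_t  = rng.normal(size=(100, 3))              # frame at time t
cloud_t1 = cloud_t + np.array([0.5, 0.0, 0.0])    # same scene, shifted, at t+1

flow = np.tile([0.45, 0.0, 0.0], (100, 1))        # a network's forward flow guess
loss = (nearest_neighbour_error(cloud_t + flow, cloud_t1)
        + cycle_consistency_error(cloud_t, flow, lambda p: -flow))
print(f"self-supervised loss: {loss:.3f}")
```

In training, both distances would be minimised together as the loss for the scene flow network, so no manually labelled correspondences are needed.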
Navigation
IMU sensor architecture cuts cost in half

Memsense has used a combination of sensor architectures, characterisation and proprietary correction algorithms to cut the cost of a high-performance MEMS IMU in half (writes Nick Flaherty).

The 25 g MS-IMU3025 uses COTS sensors in a 28 x 28 x 11.4 mm (1.1 x 1.1 x 0.45 in) footprint. Its gyro bias instability of 0.6º/hour and accelerometer bias instability of 2.6 µg support applications from inertial navigation and control to precision platform stabilisation, at half the cost of equivalent IMUs.

This is achieved using hardware sensor architectures and advanced algorithms running on a floating-point processor, with individual sensor characterisations. That allows the low-cost sensors to be tuned more accurately in the IMU. Self-testing built into the architecture also improves the IMU's reliability for critical applications. The accelerometer's dynamic range is configurable from ±2 g up to ±40 g.
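Memsense's correction algorithms are proprietary, so the sketch below is only a generic illustration of what individual sensor characterisation can mean in practice: applying a per-unit bias and scale-factor/misalignment correction, measured at factory test, to raw accelerometer readings. All names and numbers are hypothetical.

```python
import numpy as np

# Hypothetical per-unit characterisation data (illustrative values only)
ACCEL_BIAS = np.array([0.012, -0.008, 0.020])    # per-axis bias, in g
ACCEL_SCALE_MISALIGN = np.array([                # scale factor + axis misalignment
    [ 1.0020, 0.0010, -0.0004],
    [ 0.0008, 0.9985,  0.0011],
    [-0.0006, 0.0009,  1.0030],
])

def correct_accel(raw_g: np.ndarray) -> np.ndarray:
    """Apply a generic bias / scale-factor / misalignment correction of the
    kind an individually characterised IMU might store from factory test."""
    return ACCEL_SCALE_MISALIGN @ (raw_g - ACCEL_BIAS)

# Example: a raw reading close to 1 g on the z-axis
raw = np.array([0.015, -0.004, 1.018])
print(correct_accel(raw))
```

Tuning such coefficients per sensor, rather than relying on datasheet-typical values, is one way low-cost COTS MEMS parts can be brought closer to the performance of more expensive IMUs.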
