Unmanned Systems Technology 018 | CES show report | ASV Global C-Cat 3 USV | Test centres | UUVs insight | Limbach L 275 EF | Lidar systems | Heliceo DroneBox | Composites

Unmanned Systems Technology | February/March 2018

About 65,000 lines of extra code are added to the Apollo platform every three months. The code runs on four hardware computing platforms, from Nvidia, Intel, NXP and Renesas. TomTom high-definition maps are integrated into the platform, alongside GNSS navigation from Novatel, and Microsoft is providing cloud services outside China on its Azure platform. There are 84 other technology partners involved.

A perception sub-module in Apollo 2.0 incorporates the ability to detect and recognise obstacles and traffic lights from different sensors. Given Lidar points and radar data, the obstacle sub-module detects, segments, classifies and tracks obstacles within the areas defined by the TomTom maps. It also predicts obstacle motion and position information such as heading and speed. A traffic light sub-module detects traffic lights and recognises their status, to provide autonomous driving in a straightforward urban scene.

AutonomouStuff in the US has used Apollo 2.0 to support daytime and night-time autonomous driving on urban roads using a range of Lidar and radar sensors. Baidu is also working on integrating Apollo 2.0 into production autonomous vehicles. Chinese car maker Chery Automotive will use the technology for Level 3 vehicles by 2020, running on Nvidia’s Drive PX platform, and Baidu plans to mass-produce driverless buses with Chinese bus manufacturer King Long by the end of July 2018.

Intel said it is aiming to combine its automotive-grade Atom processors with the EyeQ5 image sensing and processing chips it obtained through its acquisition of Mobileye in 2017. The EyeQ5 will be able to handle 24 trillion operations per second at 10 W of power consumption, with two used for redundancy in the platform, while the 3xx4 family of Atom processors will rival the performance of Nvidia’s Xavier processor. This will provide a scalable platform for autonomous vehicles from Level 3 to full autonomy at Level 5.
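The obstacle chain described above — detect, segment, classify and track — can be pictured as a sequence of stages over raw sensor returns. The sketch below is purely illustrative and is not Apollo’s actual code; the `Obstacle` class, the size-based classifier and the one-dimensional clustering are invented stand-ins for far more sophisticated components.

```python
from dataclasses import dataclass

@dataclass
class Obstacle:
    points: list            # sensor points assigned to this obstacle
    label: str = "unknown"  # classification, e.g. "vehicle", "pedestrian"
    heading: float = 0.0    # estimated heading, radians (default only here)
    speed: float = 0.0      # estimated speed, m/s (default only here)

def segment(points, gap=2.0):
    """Group 1-D point ranges into clusters wherever neighbours are close."""
    clusters, current = [], []
    for p in sorted(points):
        if current and p - current[-1] > gap:
            clusters.append(current)   # gap too large: close this cluster
            current = []
        current.append(p)
    if current:
        clusters.append(current)
    return clusters

def perceive(points):
    """Toy detect -> segment -> classify chain over raw sensor returns."""
    obstacles = []
    for cluster in segment(points):
        # Crude classifier: large clusters of returns become "vehicle"
        label = "vehicle" if len(cluster) >= 3 else "pedestrian"
        obstacles.append(Obstacle(points=cluster, label=label))
    return obstacles
```

A real pipeline would then hand the tracked obstacles to a prediction stage to fill in heading and speed, as the article describes.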
Intel demonstrated the first of its fleet of 100 autonomous cars using the Mobileye technology and its processors at CES this year. At the same time, it announced a deal with VW, BMW and Nissan to harvest low-bandwidth data packets from vehicles on the road. This data will be aggregated in the cloud to create HD maps through Mobileye’s Road Experience Management software.

Mobileye is currently shipping the EyeQ3, with the EyeQ4 scheduled to ship in the second half of this year. That puts the EyeQ5 into 2019 for sampling, and 2020 for production systems.

Aeye used the CES show to launch a laser-based ‘robotic perception system’ for autonomous vehicle, driver assistance and other mobility applications. The AE100 is a solid-state Lidar system that combines a laser sensor with a low-light camera, intelligent data collection and path planning, using the company’s Idar (Intelligent Detection and Ranging) sensor.

“The AE100 makes Idar technology commercially available for the first time,” said Luis Dussan. “Idar-based robotic perception allows sensors to mimic the visual cortex, bringing real-time intelligence to data collection. The system captures everything in a scene and brings higher resolution to key objects.

“A key objective was to design a solid-state modular platform that is software-definable, to increase reliability,” Barry Behnken added. “We transitioned from first-generation spinning Lidar hardware to a design that allows path-planning software teams to plug and play the AE100 as customers replace their legacy systems. It requires no software changes, and enables them to bring in more features over time.”

The AE100 can use dynamic patterns for mapping the environment, as it is not tied to one fixed mode. Its performance is software-definable by frame or region of interest, or by pixels within specific frames.
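Performance that is software-definable per frame or region of interest can be thought of as a set of scan-region configurations checked against the sensor’s limits. The sketch below is a hypothetical model only — the `ScanRegion` class and its field names are invented for illustration and are not Aeye’s API; the limits used are the figures quoted in this article.

```python
from dataclasses import dataclass

@dataclass
class ScanRegion:
    # Hypothetical region-of-interest settings for a software-definable
    # Lidar; field names are illustrative, not Aeye's actual interface.
    azimuth_deg: tuple      # (min, max) horizontal extent of the region
    resolution_deg: float   # angular resolution within the region
    frame_rate_hz: int      # revisit rate for this region

def validate(region):
    """Check a region against the AE100 limits quoted in the article:
    resolution no finer than 0.09 deg, frame rate no higher than 200 Hz."""
    return region.resolution_deg >= 0.09 and region.frame_rate_hz <= 200

# A wide, low-rate background sweep plus a dense, fast region of interest
background = ScanRegion(azimuth_deg=(-60, 60), resolution_deg=0.5, frame_rate_hz=20)
roi = ScanRegion(azimuth_deg=(-5, 5), resolution_deg=0.09, frame_rate_hz=200)
```

In this model, path-planning software would hand the sensor a list of such regions each frame, trading resolution and revisit rate across the scene rather than scanning everything uniformly.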
That gives customisable performance: frame rates of up to 200 Hz, object detection in less than 60 μs, software-definable resolution of 0.09° vertically or horizontally, and a maximum range of 300-400 m.

(Image: The AE100 integrates Aeye’s Idar Lidar sensor)

(Image: Intel showed the first of its autonomous driving test vehicles at CES 2018)
