
GPS – in fact, we really only use GPS at start-up, to get a sense of where in the city the car is located.”

Instead, the taxi’s computers compare the data being collected by the Lidar sensors in real time with the dense representation of the city that nuTonomy has already built. Because the world model comprises a considerable amount of data, it is stored in the cloud and streamed over a cellular connection to the vehicle as it travels.

“We therefore end up with a reliable and precise localisation solution that doesn’t rely on GPS, and therefore isn’t subject to its failure modes,” says Iagnemma.

Torchlight approach

The public trials of the taxi are to SAE level 3, with a nuTonomy engineer riding in the driver’s seat to monitor systems, take notes and control the vehicle if required. To achieve SAE level 4, the dynamic driving task must be achievable without a human driver. Key to this will be the performance of nuTonomy’s software, which exploits the ever-growing world-model data.

Deep-learning algorithms applied to the collective ‘perception stack’ of real-time information accumulated by the sensors enable detection, tracking and classification of objects, a technique that led the company to a ‘torchlight approach’ to perception. This involves a formal logic that allows the taxis to reason about objects they can and cannot see.

There are major implications with this system. A human driver at an intersection will (generally) behave cautiously when a large truck is obscuring their view, knowing that there may be hazards beyond their line of sight that could impact their driving decisions. Autonomous vehicles do not possess human instincts or learning capabilities, but Iagnemma says, “We have technology that allows us to do exactly that – to enable the taxi to drive in a more cautious and human-like manner when it’s in an environment that’s rich with things the sensors can’t see.”

The technology also allows the taxi to know when minor traffic guidelines can be ignored, in order to drive flexibly and efficiently rather than exacerbating congestion through slow, overly cautious driving.

Trials

nuTonomy continues to be in close contact with the government of Singapore about how it can expand the scope of the public trials over time. At the time of writing, the trials were in their first phase, in which friends and family comprised most of the participants; this phase is expected to last about two months. The company hopes to progress gradually to larger trial stages as more participants apply to take part and more of Singapore’s geography is incorporated into the taxi’s ‘test area’.

Conclusion

Features such as the advanced world model, Lidar-based localisation and deep-learning approaches to perception are novel solutions to problems that, while acute in Singapore, are by no means unique to it. Anywhere that suffers from severe traffic congestion, unpredictable driving or concerns over GNSS reliability could benefit from this sensor architecture. With plans to expand through Asia, America and the Middle East following its targeted commercial launch in 2018, the MIT start-up is rapidly making a name for itself throughout the world of automated travel.
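The Lidar-based localisation described above, in which live scans are matched against a pre-built world model rather than trusting GPS, can be illustrated with a simple map-matching sketch. The Python below scores candidate poses by counting how many points of a 2-D Lidar scan fall on occupied cells of a prior occupancy grid. The grid size, cell resolution, search window and exhaustive search are illustrative assumptions only; nuTonomy’s actual localiser is not described at this level of detail in the article.

```python
import numpy as np

# Stand-in for the offline "world model": a 2-D occupancy grid with 0.2 m cells.
# In practice this would be populated from the map streamed to the vehicle.
CELL = 0.2                                    # metres per cell
GRID = np.zeros((500, 500), dtype=np.uint8)   # 100 m x 100 m map patch

def world_to_cell(xy):
    """Convert world coordinates (metres) to grid indices."""
    return np.clip((xy / CELL).astype(int), 0, GRID.shape[0] - 1)

def score_pose(scan_xy, pose):
    """Count how many scan points land on occupied map cells when the
    scan is transformed by the candidate pose (x, y, heading)."""
    x, y, theta = pose
    c, s = np.cos(theta), np.sin(theta)
    rot = np.array([[c, -s], [s, c]])
    pts = scan_xy @ rot.T + np.array([x, y])
    idx = world_to_cell(pts)
    return GRID[idx[:, 1], idx[:, 0]].sum()

def localise(scan_xy, prior_pose, search=1.0, step=0.2, dtheta=0.05):
    """Exhaustively test a small window of poses around the prior and keep
    the one that best matches the map. A production system would use a far
    more efficient scan matcher or particle filter, but the matching idea
    is the same."""
    best_pose, best_score = prior_pose, -1
    x0, y0, th0 = prior_pose
    for dx in np.arange(-search, search + step, step):
        for dy in np.arange(-search, search + step, step):
            for dth in (-dtheta, 0.0, dtheta):
                cand = (x0 + dx, y0 + dy, th0 + dth)
                sc = score_pose(scan_xy, cand)
                if sc > best_score:
                    best_pose, best_score = cand, sc
    return best_pose

# Example: refine a rough start-up fix with one (fake) 2-D scan
scan = np.random.rand(720, 2) * 30.0
pose = localise(scan, prior_pose=(50.0, 50.0, 0.0))
```

In this scheme a GNSS fix is needed only to seed prior_pose at start-up, which mirrors Iagnemma’s description of GPS being used solely to get an initial sense of where in the city the car is.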
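The ‘torchlight approach’ reasons about regions the sensors cannot see and adjusts driving behaviour accordingly. nuTonomy’s formal logic is not detailed in the article, so the following is only a hedged sketch of the underlying idea: cap the vehicle’s speed according to how much of an intersection’s conflict zone is occluded by detected obstacles. The function name, areas and thresholds are invented for illustration.

```python
def cautious_speed(visible_area, conflict_area, speed_limit):
    """Reduce the speed cap as more of the intersection's conflict zone is
    hidden from the sensors (e.g. behind a large truck). Areas are in square
    metres, speeds in m/s; all thresholds are illustrative, not nuTonomy's."""
    occluded_fraction = 1.0 - min(visible_area / conflict_area, 1.0)
    if occluded_fraction > 0.6:      # most of the zone is hidden: creep forward
        return min(speed_limit, 2.0)
    if occluded_fraction > 0.3:      # partially hidden: proceed slowly
        return min(speed_limit, 5.0)
    return speed_limit               # clear view: drive normally

# Example: a truck hides 70% of a 200 m^2 conflict zone
print(cautious_speed(visible_area=60.0, conflict_area=200.0, speed_limit=13.9))
# -> 2.0 (creep until more of the zone becomes visible)
```

The same occlusion measure could feed a more formal rule system; the point is simply that the planner treats unseen space as potentially occupied rather than empty, much as a cautious human driver does.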
Key suppliers to the nuTonomy Zoe
Lidar: Velodyne, IBEO Automotive Systems
Radar sensors: Delphi Automotive
Forward camera: Point Grey Research
USB camera: Logitech
IMU: Sparton Navigation and Exploration
GNSS: U-blox

The roof-mounted Velodyne HDL-32E is central to localisation and obstacle detection
