Unmanned Systems Technology 010 | nuTonomy driverless taxi

On August 25, 2016, nuTonomy announced it was starting the world's first public trial of a driverless taxi fleet, consisting of six taxis developed by modifying existing production electric cars. The workhorse of the fleet is based on the Renault Zoe, a five-door hatchback powered by a 22 kWh lithium-ion battery and an 87 bhp electric motor, giving a range of 240 km.

The modifications made by nuTonomy are considerable, as CEO Karl Iagnemma explains. "We have upgraded power systems to support the sensors we've installed for autonomous navigation and localisation, improved the computing capabilities – which go quite a bit beyond traditional automotive-grade CPUs – to run the software, and we've integrated additional safety capabilities to enable things like emergency stopping.

"Those are things that go into taking it from a 'stock vehicle' to what we call an AV-ready vehicle. We'll be expanding to probably a dozen Zoes over the following few months in 2016."

If nuTonomy's fleet of taxis is to expand, however, the trials must prove to the public, the government of Singapore and all the other stakeholders that the vehicles can identify and assess all relevant information presented to them while driving, and react to it with consistently sound decisions that prioritise the safety of passengers, motorists and pedestrians at all times – without hampering their ability to complete journeys in a timely fashion.

Sensors

Developing and installing a comprehensive sensor architecture was the first step in nuTonomy's concept of a safe 'mobility on demand' driverless taxi: the vehicle must accumulate and assess vast quantities of data in real time to reach sound driving decisions autonomously. "We use a combination of vision, radar and Lidar to track and predict the motion of moving objects such as other cars, people and cyclists, and stationary objects such as parked cars and other objects on the roadside," Iagnemma says.

These multiple sensor types are used for their complementary nature, for example in terms of their failure modes. Cameras lose their utility in adverse lighting conditions such as night-time or when facing into the sun. Lidar operates in day and night, but can struggle in low-visibility conditions such as extreme haze or fog, as visible targets are required for scanning. Radar is unaffected by these conditions but can suffer interference as the concentration of automotive radar sensors increases in cities with high traffic density.

"But also, the detection range and resolutions of the sensors are somewhat complementary," Iagnemma says, explaining that radar provides rich information at short distances but is limited in use over longer ranges (particularly as the autonomous

(Image caption: The Blackfly camera provides the taxi with forward line-of-sight monitoring)
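The complementary failure modes described above lend themselves to a simple weighted-fusion illustration. The sketch below is not nuTonomy's implementation (the article does not describe its fusion software); every class name, weight and threshold here is an assumption, chosen only to show in principle how detections from camera, Lidar and radar might be discounted when conditions match a modality's known weakness.

# Illustrative sketch only: weights, thresholds and names are assumptions,
# not nuTonomy's actual fusion stack.
from dataclasses import dataclass

@dataclass
class Conditions:
    """Coarse description of the operating environment."""
    daylight: bool       # cameras degrade at night or when facing the sun
    visibility_m: float  # Lidar struggles in haze/fog (needs visible targets)
    radar_clutter: bool  # dense automotive-radar traffic raises interference

def sensor_weights(c: Conditions) -> dict:
    """Return a relative confidence weight per modality for fusing detections.

    Each modality starts at full weight and is discounted when conditions
    match its weakness, reflecting the complementary failure modes above.
    """
    camera = 1.0 if c.daylight else 0.2             # poor in adverse lighting
    lidar = 1.0 if c.visibility_m > 50.0 else 0.4   # poor in extreme haze or fog
    radar = 0.6 if c.radar_clutter else 1.0         # robust, but interference-prone
    total = camera + lidar + radar
    return {"camera": camera / total, "lidar": lidar / total, "radar": radar / total}

def fuse_range_estimates(estimates_m: dict, weights: dict) -> float:
    """Weighted average of per-sensor range estimates to the same tracked object."""
    return sum(estimates_m[s] * weights[s] for s in estimates_m)

if __name__ == "__main__":
    night_fog = Conditions(daylight=False, visibility_m=30.0, radar_clutter=False)
    w = sensor_weights(night_fog)
    # At night in fog, radar dominates the fused estimate.
    print(w)
    print(fuse_range_estimates({"camera": 42.0, "lidar": 40.0, "radar": 38.5}, w))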
