Uncrewed Systems Technology 050 | Reflecting on the past | AM focus | Addverb Dynamo 1T | Skyfish M6 and M4 | USVs insight | Xponential 2023 part 1 | EFT Hybrid-1x | Fuel systems focus | Ocean Business 2023 | Armach HSR

Platform One

Driverless vehicles: 'Ethical' street risk code

Researchers in Germany have created an algorithm they say can handle the ethical requirements of autonomous vehicles (writes Nick Flaherty). The researchers, at the Technical University of Munich (TUM), developed the algorithm to distribute levels of risk fairly rather than operating on an either/or principle.

The algorithm is based on the ethical risk evaluation defined by an expert panel for the European Commission in 2020. The 20 recommendations in that report include basic principles such as giving priority to the road users who would be worst off in an accident, and distributing risk fairly among all road users.

The team has tested about 2000 scenarios involving critical situations, distributed across various types of streets and regions such as Europe, the US and China.

To translate the ethical rules into mathematical calculations, the researchers classified vehicles and people moving in street traffic according to the risk they present to others and their respective willingness to take risks. A truck, for example, can cause serious damage to other traffic, while in many scenarios the truck itself will suffer only minor damage; the opposite is true for a bicycle.

Rather than freezing up and abruptly applying the brakes in a critical situation, the algorithm replaces Yes and No options with an evaluation containing a large number of choices.

"Until now, autonomous vehicles were always faced with an either/or choice when encountering an ethical decision," said Maximilian Geisslinger, a scientist at the TUM Chair of Automotive Technology. "But street traffic can't necessarily be divided into clear-cut, black & white situations – the countless grey shades in between have to be considered as well.
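The selection principle described above – scoring many candidate manoeuvres by the risk each imposes on every road user, with priority given to the worst-off participant, rather than making a binary choice – can be sketched in Python. This is a minimal illustration under assumed data structures and weights, not TUM's published code:

```python
from dataclasses import dataclass

@dataclass
class Manoeuvre:
    name: str
    # Assumed representation: estimated harm probability this manoeuvre
    # imposes on each road user, keyed by id ("ego" is the vehicle itself)
    risks: dict

def ethical_cost(m: Manoeuvre, fairness_weight: float = 2.0) -> float:
    """Combine total risk with a maximin-style fairness term that
    penalises manoeuvres whose worst-off participant bears high risk."""
    total = sum(m.risks.values())
    worst_off = max(m.risks.values())
    return total + fairness_weight * worst_off

def choose(manoeuvres):
    """Pick the manoeuvre with the lowest combined ethical cost."""
    return min(manoeuvres, key=ethical_cost)

# Illustrative scenario: overtaking a cyclist with a truck oncoming
options = [
    Manoeuvre("brake hard",     {"ego": 0.10, "cyclist": 0.01, "truck": 0.02}),
    Manoeuvre("overtake now",   {"ego": 0.05, "cyclist": 0.30, "truck": 0.10}),
    Manoeuvre("follow cyclist", {"ego": 0.02, "cyclist": 0.02, "truck": 0.01}),
]
print(choose(options).name)  # prints "follow cyclist"
```

In practice the candidate set would contain thousands of sampled trajectories rather than three named options, but the shape of the decision – a continuous risk evaluation instead of a Yes/No rule – is the same.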
"Our algorithm weighs up various risks and makes an ethical choice from among thousands of possible behaviours – and it does so in only a fraction of a second."

Franziska Poszler, a scientist at the TUM Chair of Business Ethics, said, "Traditional ethical theories were often contemplated to derive morally permissible decisions made by autonomous vehicles. This ultimately led to a dead end, since in many traffic situations there was no alternative but to violate one ethical principle.

"In contrast, our framework puts the ethics of risk at the centre. This allows us to take probabilities into account and make more differentiated assessments."

In the basic setting, the autonomous vehicle was told not to exceed a maximum acceptable risk in the various street situations. In addition, the researchers added variables to the calculation that account for the responsibility of the traffic participants, for example the responsibility to obey traffic regulations.

Previous approaches treated critical situations on a street with only a small number of possible manoeuvres; in unclear cases the vehicle simply stopped and a remote operator took over. The risk assessment integrated into the researchers' code results in more degrees of freedom, with less risk for all.

For example, an autonomous vehicle wants to overtake a bicycle while a truck is approaching in the oncoming lane. All the existing data on the surroundings and the individual participants are used. The algorithm considers whether the bicycle can be overtaken without moving into the oncoming traffic lane, while maintaining a safe distance from the bicycle. What is the risk posed to each vehicle, and what risk do these vehicles pose to the autonomous vehicle itself?

In unclear cases, the autonomous vehicle running the new software always waits until the risk to all participants is acceptable, so aggressive manoeuvres are avoided.

The algorithm has been validated in simulations and is available as open source code on the GitHub software platform. It is now undergoing real-world testing using a research vehicle called Edgar.
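The waiting behaviour described above – executing a manoeuvre only once every participant's risk falls below a ceiling, discounted by that participant's own responsibility, and otherwise holding back – might look like the following. The threshold value, the responsibility discount and all names here are assumptions for illustration, not values from the TUM code:

```python
MAX_ACCEPTABLE_RISK = 0.05  # assumed per-participant risk ceiling

def responsibility_adjusted(risk: float, violated_rules: bool) -> float:
    """Participants who break traffic regulations accept part of the
    risk themselves, so their weight in the check is reduced."""
    return risk * (0.5 if violated_rules else 1.0)

def acceptable(manoeuvre_risks: dict) -> bool:
    """A manoeuvre is acceptable only if no participant's
    responsibility-adjusted risk exceeds the ceiling."""
    return all(
        responsibility_adjusted(risk, violated) <= MAX_ACCEPTABLE_RISK
        for risk, violated in manoeuvre_risks.values()
    )

def plan(candidates: dict) -> str:
    """Return the first acceptable manoeuvre, otherwise wait:
    aggressive options are simply never eligible."""
    for name, risks in candidates.items():
        if acceptable(risks):
            return name
    return "wait"  # hold back until the situation clears

# Risks are (probability_of_harm, participant_violated_rules) pairs
candidates = {
    "overtake": {"cyclist": (0.20, False), "truck": (0.08, False)},
    "follow":   {"cyclist": (0.02, False), "truck": (0.01, False)},
}
print(plan(candidates))  # prints "follow"
```

If every candidate breaches the ceiling, the planner falls through to "wait", which mirrors the article's point that the vehicle defers rather than forcing an aggressive manoeuvre.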
