Uncrewed Systems Technology 052 l Keybotic Keyper l Video encoding l Dufour Aero2 l Subsea SeaCAT l Space vehicles l CUAV 2023 report l SkyPower SP engine l Cable harnesses l Paris Air Show 2023 report l Nauticus Aquanaut

Driverless cars

Can you take the wheel?

Researchers have announced a method to determine whether a driver can take back control of a vehicle in self-driving mode (writes Nick Flaherty). The research found that people's attention levels, and how engrossed they are in onscreen activities, can be detected from their eye movements. This provides a new way to determine the readiness of drivers using a self-driving mode to respond to real-world signals, such as takeover requests from the car.

When using the self-driving mode, drivers can take their hands off the wheel and engage in other activities. However, current systems can require the driver to take back control of the car at certain points. For example, a driver can use the self-driving mode during a traffic jam on a motorway, but once the jam has cleared and the motorway allows speeds faster than 40 mph, the AI will send a 'takeover' signal, indicating that the driver must return to full driving control.

The researchers tested whether it was possible to detect if a person was too engrossed in another task to respond swiftly to such a takeover signal. The team, at University College London, tested 42 participants across two experiments, using a procedure that mimicked a takeover scenario in a driverless car. Participants were required to search a computer screen showing many coloured shapes for target items, letting their gaze linger on a target to show they had found it.

They found that when a task demanded more attention, participants took longer to stop watching the screen and respond to a tone. The tests showed it was possible to detect participants' attention levels from their eye movements: a pattern of longer gazes and shorter distances of eye travel between the target items indicated that the task was demanding more attention.

The researchers also trained a machine learning model on this data and found they could predict whether the participants were engaged in an easy or a demanding task based on their eye movement patterns.

"Our findings show it is possible to detect the attention levels of a driver and their readiness to respond to a warning signal, just from monitoring their gaze pattern," said Prof Nilli Lavie at UCL's Institute of Cognitive Neuroscience.

"Even when they are aware that they should be ready to stop their task and respond to tones as quickly as they can, they take longer to do so when their attention is held by the screen. Our research shows that warning signals might not be noticed quickly enough in such cases," she added.

The eye-tracking method determines the ability of drivers to revert to manual control in a self-driving car

Security

Vehicle fleets guardian

Angoka, in Belfast, has developed a security system specifically for fleets of autonomous vehicles (writes Nick Flaherty). The system starts with a physically unclonable function (PUF), a unique ID assigned to a particular vehicle. This is generated from a set of parameters that cannot be recreated, which means that even if a comms module is moved from one vehicle to another it will not work, avoiding the problem of spoofing.

The PUF is used to generate a symmetric quantum key using a random number generator. Angoka has developed a technique to share this key over Ethernet or wireless links; previously a fibre optic connection was necessary. The key then allows data to be sent securely from an autonomous vehicle to a secure service on Amazon Web Services.

The key length is optimised for the network, as a low-bandwidth network such as LoRaWAN would be overwhelmed by a long key length. The key can be changed on a schedule set by the operator, ranging from every minute to hours or days, depending on the level of security required.

October/November 2023 | Uncrewed Systems Technology
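The UCL approach of predicting task demand from gaze features can be illustrated with a small classifier. This is a minimal sketch only: the study's actual features, data and model are not published here, so the two gaze features (mean fixation duration and mean saccade amplitude), the synthetic distributions and the logistic-regression choice are all assumptions.

```python
# Hypothetical sketch: classifying easy vs demanding tasks from gaze
# features, in the spirit of the UCL study. All feature names, data
# distributions and the model choice are invented for illustration.
import numpy as np
from sklearn.linear_model import LogisticRegression
from sklearn.model_selection import train_test_split

rng = np.random.default_rng(0)
n = 400

# Two invented gaze features per trial:
#   mean fixation duration (s)  - assumed longer under a demanding task
#   mean saccade amplitude (deg) - assumed shorter under a demanding task
easy = np.column_stack([rng.normal(0.25, 0.05, n), rng.normal(6.0, 1.0, n)])
hard = np.column_stack([rng.normal(0.40, 0.05, n), rng.normal(3.5, 1.0, n)])
X = np.vstack([easy, hard])
y = np.array([0] * n + [1] * n)  # 0 = easy task, 1 = demanding task

# Train on part of the data, then check prediction on held-out trials
X_tr, X_te, y_tr, y_te = train_test_split(X, y, random_state=0)
clf = LogisticRegression().fit(X_tr, y_tr)
print(f"held-out accuracy: {clf.score(X_te, y_te):.2f}")
```

With well-separated synthetic distributions the classifier scores highly; on real eye-tracking data the separation, and hence the usefulness as a takeover-readiness signal, would depend on how strongly task demand shifts these gaze statistics.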
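The Angoka scheme described above combines a device-unique secret with scheduled key changes. Its actual protocol is proprietary, so the following is only a sketch of the general pattern: deriving a rotating symmetric key from a stand-in PUF secret and a time epoch using an HKDF-style construction (RFC 5869) over HMAC-SHA256. The function names, salt, epoch encoding and key lengths are all assumptions.

```python
# Hypothetical sketch of scheduled symmetric-key rotation keyed by a
# device-unique secret (standing in for a PUF response). Angoka's real
# protocol is proprietary; every name and parameter here is invented.
import hashlib
import hmac

def derive_key(puf_secret: bytes, epoch: int, length: int = 32) -> bytes:
    """Derive a symmetric key for one rotation epoch.

    HKDF-style extract-and-expand built on HMAC-SHA256. `length` allows
    a shorter key where a low-bandwidth link (e.g. LoRaWAN) demands it.
    """
    info = b"fleet-link-key|" + epoch.to_bytes(8, "big")
    prk = hmac.new(b"sketch-salt", puf_secret, hashlib.sha256).digest()
    okm, block, counter = b"", b"", 1
    while len(okm) < length:
        block = hmac.new(prk, block + info + bytes([counter]),
                         hashlib.sha256).digest()
        okm += block
        counter += 1
    return okm[:length]

def epoch_for(now_s: int, rotation_s: int) -> int:
    """Map a timestamp to a rotation epoch (every minute, hour, day...)."""
    return now_s // rotation_s

# Both ends of the link derive the same key within an epoch; a comms
# module moved to another vehicle lacks the right secret, so its keys
# never match (the anti-spoofing property the PUF provides).
secret = b"\x13" * 32  # stand-in for the PUF output
k1 = derive_key(secret, epoch_for(1_700_000_000, 60))
k2 = derive_key(secret, epoch_for(1_700_000_030, 60))  # same minute
k3 = derive_key(secret, epoch_for(1_700_000_090, 60))  # next minute
print(k1 == k2, k1 == k3)  # True False
```

The operator-set schedule maps directly onto `rotation_s`: 60 for per-minute rotation, 3600 for hourly, 86400 for daily, trading key-change overhead against exposure time for any single key.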
