
Kontron is combining a COM Express carrier board and modules in a ruggedised system for UAVs and ground vehicles (Courtesy of Kontron)

but allow third-party customers to add their own algorithms.

The silicon implementation also has to include high-speed data interfaces. Moving large amounts of data from lidar sensors, cameras and other sensors is an essential element of the design of an autonomous system, and there is a wide range of high-speed interfaces to support that. These are pulled out to connectors on the board to link to the sensor and control systems, and there are different ways this data can be moved from the chip to the board and then on to the rest of the system.

Long-term support is a key issue for silicon providers in embedded designs. Many processors aimed at the consumer market have a lifetime of about three years, whereas industrial applications may be supported for five years, which may still not be enough for many autonomous designs. Providing long-term support for devices with wide temperature ranges (typically -40 to +125 °C for autonomous systems operating in a broad range of environments) is also costly, so chip makers limit the range of processors they support for those applications. This can affect the choice of processor and the choice of board, as some types of board may not offer a processor with long-term support.

The API

The software technology that links the computing cores with the implementation is the Application Programming Interface (API). This is an abstraction layer that allows developers to design and build the embedded software for the autonomous system, whether it be for deep learning, machine vision, image detection or real-time control, without needing knowledge of the specific design of the hardware or direct access to the lower-level software in the system. The requirements of the API then feed back into the design of the computing core and so influence the implementation of the chip.

Some of these APIs are proprietary to the core designer or the chip maker, but there is an increasing push towards open APIs that run on different CPUs, GPUs and DSPs, so that embedded software developers have the widest possible choice of hardware.

For computer vision, OpenVX and OpenCL are increasingly popular APIs; OpenVX in particular has been designed to work with CPUs, GPUs and DSPs for power-efficient image processing (a short code sketch at the end of this section shows its graph-based model in practice). The latest version, v1.1, adds processing functions for uses such as 'computational photography' – the digital capture and manipulation of images – and improves the way data is accessed and processed by higher-level applications. An open source OpenVX 1.1 sample implementation and full conformance tests have been launched to help hardware, software and algorithm developers use the API on a wide range of cores and chips.

Embedded board standards

As the performance of processors has increased, autonomous system development has been evolving from large, bulky, high-performance platforms such as VPX to smaller form factors such

Qualcomm is using its Snapdragon processor and camera system in a development board for ground rovers (Courtesy of Qualcomm)
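To make the graph-based model behind OpenVX concrete, the sketch below builds a minimal two-stage vision pipeline, a Gaussian blur followed by Sobel edge gradients, and hands it to the runtime, which is free to schedule the work on whichever CPU, GPU or DSP cores the platform provides. The image resolution and the choice of kernels are illustrative assumptions rather than details from the article, and a real system would fill the input image from a camera or other sensor before processing.

/*
 * Minimal OpenVX 1.1 sketch: a processing graph that blurs a frame and
 * then computes Sobel edge gradients. The runtime maps the graph onto
 * the available CPU, GPU or DSP cores. Resolution and kernel choice
 * are illustrative assumptions, not details from the article.
 */
#include <stdio.h>
#include <VX/vx.h>

int main(void)
{
    vx_context context = vxCreateContext();
    vx_graph   graph   = vxCreateGraph(context);

    /* Input frame; in a real system this would be filled from a camera */
    vx_image input   = vxCreateImage(context, 640, 480, VX_DF_IMAGE_U8);

    /* Virtual image: an intermediate the runtime is free to place and format */
    vx_image blurred = vxCreateVirtualImage(graph, 640, 480, VX_DF_IMAGE_U8);

    /* Sobel outputs are signed 16-bit gradient images */
    vx_image grad_x  = vxCreateImage(context, 640, 480, VX_DF_IMAGE_S16);
    vx_image grad_y  = vxCreateImage(context, 640, 480, VX_DF_IMAGE_S16);

    /* Build the graph: 3x3 Gaussian blur feeding a 3x3 Sobel operator */
    vxGaussian3x3Node(graph, input, blurred);
    vxSobel3x3Node(graph, blurred, grad_x, grad_y);

    /* The implementation validates and optimises the graph for its hardware */
    if (vxVerifyGraph(graph) == VX_SUCCESS) {
        vxProcessGraph(graph);
        printf("Graph executed\n");
    }

    vxReleaseImage(&grad_y);
    vxReleaseImage(&grad_x);
    vxReleaseImage(&blurred);
    vxReleaseImage(&input);
    vxReleaseGraph(&graph);
    vxReleaseContext(&context);
    return 0;
}

Because the pipeline is expressed as a graph rather than as explicit calls into a particular processor's libraries, the same application code can run across the different CPUs, GPUs and DSPs discussed above, which is the portability argument the open-API push is making.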
