Unmanned Systems Technology 024 | Wingcopter 178 | 5G focus | UUVs insight | CES report | Stromkind KAT | Intelligent Energy fuel cell | Earthsense TerraSentia | Connectors focus | Advanced Engineering report

Signal attenuation caused by the water content in the plants can cause RTK GNSS accuracy to drop from 2 cm to 40 cm. That can still be enough to localise the TerraSentia within the GCS map and geo-tag phenotypic data, but it is insufficient for safely navigating 80 cm-wide rows. GNSS readings can still guide the TerraSentia to limited waypoints for entering and exiting the field, but in between, Lidar is used for relative navigation through the rows of crops.

As the TerraSentia passes an entry waypoint and enters a row of crops, one of its Lidars, mounted on the front of the chassis, provides navigation by operating over a 270° horizontal FOV to detect either side of the row, and everything in between, ahead of the vehicle.

“We have developed custom algorithms to estimate the distance from the rows and keep the vehicle down the middle, with many miles now logged with that Lidar-based navigation,” Chowdhary notes. “If it encounters an obstruction like a big rock, it stops and reports its GNSS coordinates. Typically in a field, you have to go back around the next or last row to measure on the other side of an obstacle, because there’s rarely enough space to manoeuvre around it. Over time though, we expect it to learn which obstacles it can try to go over and which ones it can’t.”

The machine learning for autonomous navigation and obstacle avoidance is powered by one of the i7s acting as an autonomy computer. Some of the control computations are also handled by a smaller supplemental computer, similar in architecture to the Raspberry Pi.

The TerraSentia’s other Lidar is the same model as the first – a 2D Hokuyo system with a 270° FOV – but it is mounted on the rear and points upwards. It gathers phenotypic data in the roll axis, and is nicknamed the ‘peacock tail’ Lidar by the team for the shape of its swathe.
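EarthSense's row-following algorithms are proprietary and not detailed in the article, but the core idea described – estimate the lateral distance to the crop row on each side from the 2D Lidar scan and steer back toward the centreline – can be sketched as follows. The function name, gain, and thresholds are illustrative assumptions, not the TerraSentia's actual implementation:

```python
import math

def row_following_command(ranges, angles, gain=1.5):
    """Estimate lateral offset from the row centreline using a 2D Lidar scan
    and return a corrective yaw-rate command (rad/s).

    ranges: beam range readings (m); angles: matching beam angles (rad),
    with 0 rad straight ahead and positive angles to the left.
    Hypothetical sketch, not EarthSense's actual algorithm.
    """
    left, right = [], []
    for r, a in zip(ranges, angles):
        if not (0.05 < r < 2.0):        # discard invalid or far returns
            continue
        y = r * math.sin(a)             # lateral distance of each return
        if y > 0:
            left.append(y)
        elif y < 0:
            right.append(-y)
    if not left or not right:
        return 0.0                      # one side not visible: hold course
    # Use the median so stray leaves poking into the row don't skew the estimate
    left_d = sorted(left)[len(left) // 2]
    right_d = sorted(right)[len(right) // 2]
    offset = (right_d - left_d) / 2.0   # > 0 when the vehicle sits left of centre
    return -gain * offset               # steer back toward the centreline
```

A real implementation would also fuse wheel odometry and reject returns from the ground plane, but the centring behaviour comes from this simple proportional correction.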
This Lidar is used in the TerraSentia’s ‘structure from motion’ (SfM) approach, an optical flow-like technique that relies on the forward movement of the UGV to build 3D shape information from the rapid 2D measurement updates.

“When you go out and measure plants, there’s a human bias – plant height is a ‘fuzzy’ thing,” Chowdhary says. “One person could measure it totally differently from another, and the same goes for plant width, which is actually far harder to measure accurately across the breadth of the plant.”

As it moves forwards and generates 3D maps of each row, the TerraSentia’s upward-pointing Lidar can collect data on plant count, emergence (how many plants have sprouted) and plant height. Two of the cameras, one on each side, point outwards and, using embedded computer vision algorithms, combine with the Lidar’s SfM to give accurate measurements of plant stem width. That is a particularly important biological marker for the health of corn and sorghum.

“They are 1080p cameras, but we don’t need to run them at 1080p, because we’re already so close to the plant that we don’t necessarily need such high resolution,” Chowdhary notes.

The third camera sits atop the vehicle and points upwards, using a fisheye lens to capture the whole canopy in one view. It measures how much sunlight infiltrates the canopy against how much leaf area blocks the light, which is a measure of canopy density. That in turn can be used to estimate biomass.

“All four of these phenotypes – plant count, height, width and leaf area index – directly correlate with biomass, and biomass directly correlates with farming yield,” Chowdhary explains.
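The principle behind building 3D shape from a rear-facing 2D Lidar is that the vehicle's own forward motion supplies the third axis: each 2D scan is a vertical slice, and stacking slices at their odometry positions yields a point cloud. A minimal sketch, assuming straight, level travel and idealised odometry (both simplifications):

```python
import math

def aggregate_scans(scans, scan_x_positions):
    """Build a 3D point cloud from successive 2D Lidar scans taken while
    the UGV drives forwards, in the spirit of the 'peacock tail' Lidar.

    Each scan is a list of (angle_rad, range_m) pairs measured in the
    vertical plane perpendicular to travel; scan_x_positions gives the
    vehicle's along-row position (from odometry) when each scan was taken.
    Illustrative sketch: assumes straight, level travel.
    """
    cloud = []
    for scan, x in zip(scans, scan_x_positions):
        for angle, rng in scan:
            y = rng * math.cos(angle)   # lateral offset within the scan plane
            z = rng * math.sin(angle)   # height above the sensor
            cloud.append((x, y, z))     # forward motion supplies the 3rd axis
    return cloud
```

Phenotypes such as plant height then fall out of the cloud, for example as the maximum z within each plant's x-interval, which removes the human bias Chowdhary describes.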
“On top of these, we can also monitor for more immediate concerns such as weeds, disease markers, stress markers and other things that can matter more to growers than breeders.”

As this data is collected, a breeder could be walking behind the vehicle, gathering more qualitative types of information such as plant vigour or subtle differences in growth stages.

When the TerraSentia reaches the end of a row, GNSS is typically once again good enough to guide it around to the next row. The wheels turn using skid-steering, with the left wheels mechanically independent of the right, all controlled by algorithms similar to those used by a UAV. Originally, there had been requests by…

[Caption: Rather than using a 3D Lidar, the UGV generates 3D measurements of the surrounding plants by aggregating its 2D Lidar measurements]
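Skid-steering with mechanically independent left and right sides reduces to a simple mixing law: a forward-speed command and a yaw-rate command are converted into per-side wheel speeds. A hedged sketch of that mixing, with an illustrative track width and speed limit rather than TerraSentia specifications:

```python
def skid_steer_mix(v, w, track_width=0.3, max_speed=1.0):
    """Convert a body velocity command (v m/s forward, w rad/s yaw,
    positive = turn left) into left/right wheel-side speeds (m/s)
    for a skid-steer UGV.

    track_width and max_speed are assumed illustrative values.
    """
    left = v - w * track_width / 2.0
    right = v + w * track_width / 2.0
    # If either side exceeds the speed limit, scale both together
    # so the commanded turning ratio is preserved
    m = max(abs(left), abs(right))
    if m > max_speed:
        left, right = left * max_speed / m, right * max_speed / m
    return left, right
```

Turning in place is just `v = 0` with a non-zero `w`, which drives the two sides in opposite directions; the end-of-row turn-around GNSS guides the vehicle through is a sequence of such commands.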
