Forage management optimization based on robotics and artificial intelligence

Last changed: 30 August 2024
A robot on wheels in a field.

The aim of this project was to evaluate how Unmanned Ground Vehicles (UGVs) coupled with remote sensors could become a tool for forage management in boreal regions.

Background

Forages are the most important crops in Northern Sweden, representing more than 70% of the total agricultural land use. The production performance of these crops affects the whole livestock system, as forages are the primary source of feed for ruminants. Helping farmers adjust their practices based on relevant information would increase the efficiency of the dairy and meat production industries. Remote sensing offers relevant solutions for monitoring crops and fields.

Satellite images cover a large geographical extent, at the cost of a moderate to coarse pixel size. Drones provide high-resolution spatial information, but they are limited by weather conditions, restrictive regulations, and a very limited payload. Unmanned ground vehicles (UGVs) are not affected by these issues and could become efficient scouting tools to help farmers in their decision-making.

Project description

A pilot UGV was developed, equipped with a range of cameras and sensors, and tested in field conditions to (i) report the potential and limitations of UGVs as a platform for sensors, and (ii) assess the accuracy and robustness of the various sensors in retrieving relevant agronomic information, such as clover content and biomass production.

The project was divided into two phases. In the first phase, we assembled the robot, considering its design, structure, and payload arrangement (see picture above). In the second phase, we performed field trials to explore the robot's operational potential in field conditions.

The robot was equipped with RGB-D cameras, spectrometers, a high-precision GPS, a PC to record the RGB-D data, a compact tablet to record the spectral data, an Inertial Measurement Unit (IMU), and batteries. The RGB-D cameras combine a classic RGB sensor with a depth sensor, providing true-colour images and videos together with per-pixel distances from the sensor to the target (such as the leaves in the field); these can serve as proxies for estimating botanical composition and biomass. Spectrometers are non-imaging sensors that collect spectral information over different regions of the electromagnetic spectrum, which can be used to assess several biophysical characteristics of the vegetation. The GPS supports autonomous navigation of the robot and geo-locates the collected data. The IMU supports control of the robot, and the batteries power driving and data acquisition.
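To illustrate how depth imagery can act as a proxy for biomass, the sketch below estimates mean canopy height from a depth image. This is a hypothetical, minimal example, not the project's actual processing pipeline; it assumes a downward-looking camera at a known height above bare ground, with depth values in metres.

```python
import numpy as np

def mean_canopy_height(depth_m: np.ndarray, camera_height_m: float) -> float:
    """Estimate mean canopy height (m) from a nadir-looking depth image.

    depth_m: per-pixel distance from the sensor to the vegetation, in metres.
    camera_height_m: distance from the sensor to bare ground, in metres.
    Invalid pixels (zeros or NaNs) are ignored.
    """
    valid = np.isfinite(depth_m) & (depth_m > 0)
    # Canopy height = camera height minus distance to the vegetation surface.
    heights = camera_height_m - depth_m[valid]
    return float(np.clip(heights, 0.0, None).mean())

# Synthetic example: camera 1.0 m above ground, canopy surface 0.75 m away,
# so the canopy is about 0.25 m tall.
depth = np.full((4, 4), 0.75)
print(mean_canopy_height(depth, 1.0))  # → 0.25
```

Canopy height alone is a crude proxy; in practice such estimates would be calibrated against field-sampled biomass, as the project's future-work plans suggest.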

The robot showed potential for use in field conditions, continuously acquiring pictures (Figure 2) that can be used to estimate botanical composition (Figure 3), as well as spectral data that can be used to monitor forage fields, mainly harvested fields (leys).

Future work would involve field trials with the robot, sample collection to build models, and statistical analysis to evaluate its potential to support forage quality and yield estimation.

Two images of different plants growing on soil, one in colour and one in black and grey.

Figure 2. An example of an RGB (left) and a depth (right) image acquired by the RGB-D camera mounted on the robot.

Two pictures of grass and clover growing in soil, one in green and one in blue and brown.

Figure 3. An example of an RGB image acquired by the robot and classified into grass, clover, and weed using the CloverSense platform (https://cloversense.net/).

Collaboration

This project was a collaboration between the Swedish University of Agricultural Sciences (SLU) and Luleå University of Technology (LTU). LTU developed and assembled the robot, and SLU analyzed the sensors and the data collected.

Contacts

Julianne Oliveira, Researcher
Department of Crop Production Ecology, Crop production science
Swedish University of Agricultural Sciences
julianne.oliveira@slu.se, +46 90-786 87 24, +46 76-763 71 29

David Parsons, Professor
Department of Crop Production Ecology, Crop production science
Swedish University of Agricultural Sciences
david.parsons@slu.se, +46 90-786 87 14, +46 73-089 34 45

Jakub Haluska, Senior Research Engineer
Department of Computer Science, Electrical and Space Engineering
Luleå University of Technology
jakub.haluska@ltu.se, +46 920 49 31 35

Christoforos Kanellakis, Associate Senior Lecturer
Department of Computer Science, Electrical and Space Engineering
Luleå University of Technology
christoforos.kanellakis@ltu.se, +46 920 49 23 63

Ilektra Tsimpidi, PhD Student
Department of Computer Science, Electrical and Space Engineering
Luleå University of Technology
ilektra.tsimpidi@ltu.se

George Nikolakopoulos, Professor
Department of Computer Science, Electrical and Space Engineering
Luleå University of Technology
george.nikolakopoulos@ltu.se, +46 920 49 12 98
