Flourish: Aerial Data Collection and Analysis and Automated Ground Intervention for Precision Farming

Flourish (2015-2018, GA n°644227) is a European project coordinated by the Autonomous Systems Lab at ETH Zürich (Switzerland), focusing on aerial and ground robotics for precision farming. The project abstract is reproduced below:

“To feed a growing world population with the given amount of available farm land, we must develop new methods of sustainable farming that increase yield while reducing reliance on herbicides and pesticides. Precision agricultural techniques seek to address this challenge by monitoring key indicators of crop health and targeting treatment only to plants that need it. This is a time consuming and expensive activity and while there has been great progress on autonomous farm robots, most systems have been developed to solve only specialized tasks. This lack of flexibility poses a high risk of no return on investment for farmers.

The goal of the Flourish project is to bridge the gap between the current and desired capabilities of agricultural robots by developing an adaptable robotic solution for precision farming. By combining the aerial survey capabilities of a small autonomous multi-copter Unmanned Aerial Vehicle (UAV) with a multi-purpose agricultural Unmanned Ground Vehicle, the system will be able to survey a field from the air, perform targeted intervention on the ground, and provide detailed information for decision support, all with minimal user intervention. The system can be adapted to a wide range of crops by choosing different sensors and ground treatment packages. This development requires improvements in technological abilities for safe accurate navigation within farms, coordinated multi-robot mission planning that enables large field survey even with short UAV flight times, multispectral three-dimensional mapping with high temporal and spatial resolution, ground intervention tools and techniques, data analysis tools for crop monitoring and weed detection, and user interface design to support agricultural decision making. As these aspects are addressed in Flourish, the project will unlock new prospects for commercial agricultural robotics in the near future.”

Here is a video highlighting our contribution to weed removal and tracking:

And here is a video on our multi-steering control framework for the BoniRob:

Our main contributions on this project are:

1/ Ground intervention in the field with the UGV (a BoniRob):

The Flourish ground intervention module is a self-contained system designed to perform mechanical weed removal and selective spraying during on-field operation. It consists of a weed detection module, a tracking module, and two treatment modules: one for mechanical weed removal and one for selective spraying.

In this project, we developed the tracking module and integrated all the other modules in a setup with multiple non-overlapping cameras. The proposed system precisely tracks detected targets across cameras whose fields of view do not overlap and predicts the triggering time of the intervention for each target. Our method was implemented and evaluated on the BoniRob and tested in the field: the experimental results demonstrate that the system can accurately stamp or spray targets in various terrain conditions.
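As a rough illustration of how these modules fit together, here is a minimal Python sketch of the data flow from classification to treatment triggering. The class and function names (`Target`, `Tracker`, `step`) and the robot-frame convention are illustrative assumptions, not the actual BoniRob interfaces.

```python
from dataclasses import dataclass
from typing import Callable, List

@dataclass
class Target:
    """A plant classified as a weed in Camera 1, tracked until treatment."""
    target_id: int
    x_m: float             # position along the driving direction, metres behind Camera 1
    treated: bool = False

class Tracker:
    """Keeps every active target and moves it by the measured ground displacement."""
    def __init__(self) -> None:
        self.targets: List[Target] = []

    def add(self, new_targets: List[Target]) -> None:
        self.targets.extend(new_targets)

    def propagate(self, dx_m: float) -> None:
        # as the robot advances by dx_m, every target slides backwards by dx_m
        # in the robot frame (x axis pointing from Camera 1 towards the tool)
        for t in self.targets:
            t.x_m += dx_m

    def due(self, tool_x_m: float) -> List[Target]:
        # targets that have reached (or passed) the treatment tool
        return [t for t in self.targets if not t.treated and t.x_m >= tool_x_m]

def step(tracker: Tracker,
         detections: List[Target],
         dx_m: float,
         tool_x_m: float,
         fire: Callable[[Target], None]) -> None:
    """One cycle: ingest the classifier output, propagate, trigger the treatment tool."""
    tracker.add(detections)
    tracker.propagate(dx_m)
    for target in tracker.due(tool_x_m):
        fire(target)
        target.treated = True
```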

The tracking module:

The module is illustrated in the figure below: the ground moves to the right, and the same target is shown at different times.

When a plant enters the field of view of Camera 1, it is classified as crop or weed; the classification results are received just before the plant leaves Camera 1's field of view. The plants classified as weeds (the targets) are tracked in Camera 1 using visual odometry, and the predicted position of each target is corrected with an observation obtained by template matching.
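As an illustration of this predict-and-correct step, here is a minimal sketch assuming a scalar position state propagated by the visual-odometry displacement and corrected by a template-matching observation (OpenCV's `matchTemplate`). The noise parameters and function names are illustrative, not the values used on the BoniRob.

```python
import cv2

def predict(x: float, P: float, dx_odom: float, q: float = 1e-4):
    """Propagate the target position (pixels) by the visual-odometry displacement."""
    x_pred = x + dx_odom          # ground motion measured by visual odometry
    P_pred = P + q                # inflate the variance by the process noise
    return x_pred, P_pred

def observe(frame_gray, template_gray):
    """Locate the stored image patch in the current frame by template matching."""
    result = cv2.matchTemplate(frame_gray, template_gray, cv2.TM_CCOEFF_NORMED)
    _, score, _, max_loc = cv2.minMaxLoc(result)
    return max_loc[0], score      # x pixel of the best match and its confidence

def correct(x_pred: float, P_pred: float, z: float, r: float = 4.0):
    """Scalar Kalman-style update of the predicted position with the measurement z."""
    K = P_pred / (P_pred + r)
    x_corr = x_pred + K * (z - x_pred)
    P_corr = (1.0 - K) * P_pred
    return x_corr, P_corr
```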

The targets are then tracked in the same way between the two cameras: the position of each target entering Camera 2's field of view is predicted and corrected, and the same predict-and-correct cycle continues within Camera 2 to estimate an accurate triggering time for the stamper (the weed removal module).

Figure 1: Generic method
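The inter-camera handoff and the triggering-time estimate can be summarised in a few lines. The sketch below assumes target positions expressed in metres along the driving direction (measured from Camera 1), a calibrated baseline between the two cameras, a known offset between Camera 1 and the stamper, and a ground speed estimated from odometry; all names and numbers are illustrative.

```python
def position_in_camera2(x_m: float, baseline_m: float) -> float:
    """Express a target position (metres behind Camera 1, along the driving
    direction) in Camera 2's frame, given the calibrated distance between the
    two non-overlapping fields of view."""
    return x_m - baseline_m

def trigger_time(x_m: float, stamper_x_m: float,
                 ground_speed_mps: float, now_s: float) -> float:
    """Estimate when the stamper should fire: the remaining distance between the
    corrected target position and the stamper axis, divided by the ground speed."""
    return now_s + (stamper_x_m - x_m) / ground_speed_mps

# e.g. a target 0.25 m behind Camera 1, a stamper mounted 0.90 m behind Camera 1,
# and a robot driving at 0.4 m/s: the stamper should fire about 1.6 s from now.
print(trigger_time(0.25, 0.90, 0.4, now_s=0.0))
```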

Tracking results:

Right = Camera 1 (red = classification results, green = corrected target)

Left = Camera 2 (yellow = inter-camera prediction, blue = corrected, red = intra-Camera-2 prediction, green = corrected)


A few videos illustrating spraying and stamping:

Flourish Selective Spraying

3Sat – Flourish Stamping

2/ Coordinated UAV and UGV operations:

The Flourish project is built on three concepts: the UAV operations, the UGV operations, and the collaboration between the two robots and the farm operator. The UAV's main mission is to regularly collect sensor data over the field throughout the growing season. The UGV has two main missions: performing interventions in the field, and charging and transporting the UAV to different parts of the field using external roads or paths. The collaboration between the UAV and the UGV is essential to unlock the potential of the Flourish system. In this project, we developed the framework that lets both robots collaborate, ensuring that they successfully achieve their individual missions while being able to rendezvous or synchronize when required. The task management framework is built upon ROS Task Manager, developed in our lab.
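The actual framework is built on ROS Task Manager; the snippet below does not use its API but illustrates, with plain Python threads and a barrier, the underlying idea of two missions that run independently and block at a shared rendezvous point. The task names are made up for the example.

```python
import threading
import time

rendezvous = threading.Barrier(2)   # both robots must reach this point before continuing

def uav_mission():
    for task in ("takeoff", "survey_block_A", "return_to_ugv"):
        print(f"[UAV] {task}")
        time.sleep(0.1)             # stand-in for the real task execution
    rendezvous.wait()               # wait for the UGV before landing on its platform
    print("[UAV] land_and_charge")

def ugv_mission():
    for task in ("drive_to_block_A", "weed_intervention", "reach_pickup_point"):
        print(f"[UGV] {task}")
        time.sleep(0.1)
    rendezvous.wait()               # synchronise with the UAV for the pick-up
    print("[UGV] transport_uav_to_block_B")

if __name__ == "__main__":
    threads = [threading.Thread(target=uav_mission),
               threading.Thread(target=ugv_mission)]
    for t in threads:
        t.start()
    for t in threads:
        t.join()
```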
For more information: http://flourish-project.eu/
