Agricultural multi-month dataset with overlapping paths for mapping and localisation algorithms for autonomous robots
In March 2022, we started a long-term data acquisition campaign at the Ktima Gerovassiliou vineyard, which extends over more than 100 ha on the outskirts of Epanomi, Greece. Additionally, in July 2023, as an extension of our data collection efforts at Ktima Gerovassiliou, we conducted another data acquisition campaign at our vineyard on the Riseholme campus of the University of Lincoln, UK. As shown in the image below (taken in March and June at Ktima Gerovassiliou), agricultural environments present seasonal changes, repetitive structures, uneven terrain and varying weather conditions, all of which make long-term autonomy a challenging problem for robots.
Motivated by these challenging conditions and the lack of such an agricultural dataset in the literature, we present the BLT dataset. Its primary objective is to support the development and evaluation of mapping and localisation algorithms for long-term autonomous robots operating in agricultural fields. Thanks to its temporal aspect, however, we believe the dataset can also be used for phenotyping and crop-mapping tasks.
Campaign
Ktima Gerovassiliou vineyard
For data collection purposes, we used 5 adjacent vineyard corridors close to the garage where the robot is usually stored (see the data collection area marked in yellow in the image below).
In each recorded session, the robot autonomously traverses a human-designed topological path (the same path for all sessions) along the edges connecting intermediate nodes. Taking the figure below as a reference, the designed path is:
B → G → F → A → B → G → H → C → D → I → J → E → J → I → D → C → H → G → B.
The path first makes a full loop around the first 2 corridors, enabling loop-closure detection tests, and then covers each corridor twice, once in each direction, so that the side-mounted RGB-D camera can observe all the vine row canopies from both sides.
The entire path is approximately 500 m long and takes the robot, on average, 25 min to complete. The average robot speed while traversing a corridor is 0.6 m/s, while the speed during row-change operations is significantly lower.
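As a quick sanity check of these figures, the path can be written down as its node sequence and the corridor-driving time recovered from the quoted length and speed. The sketch below is illustrative only (the variable names are ours, not part of the dataset tooling):

```python
# Minimal sketch (not part of the released tooling): the recorded topological
# path as a node sequence, plus a rough traversal-time estimate based on the
# figures quoted above.
PATH = ["B", "G", "F", "A", "B", "G", "H", "C", "D", "I",
        "J", "E", "J", "I", "D", "C", "H", "G", "B"]

PATH_LENGTH_M = 500.0      # approximate total path length
CORRIDOR_SPEED_MPS = 0.6   # average speed inside a corridor

# Time spent driving at corridor speed, ignoring the slower row changes.
driving_time_min = PATH_LENGTH_M / CORRIDOR_SPEED_MPS / 60.0
print(f"Corridor driving time: ~{driving_time_min:.0f} min; the remaining "
      f"~{25 - driving_time_min:.0f} min of the 25 min average is spent on "
      f"slower row-change manoeuvres.")
```

This accounts for roughly 14 of the 25 minutes, with the rest taken up by the slower row changes at the corridor ends.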
Riseholme Vineyard
For the Riseholme dataset, the robot was manually teleoperated along similar paths across the 5 recorded sessions; the vineyard is shown in the image below.
Vineyard at Riseholme campus, University of Lincoln, Lincoln, UK. The traversed paths for each session are as follows:
- 17th July: B → G → H → C → D → I → J → E
- 25th July: B → G → H → C → D → I → J → E
- 26th July: B → G → F → A → B → G → H → C → D → I → J → E
- 31st July: B → G → F → A → B → G → H → C
- 7th August: B → G → F → A → B → G → H → C → D → I → J → E
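For convenience, the same per-session paths can be expressed as a mapping from session date to node sequence. The structure below is a hypothetical sketch (it is not shipped with the dataset), assuming the last two sessions correspond to 31st July and 7th August as listed under Data stored:

```python
# Hypothetical convenience structure: Riseholme session date -> node sequence.
RISEHOLME_PATHS = {
    "2023-07-17": ["B", "G", "H", "C", "D", "I", "J", "E"],
    "2023-07-25": ["B", "G", "H", "C", "D", "I", "J", "E"],
    "2023-07-26": ["B", "G", "F", "A", "B", "G", "H", "C", "D", "I", "J", "E"],
    "2023-07-31": ["B", "G", "F", "A", "B", "G", "H", "C"],
    "2023-08-07": ["B", "G", "F", "A", "B", "G", "H", "C", "D", "I", "J", "E"],
}

# Example: sessions whose traversal reaches the last corridor node E.
full_runs = [date for date, path in RISEHOLME_PATHS.items() if path[-1] == "E"]
print(full_runs)
```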
Robot platform
The robot used for data collection in both campaigns is a customised SAGA Robotics Thorvald II platform. We use a four-wheel drive and steer (4WD4S) wheel setup with a base length and width of 1.5 m and 1 m, respectively.
The robot is equipped with a multi-modal sensor suite. The following table lists the sensor specifications, and the image illustrates their locations on the robot.
Data stored
The objective was to collect data covering the whole season in order to capture the crop’s entire growth cycle. So far, we have a total of 11 sessions recorded in 2022 at Ktima Gerovassiliou and 5 sessions recorded in 2023 at the Riseholme campus. The data was collected on the following dates:
Ktima Gerovassiliou (2022):
23rd March
6th and 20th April
6th and 18th May
1st, 8th, 22nd and 29th June
13th July
15th September
Riseholme (2023):
17th*, 25th*, 26th and 31st July
7th August
*No RGB-D data, only LiDAR
Data recording took place at varying time intervals to accommodate the crop’s variable growth rate: for example, only one session was performed in March at Ktima Gerovassiliou, while four sessions (roughly one week apart) were completed in June. This is because the plants grow very slowly during winter, offering no significant visual changes between consecutive weeks, whereas in the warmest months of the year their appearance changes drastically within a few days.
Moreover, the Riseholme dataset is valuable for assessing the generalisation capabilities of algorithms: for example, a deep learning model trained at Ktima Gerovassiliou can be rigorously evaluated on the Riseholme data.
The data is stored as rosbags, one per session. Each rosbag contains the sensor streams as well as navigation data. The full list of ROS topics found in the rosbags is given in the following tables for both campaigns.
Ktima Gerovassiliou
Riseholme
*Not available for 17th and 25th of July
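Because each session is a single rosbag, the recorded topics can also be inspected programmatically. Below is a minimal sketch using the standard ROS 1 `rosbag` Python API; the bag file name is a placeholder, not the actual naming scheme:

```python
# Minimal sketch: list the topics and message counts in one session's rosbag.
# Requires a ROS 1 environment with the `rosbag` Python package available.
import rosbag

with rosbag.Bag("session_2022-06-01.bag") as bag:
    info = bag.get_type_and_topic_info()
    for topic, topic_info in sorted(info.topics.items()):
        print(f"{topic:60s} {topic_info.msg_type:40s} "
              f"{topic_info.message_count} msgs")
```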
Data examples
Different crop growth stages from March to June, seen from the front RGB-D camera video stream.
Point cloud maps built from the 3D LiDAR data using the FAST-LIO algorithm, from March to June. The points associated with the ground have been removed for clearer visualisation. The point clouds from March and April look almost identical because the plants are still dormant, while in May and June the canopy becomes noticeably denser.
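To illustrate the ground-removal step mentioned above, here is a minimal sketch using Open3D's RANSAC plane segmentation on an exported point cloud map. This is an illustration under the assumption of roughly planar ground, not necessarily the exact filtering used for the figures; the file names are placeholders:

```python
# Minimal sketch of ground-point removal from a point cloud map, assuming the
# ground is approximately planar. File names are placeholders.
import open3d as o3d

pcd = o3d.io.read_point_cloud("map_2022-06-01.pcd")

# Fit the dominant plane (the ground) with RANSAC and discard its inliers.
plane_model, ground_idx = pcd.segment_plane(distance_threshold=0.1,
                                            ransac_n=3,
                                            num_iterations=1000)
canopy = pcd.select_by_index(ground_idx, invert=True)

o3d.io.write_point_cloud("map_2022-06-01_no_ground.pcd", canopy)
```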
Data evaluation
Data access
Please fill out the form below to receive a link with access to the entire dataset. We will only use your name and email to keep track of the use of our datasets. We will not share this information and will only summarise it collectively as statistics.
Reference
@article{blt2022dataset,
  title   = {BLT Dataset: acquisition of the agricultural Bacchus Long-Term Dataset with automated robot deployment},
  author  = {Polvara, Riccardo and Molina, Sergi and Hroob, Ibrahim and Papadimitriou, Alexios and Tsiolis, Konstantinos and Giakoumis, Dimitrios and Likothanassis, Spiridon and Tzovaras, Dimitrios and Cielniak, Grzegorz and Hanheide, Marc},
  journal = {Journal of Field Robotics, Agricultural Robots for Ag 4.0}
}