A Multi-Modal Dataset for Automated Phenological Stage Mapping in Actinidia chinensis


Abstract

Phenological monitoring of Actinidia chinensis is critical for controlling operational costs and predicting yield. However, current manual assessment methods are time-consuming, making them impractical for large-scale precision agriculture applications, and most existing phenological datasets focus exclusively on image data without spatial validation. The Multi-Modal Actinidia chinensis Phenology Dataset is composed of (i) 1 665 annotated images of phenological stages from bud to fruit set and (ii) georeferenced videos with systematic manual ground truth of spatial stage distributions. The dataset employs an adapted 17-class BBCH system that consolidates visually similar stages, excludes problematic categories, and introduces generic structural classes to address practical annotation difficulties. Additionally, the data is organised hierarchically across plant structures, genders, and phenological stages. The annotated images offer versatility for a range of applications, including training data for computer vision models that detect phenological stages, while the georeferenced videos facilitate the validation of automated counting algorithms. This combined approach enables plant-level detection accuracy and provides an illustrative methodology for spatial validation that users can extend to additional orchards, promoting the development and benchmarking of automated phenological monitoring systems for precision agriculture applications in kiwifruit production.
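A dataset organised hierarchically by plant structure, gender, and phenological stage could be consumed with a small path-parsing helper like the sketch below. The directory names and ordering here are purely illustrative assumptions, since the abstract does not specify the actual on-disk layout.

```python
from pathlib import PurePosixPath


def parse_annotation_path(rel_path: str) -> dict:
    """Split a hypothetical dataset path of the form
    <plant_structure>/<gender>/<bbch_stage>/<image_file>
    into its hierarchy levels.

    NOTE: this layout is an assumption for illustration; the
    dataset's real directory structure may differ.
    """
    structure, gender, stage, image = PurePosixPath(rel_path).parts
    return {
        "structure": structure,  # e.g. flower, shoot (hypothetical)
        "gender": gender,        # e.g. female, male (hypothetical)
        "stage": stage,          # adapted BBCH class label
        "image": image,          # image file name
    }
```

Example use: `parse_annotation_path("flower/female/BBCH65/img_0001.jpg")` returns a dictionary mapping each hierarchy level to its value, which makes it straightforward to group annotations by stage when assembling training splits.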
