DeepPhenoTree – Apple Edition: a multi-site annotated RGB dataset for apple phenology with deep learning baseline models
Abstract
In machine learning–driven plant phenotyping, well-annotated image datasets are essential for developing robust models that capture phenological variability across environments. Here, we introduce DeepPhenoTree – Apple Edition, a multi-site, multi-variety RGB image dataset dedicated to the detection of key phenological stages in apple trees. The dataset comprises 48,320 time-stamped RGB images acquired across four European orchards of the Apple REFPOP consortium under contrasting climatic conditions. From this corpus, a carefully curated subset of 808 representative images was manually annotated, yielding 241,600 expert annotations that cover developmental stages from dormant bud to fruit maturity. Images were acquired with a standardized tractor-mounted phenotyping platform equipped with active flash illumination, ensuring consistent lighting across sites and acquisition dates. Phenological structures were annotated following the BBCH scale, with bounding boxes adapted to organ visibility and developmental stage. In addition to the dataset, we provide baseline deep learning experiments that illustrate detection performance and assess model generalization across locations.
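To make the annotation scheme concrete, the sketch below shows how a BBCH-labelled bounding-box annotation might be represented and filtered by developmental stage. This is a hypothetical illustration: the abstract does not specify the release file format, so the record fields, image identifiers, and the `filter_by_stage` helper are assumptions, not the dataset's actual API.

```python
from dataclasses import dataclass

# Hypothetical record layout; the actual DeepPhenoTree release format
# (file names, field names, BBCH encoding) is not given in the abstract.
@dataclass
class PhenoAnnotation:
    image_id: str          # time-stamped RGB image identifier (assumed naming)
    bbox: tuple            # (x_min, y_min, width, height) in pixels
    bbch_stage: int        # BBCH code, from dormant bud through fruit maturity
    site: str              # one of the four Apple REFPOP orchard sites

def filter_by_stage(annotations, min_stage, max_stage):
    """Keep annotations whose BBCH code lies within [min_stage, max_stage]."""
    return [a for a in annotations if min_stage <= a.bbch_stage <= max_stage]

# Toy example: select flowering-range annotations (BBCH 60-69).
anns = [
    PhenoAnnotation("site1_2021-03-01_0001", (10, 20, 50, 40), 0, "site1"),
    PhenoAnnotation("site2_2021-04-15_0032", (120, 80, 30, 30), 65, "site2"),
]
flowering = filter_by_stage(anns, 60, 69)
print(len(flowering))  # → 1
```

Filtering by BBCH range is one natural way such annotations could feed stage-specific detection experiments, e.g. training a detector only on flowering-stage organs.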