Details
Name
Isabel Alexandra Pinheiro
Role
Research Assistant
Since
1st November 2022
Nationality
Portugal
Centre
Robotics in Industry and Intelligent Systems
Contacts
+351 222 094 171
isabel.a.pinheiro@inesctec.pt
2025
Authors
Castro, JT; Pinheiro, I; Marques, MN; Moura, P; dos Santos, FN;
Publication
Lecture Notes in Computer Science (including subseries Lecture Notes in Artificial Intelligence and Lecture Notes in Bioinformatics)
Abstract
In nature, and particularly in agriculture, pollination is fundamental to the sustainability of our society. It is a vital process underlying crop yield and quality and is responsible for the biodiversity and standards of the flora. Bees play a crucial role in natural pollination; however, their populations are declining. Robots can help maintain pollination levels while humans work to recover bee populations. Swarm robotics approaches appear promising for robotic pollination. This paper proposes cooperation between multiple Unmanned Aerial Vehicles (UAVs) and an Unmanned Ground Vehicle (UGV), leveraging the advantages of collaborative work for pollination, in a system referred to as Pollinationbots. Pollinationbots is based on swarm behaviors and methodologies to implement more effective pollination strategies, ensuring efficient pollination across various scenarios. The paper presents the architecture of the Pollinationbots system, which was evaluated in the Webots simulator with a focus on path planning and follower behavior. Preliminary simulation results indicate that this is a viable solution for robotic pollination. © The Author(s), under exclusive license to Springer Nature Switzerland AG 2025.
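A minimal sketch of a leader-follower behavior of the kind evaluated in Webots, assuming a simple proportional controller that steers a follower UAV toward a standoff point behind the leader; the function name, gain, and standoff distance are illustrative assumptions, not the published Pollinationbots implementation.

    import math

    def follower_velocity(leader_xy, leader_heading, follower_xy,
                          standoff=1.5, gain=0.8):
        # Target point sits `standoff` metres behind the leader.
        tx = leader_xy[0] - standoff * math.cos(leader_heading)
        ty = leader_xy[1] - standoff * math.sin(leader_heading)
        # Proportional velocity command toward the target point.
        return (gain * (tx - follower_xy[0]), gain * (ty - follower_xy[1]))

    # Follower 3 m behind a leader heading along +x: command is pure forward motion.
    vx, vy = follower_velocity((5.0, 2.0), 0.0, (2.0, 2.0))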
2024
Authors
Pinheiro, I; Moreira, G; Magalhaes, S; Valente, A; Cunha, M; dos Santos, FN;
Publication
SCIENTIFIC REPORTS
Abstract
Pollination is critical for crop development, especially for crops essential to subsistence. This study addresses the pollination challenges faced by Actinidia, a dioecious plant characterised by female and male flowers on separate plants. Despite the high protein content of its pollen, the absence of nectar in kiwifruit flowers poses difficulties in attracting pollinators. Consequently, there is a growing interest in using artificial intelligence and robotic solutions to enable pollination even in unfavourable conditions. These robotic solutions must be able to accurately detect flowers and discern their genders for precise pollination operations. Specifically, upon identifying a female Actinidia flower, the robotic system should approach the stigma to release pollen, while for a male Actinidia flower it should target the anthers to collect pollen. We identified two primary research gaps: (1) the lack of gender-based flower detection methods and (2) the underutilisation of contemporary deep learning models in this domain. To address these gaps, we evaluated the performance of four pretrained models (YOLOv8, YOLOv5, RT-DETR and DETR) in detecting and determining the gender of Actinidia flowers. We outlined a comprehensive methodology and developed a dataset of manually annotated flowers categorised into two classes based on gender. Our evaluation utilised k-fold cross-validation to rigorously test model performance across diverse subsets of the dataset, addressing the limitations of conventional data splitting methods. DETR provided the most balanced overall performance, achieving precision, recall, F1 score and mAP of 89%, 97%, 93% and 94%, respectively, highlighting its robustness in managing complex detection tasks under varying conditions. These findings underscore the potential of deep learning models for effective gender-specific detection of Actinidia flowers, paving the way for advanced robotic pollination systems.
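As a rough illustration of the k-fold protocol the abstract describes, the sketch below splits an image list into five folds and trains one pretrained detector per fold. The paths, the per-fold data.yaml layout, and the choice of checkpoint are assumptions for illustration, not the authors' actual pipeline.

    from pathlib import Path
    from sklearn.model_selection import KFold
    from ultralytics import YOLO

    images = sorted(Path("actinidia/images").glob("*.jpg"))  # hypothetical dataset path
    kfold = KFold(n_splits=5, shuffle=True, random_state=0)

    for fold, (train_idx, val_idx) in enumerate(kfold.split(images)):
        # Per-fold train/val file lists for a YOLO-style data.yaml to reference.
        Path(f"fold{fold}_train.txt").write_text("\n".join(str(images[i]) for i in train_idx))
        Path(f"fold{fold}_val.txt").write_text("\n".join(str(images[i]) for i in val_idx))

        model = YOLO("yolov8n.pt")  # pretrained checkpoint, as in the study
        model.train(data=f"fold{fold}.yaml", epochs=100)  # hypothetical per-fold yaml
        metrics = model.val()  # precision, recall and mAP on the held-out fold
        print(fold, metrics.box.map50)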
2023
Authors
Pinheiro, I; Aguiar, A; Figueiredo, A; Pinho, T; Valente, A; Santos, F;
Publication
APPLIED SCIENCES-BASEL
Abstract
Currently, Unmanned Aerial Vehicles (UAVs) are considered in the development of various applications in agriculture, which has led to the expansion of the agricultural UAV market. However, Nano Aerial Vehicles (NAVs) are still underutilised in agriculture. NAVs are characterised by a maximum wing length of 15 centimetres and a weight of less than 50 g. Due to their physical characteristics, NAVs have the advantage of being able to approach and perform tasks with more precision than conventional UAVs, making them suitable for precision agriculture. This work aims to contribute an open-source solution known as Nano Aerial Bee (NAB) to enable further research and development on the use of NAVs in an agricultural context. The purpose of NAB is to mimic and assist bees in the context of pollination. We designed this open-source solution by taking into account the existing state-of-the-art solutions and the requirements of pollination activities. This paper presents the relevant background and work carried out in this area by analysing papers on the topic of NAVs. The development of this prototype is rather complex given the interactions between the different hardware components and the need to achieve autonomous flight capable of pollination. We adequately describe and discuss these challenges in this work. Besides the open-source NAB solution, we train three different versions of YOLO (YOLOv5, YOLOv7, and YOLOR) on an original dataset (Flower Detection Dataset) containing 206 images of a group of eight flowers and on a public dataset (TensorFlow Flower Dataset), which we annotated to create the TensorFlow Flower Detection Dataset. The results of the models trained on the Flower Detection Dataset are satisfactory, with YOLOv7 and YOLOR achieving the best performance: 98% precision, 99% recall, and a 98% F1 score. The performance of these models is evaluated on the TensorFlow Flower Detection Dataset to test their robustness. The three YOLO models are also trained on the TensorFlow Flower Detection Dataset to better understand the results. In this case, YOLOR obtains the most promising results, with 84% precision, 80% recall, and an 82% F1 score. The models trained on the Flower Detection Dataset are used for NAB guidance: the relative position of the detected flower in the image defines the NAB execute command.
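The final step, turning a detection into a guidance command, could look like the hedged sketch below: the detected flower's offset from the image centre selects the command. The threshold and command names are illustrative assumptions, not the published NAB controller.

    def guidance_command(bbox, img_w=640, img_h=640, tol=0.1):
        """Map a flower bounding box (x_min, y_min, x_max, y_max) to a command."""
        cx = (bbox[0] + bbox[2]) / 2 / img_w - 0.5  # horizontal offset in [-0.5, 0.5]
        cy = (bbox[1] + bbox[3]) / 2 / img_h - 0.5  # vertical offset, positive = below centre
        if abs(cx) > tol:
            return "yaw_right" if cx > 0 else "yaw_left"
        if abs(cy) > tol:
            return "descend" if cy > 0 else "ascend"
        return "approach"  # flower roughly centred: move toward it

    print(guidance_command((300, 280, 380, 360)))  # -> "approach"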
2023
Authors
Pinheiro, I; Moreira, G; da Silva, DQ; Magalhaes, S; Valente, A; Oliveira, PM; Cunha, M; Santos, F;
Publication
AGRONOMY-BASEL
Abstract
The world wine sector is a multi-billion dollar industry with a wide range of economic activities. It is therefore crucial to monitor the grapevine, as this allows more accurate yield estimation and ensures a high-quality end product. The most common way of monitoring the grapevine is through the leaves (a preventive approach), since the leaves are the first to manifest biophysical lesions. However, this does not exclude the possibility of biophysical lesions manifesting in the grape berries. Thus, this work presents three pre-trained YOLO models (YOLOv5x6, YOLOv7-E6E, and YOLOR-CSP-X) to detect and classify grape bunches as healthy or damaged by the number of berries with biophysical lesions. Two datasets were created and made publicly available, with original images and manual annotations, to identify the complexity between the detection (bunches) and classification (healthy or damaged) tasks. The datasets use the same 10,010 images with different classes: the Grapevine Bunch Detection Dataset uses the Bunch class, and the Grapevine Bunch Condition Detection Dataset uses the OptimalBunch and DamagedBunch classes. The three models trained for grape bunch detection obtained promising results, with YOLOv7 standing out at 77% mAP and a 94% F1 score. For the task of detecting grape bunches and identifying their condition, the three models obtained similar results, with YOLOv5 achieving the best: an mAP of 72% and an F1 score of 92%.
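Since both public datasets share the same 10,010 images under different class schemes, deriving the single-class Bunch labels from the two-class condition labels could be as simple as the sketch below, assuming YOLO-format .txt annotations; the directory names and class indices are assumptions, not the published dataset layout.

    from pathlib import Path

    # Condition dataset (assumed): 0 = OptimalBunch, 1 = DamagedBunch.
    # The detection dataset collapses both into a single Bunch class (0).
    Path("bunch_labels").mkdir(exist_ok=True)
    for label_file in Path("condition_labels").glob("*.txt"):
        merged = ["0 " + line.split(maxsplit=1)[1]  # keep box coords, remap class
                  for line in label_file.read_text().splitlines() if line.strip()]
        (Path("bunch_labels") / label_file.name).write_text("\n".join(merged))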
2023
Authors
Moura, P; Pinheiro, I; Terra, F; Pinho, T; Santos, F;
Publication
The 3rd International Electronic Conference on Agronomy