Details
Name
António Valente
Position
Senior Researcher
Since
01 June 2012
Nationality
Portugal
Centre
Centro de Robótica Industrial e Sistemas Inteligentes
Contacts
+351 220 413 317
antonio.valente@inesctec.pt
2025
Authors
Oliveira, F; Tinoco, V; Valente, A; Pinho, T; Cunha, JB; Santos, N;
Publication
Lecture Notes in Computer Science (including subseries Lecture Notes in Artificial Intelligence and Lecture Notes in Bioinformatics)
Abstract
Pruning is an agricultural trimming procedure that is crucial in some plant species to promote healthy growth and increased yield. Generally, this task is done through manual labour, which is costly, physically demanding, and potentially dangerous for the worker. Robotic pruning is an automated alternative to manual labour for this task. This approach focuses on selective pruning and requires an end-effector capable of detecting and cutting the correct point on the branch to achieve efficient pruning. This paper reviews and analyses different end-effectors used in robotic pruning, which helped to understand the advantages and limitations of the different techniques and, subsequently, clarified the work required to enable autonomous pruning. © The Author(s), under exclusive license to Springer Nature Switzerland AG 2025.
2024
Authors
Ribeiro, J; Pinheiro, R; Soares, S; Valente, A; Amorim, V; Filipe, V;
Publication
Lecture Notes in Mechanical Engineering
Abstract
The manual monitoring of refilling stations in industrial environments can lead to inefficiencies and errors, which can impact the overall performance of the production line. In this paper, we present an unsupervised detection pipeline for identifying refilling stations in industrial environments. The proposed pipeline uses a combination of image processing, pattern recognition, and deep learning techniques to detect refilling stations in visual data. We evaluate our method on a set of industrial images, and the findings demonstrate that the pipeline is reliable at detecting refilling stations. Furthermore, the proposed pipeline can automate the monitoring of refilling stations, eliminating the need for manual monitoring and thus improving industrial operations’ efficiency and responsiveness. This method is a versatile solution that can be applied to different industrial contexts without the need for labeled data or prior knowledge about the location of refilling stations.
2024
Authors
Sarmento, J; dos Santos, FN; Aguiar, AS; Filipe, V; Valente, A;
Publication
JOURNAL OF INTELLIGENT & ROBOTIC SYSTEMS
Abstract
Human-robot collaboration (HRC) is becoming increasingly important in advanced production systems, such as those used in industries and agriculture. This type of collaboration can contribute to productivity increase by reducing physical strain on humans, which can lead to reduced injuries and improved morale. One crucial aspect of HRC is the ability of the robot to follow a specific human operator safely. To address this challenge, a novel methodology is proposed that employs monocular vision and ultra-wideband (UWB) transceivers to determine the relative position of a human target with respect to the robot. UWB transceivers can track humans carrying UWB tags but exhibit a significant angular error. To reduce this error, monocular cameras with deep learning object detection are used to detect humans. The reduction in angular error is achieved through sensor fusion, combining the outputs of both sensors using a histogram-based filter that projects and intersects the measurements from both sources onto a 2D grid. By combining UWB and monocular vision, a 66.67% reduction in angular error compared to UWB localization alone is achieved. The approach demonstrates an average processing time of 0.0183 s and an average localization error of 0.14 m when tracking a person walking at an average speed of 0.21 m/s. This algorithm holds promise for enabling efficient and safe human-robot collaboration, providing a valuable contribution to the field of robotics.
2024
Authors
Berger, GS; Mendes, J; Chellal, AA; Bonzatto, L; da Silva, YMR; Zorawski, M; Pereira, AI; Pinto, MF; Castro, J; Valente, A; Lima, J;
Publication
OPTIMIZATION, LEARNING ALGORITHMS AND APPLICATIONS, PT I, OL2A 2023
Abstract
This paper presents an approach to address the challenges of manual inspection by using multirotor Unmanned Aerial Vehicles (UAVs) to detect olive tree flies (Bactrocera oleae). The study employs computer vision techniques based on the You Only Look Once (YOLO) algorithm to detect insects trapped in yellow chromotropic traps. The research evaluates the performance of the YOLOv7 algorithm in detecting and quantifying olive tree flies using images obtained from two different digital cameras in a controlled environment at different distances and angles. The findings could contribute to the automation of insect pest inspection by UAV-based robotic systems and highlight potential avenues for future advances in this field. In the indoor experiments, the Arducam IMX477 camera acquired images with greater clarity than the TelloCam, making it possible to correctly highlight the Bactrocera oleae specimens in the different prediction models. The results demonstrate that, with data augmentation and auto-labelling techniques applied to the Bactrocera oleae image set, it was possible to obtain a prediction model that detected an average of 256 Bactrocera oleae against a ground-truth count of 270.
2024
Authors
Kalbermatter, RB; Franco, T; Pereira, AI; Valente, A; Soares, SP; Lima, J;
Publication
OPTIMIZATION, LEARNING ALGORITHMS AND APPLICATIONS, PT I, OL2A 2023
Abstract
People are living longer, creating new challenges in healthcare. Many older adults prefer to age in their own homes rather than in healthcare institutions. Portugal has seen a similar trend, and public and private home care solutions have been developed. However, age-related pathologies can affect an elderly person's ability to perform daily tasks independently. Ambient Assisted Living (AAL) is a domain that uses information and communication technologies to improve the quality of life of older adults. AI-based fall detection systems have been integrated into AAL studies, and posture estimation tools are important for monitoring patients. In this study, OpenCV and the YOLOv7 machine learning framework are used to develop a fall detection system based on posture analysis. To protect patient privacy, the use of a thermal camera is proposed to prevent facial recognition. The developed system was applied and validated in a real scenario.
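The detector itself is not reproduced here, but the kind of posture rule such systems apply to a detected person can be sketched in a few lines. The aspect-ratio criterion and threshold below are illustrative assumptions, not the paper's actual classifier:

```python
from dataclasses import dataclass

@dataclass
class Detection:
    x: float  # bounding-box top-left corner
    y: float
    w: float  # bounding-box width
    h: float  # bounding-box height

def is_fallen(det: Detection, ratio_thresh: float = 1.2) -> bool:
    """Posture heuristic: a standing person's bounding box is taller
    than wide; after a fall it typically becomes wider than tall."""
    return det.w / det.h > ratio_thresh
```

In a full pipeline, boxes like these would come from the YOLOv7 detector running on thermal frames, with temporal smoothing over several frames before raising an alarm.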
Supervised theses
2023
Author
Luís Miguel Sampaio Sanches Ferreira
Institution
UTAD
2022
Author
Afonso Magalhães Mota
Institution
UTAD
2022
Author
João Bastos Pintor
Institution
UTAD
2022
Author
Luís Carlos Feliz dos Santos
Institution
UTAD
2021
Author
Luís Carlos Feliz Santos
Institution
UTAD