2024
Authors
Sarmento, J; dos Santos, FN; Aguiar, AS; Filipe, V; Valente, A;
Publication
JOURNAL OF INTELLIGENT & ROBOTIC SYSTEMS
Abstract
Human-robot collaboration (HRC) is becoming increasingly important in advanced production systems, such as those used in industry and agriculture. This type of collaboration can contribute to increased productivity by reducing physical strain on humans, which can lead to fewer injuries and improved morale. One crucial aspect of HRC is the ability of the robot to follow a specific human operator safely. To address this challenge, a novel methodology is proposed that employs monocular vision and ultra-wideband (UWB) transceivers to determine the relative position of a human target with respect to the robot. UWB transceivers can track humans carrying a UWB transceiver but exhibit a significant angular error. To reduce this error, monocular cameras with Deep Learning object detection are used to detect humans. The reduction in angular error is achieved through sensor fusion, combining the outputs of both sensors with a histogram-based filter that projects and intersects the measurements from both sources onto a 2D grid. By combining UWB and monocular vision, a remarkable 66.67% reduction in angular error compared to UWB localization alone is achieved. The approach demonstrates an average processing time of 0.0183 s and an average localization error of 0.14 m when tracking a person walking at an average speed of 0.21 m/s. This novel algorithm holds promise for enabling efficient and safe human-robot collaboration, providing a valuable contribution to the field of robotics.
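A minimal sketch of the grid-intersection idea described in this abstract, assuming illustrative grid dimensions, noise standard deviations, and helper names that are not taken from the paper: a UWB range/bearing likelihood (accurate in range, wide in angle) is multiplied cell-wise with a bearing-only camera likelihood (narrow in angle), and the most likely cell gives the fused position.

```python
# Minimal sketch of a histogram-style fusion of a UWB range/bearing fix
# (accurate range, large angular error) with a monocular bearing-only
# detection (accurate angle, no range). Grid size, cell resolution and
# the noise standard deviations below are illustrative assumptions,
# not values taken from the paper.
import numpy as np

GRID_SIZE = 200      # cells per side
RESOLUTION = 0.05    # metres per cell -> a 10 m x 10 m window around the robot

def grid_coords():
    """Return the (x, y) metric coordinates of every cell centre, robot at origin."""
    half = GRID_SIZE * RESOLUTION / 2.0
    xs = np.linspace(-half, half, GRID_SIZE)
    ys = np.linspace(-half, half, GRID_SIZE)
    return np.meshgrid(xs, ys)

def range_bearing_likelihood(gx, gy, rng, bearing, sigma_r, sigma_b):
    """Gaussian likelihood of a range/bearing measurement over the grid."""
    cell_r = np.hypot(gx, gy)
    cell_b = np.arctan2(gy, gx)
    d_b = np.arctan2(np.sin(cell_b - bearing), np.cos(cell_b - bearing))  # wrap to [-pi, pi]
    like = np.exp(-0.5 * ((cell_r - rng) / sigma_r) ** 2) \
         * np.exp(-0.5 * (d_b / sigma_b) ** 2)
    return like / like.sum()

def bearing_likelihood(gx, gy, bearing, sigma_b):
    """Bearing-only likelihood (camera detection): informative in angle only."""
    cell_b = np.arctan2(gy, gx)
    d_b = np.arctan2(np.sin(cell_b - bearing), np.cos(cell_b - bearing))
    like = np.exp(-0.5 * (d_b / sigma_b) ** 2)
    return like / like.sum()

def fuse(uwb_range, uwb_bearing, cam_bearing):
    """Intersect both likelihoods on the grid and return the most likely (x, y)."""
    gx, gy = grid_coords()
    p_uwb = range_bearing_likelihood(gx, gy, uwb_range, uwb_bearing,
                                     sigma_r=0.10, sigma_b=np.radians(25.0))
    p_cam = bearing_likelihood(gx, gy, cam_bearing, sigma_b=np.radians(3.0))
    fused = p_uwb * p_cam                      # cell-wise intersection
    idx = np.unravel_index(np.argmax(fused), fused.shape)
    return gx[idx], gy[idx]

if __name__ == "__main__":
    # UWB reports the person 3 m away at ~30 deg (noisy angle); the camera
    # sees the person at 20 deg. The fused estimate keeps the UWB range
    # but snaps the angle towards the camera bearing.
    print(fuse(3.0, np.radians(30.0), np.radians(20.0)))
```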
2024
Authors
Berger, GS; Mendes, J; Chellal, AA; Bonzatto, L; da Silva, YMR; Zorawski, M; Pereira, AI; Pinto, MF; Castro, J; Valente, A; Lima, J;
Publication
OPTIMIZATION, LEARNING ALGORITHMS AND APPLICATIONS, PT I, OL2A 2023
Abstract
This paper presents an approach to address the challenges of manual inspection using multirotor Unmanned Aerial Vehicles (UAVs) to detect olive tree flies (Bactrocera oleae). The study employs computer vision techniques based on the You Only Look Once (YOLO) algorithm to detect insects trapped in yellow chromotropic traps. This research evaluates the performance of the YOLOv7 algorithm in detecting and quantifying olive tree flies using images obtained from two different digital cameras in a controlled environment at different distances and angles. The findings could contribute to the automation of insect pest inspection by UAV-based robotic systems and highlight potential avenues for future advances in this field. In the experiments conducted indoors, it was found that the Arducam IMX477 camera acquires images with greater clarity than the TelloCam, making it possible to correctly highlight the set of Bactrocera oleae in different prediction models. The results presented in this research demonstrate that, with the introduction of data augmentation and auto-labelling techniques on the set of Bactrocera oleae images, it was possible to arrive at a prediction model whose average detection was 256 Bactrocera oleae against a corresponding ground-truth value of 270 Bactrocera oleae.
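A minimal sketch of the counting step described in this abstract, assuming a hypothetical run_detector wrapper around a trained YOLOv7 checkpoint and an assumed confidence threshold; only the comparison of predicted versus annotated counts is illustrated, not the paper's training or evaluation pipeline.

```python
# Minimal sketch of the counting step: run a trained detector over trap
# images and compare the number of predicted Bactrocera oleae with the
# manually annotated count. run_detector is a hypothetical wrapper around
# a YOLOv7 checkpoint, and CONF_THRESHOLD is an assumed value, not one
# reported in the paper.
from pathlib import Path

CONF_THRESHOLD = 0.25  # assumed minimum confidence for a detection to be counted

def run_detector(image_path: Path):
    """Placeholder: return a list of (class_name, confidence, bbox) tuples
    produced by a YOLOv7 model for one trap image."""
    raise NotImplementedError("wire this to your YOLOv7 inference call")

def count_flies(image_dir: str) -> int:
    """Count detections of the target class over all trap images in a folder."""
    total = 0
    for image_path in sorted(Path(image_dir).glob("*.jpg")):
        detections = run_detector(image_path)
        total += sum(1 for cls, conf, _ in detections
                     if cls == "bactrocera_oleae" and conf >= CONF_THRESHOLD)
    return total

if __name__ == "__main__":
    # Raises until run_detector is wired to a real model.
    predicted = count_flies("trap_images/")
    ground_truth = 270  # manual count reported for the evaluated image set
    print(f"predicted {predicted} flies against {ground_truth} annotated")
```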
2024
Authors
Kalbermatter, RB; Franco, T; Pereira, AI; Valente, A; Soares, SP; Lima, J;
Publication
OPTIMIZATION, LEARNING ALGORITHMS AND APPLICATIONS, PT I, OL2A 2023
Abstract
People are living longer, creating new challenges in healthcare. Many older adults prefer to age in their own homes rather than in healthcare institutions. Portugal has seen a similar trend, and public and private home care solutions have been developed. However, age-related pathologies can affect an elderly person's ability to perform daily tasks independently. Ambient Assisted Living (AAL) is a domain that uses information and communication technologies to improve the quality of life of older adults. AI-based fall detection systems have been integrated into AAL studies, and posture estimation tools are important for monitoring patients. In this study, the OpenCV library and the YOLOv7 machine learning framework are used to develop a fall detection system based on posture analysis. To protect patient privacy, the use of a thermal camera is proposed to prevent facial recognition. The developed system was applied and validated in a real scenario.
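A minimal sketch of a posture-based fall check on a detected person box, assuming a hypothetical detect_person wrapper around a YOLOv7 model trained on thermal frames; the aspect-ratio rule and its threshold are a common heuristic used here as an assumption, not necessarily the paper's exact decision logic.

```python
# Minimal sketch of a posture-based fall check on a person bounding box
# detected in a thermal frame. detect_person is a hypothetical wrapper
# around a YOLOv7 model trained on thermal images; the aspect-ratio rule
# and its threshold are a common heuristic used here as an assumption,
# not the paper's exact decision logic.
from dataclasses import dataclass
from typing import Optional

FALL_ASPECT_RATIO = 1.2  # assumed: a box wider than tall suggests a lying posture

@dataclass
class BBox:
    x: float  # top-left x (pixels)
    y: float  # top-left y (pixels)
    w: float  # width (pixels)
    h: float  # height (pixels)

def detect_person(thermal_frame) -> Optional[BBox]:
    """Placeholder: return the person bounding box from the detector, or None."""
    raise NotImplementedError("wire this to your YOLOv7 inference call")

def is_fall(box: BBox) -> bool:
    """Flag a fall when the detected posture is predominantly horizontal."""
    return (box.w / box.h) > FALL_ASPECT_RATIO

def process_frame(thermal_frame) -> str:
    """Classify a single thermal frame as 'no person', 'ok' or 'FALL'."""
    box = detect_person(thermal_frame)
    if box is None:
        return "no person"
    return "FALL" if is_fall(box) else "ok"
```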
2024
Authors
Saraiva, AA; da Silva, JPO; Moura Sousa, JV; Fonseca Ferreira, NM; Soares, SP; Valente, A;
Publication
Proceedings of the 17th International Joint Conference on Biomedical Engineering Systems and Technologies, BIOSTEC 2024, Volume 1, Rome, Italy, February 21-23, 2024.
Abstract
2024
Authors
Neves, BP; Santos, VDN; Valente, A;
Publication
ELECTRONICS
Abstract
This article presents a new firmware update paradigm for optimising the procedure in microcontrollers. The aim is to allow updating during program execution, without interruptions or restarts, by replacing only specific code segments. The proposed method uses static and absolute addresses to locate and isolate the code segment to be updated. The work focuses on Microchip's PIC18F27K42 microcontroller and includes an example of updating functionality without affecting ongoing applications. This approach is well suited to bandwidth-limited channels, reducing the amount of data transmitted during the update process. It also allows incremental changes to the program code, preserving network capacity and reducing the costs associated with data transfer, especially in firmware update scenarios using cellular networks. This ability to update the device during normal operation, avoiding service interruptions and minimising downtime, is of remarkable value.
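A minimal host-side sketch of the data-reduction idea behind segment-level updates, assuming a raw binary image format, an illustrative segment size, and hypothetical file names; it compares the deployed image with a new build segment by segment and reports only the segments that would need to be transmitted. This is an illustration of the bandwidth saving, not the authors' actual update tool or protocol.

```python
# Minimal host-side sketch: compare the currently deployed firmware image
# with the new build, segment by segment, and transmit only the segments
# that changed together with their absolute addresses. The segment size,
# the raw-binary image format and the file names are assumptions for
# illustration; the paper targets Microchip's PIC18F27K42 and its own
# update mechanism.
SEGMENT_SIZE = 128  # assumed granularity, e.g. one flash write block

def read_image(path: str) -> bytes:
    with open(path, "rb") as f:
        return f.read()

def changed_segments(old: bytes, new: bytes, seg_size: int = SEGMENT_SIZE):
    """Yield (absolute_address, segment_bytes) for every segment that differs."""
    length = max(len(old), len(new))
    for addr in range(0, length, seg_size):
        old_seg = old[addr:addr + seg_size]
        new_seg = new[addr:addr + seg_size]
        if old_seg != new_seg:
            yield addr, new_seg

if __name__ == "__main__":
    old_fw = read_image("firmware_v1.bin")
    new_fw = read_image("firmware_v2.bin")
    patches = list(changed_segments(old_fw, new_fw))
    payload = sum(len(seg) for _, seg in patches)
    print(f"{len(patches)} segments to send, {payload} bytes "
          f"instead of {len(new_fw)} bytes for a full image")
```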
2024
Authors
Rebelo, PM; Valente, A; Oliveira, PM; Sobreira, H; Costa, P;
Publication
ROBOT 2023: SIXTH IBERIAN ROBOTICS CONFERENCE ADVANCES IN ROBOTICS, VOL 1
Abstract
Mobile robot platforms capable of operating safely and accurately in dynamic environments can have a multitude of applications, ranging from simple delivery tasks to advanced assembly operations. These abilities rely heavily on a robust navigation stack, which requires stable and accurate pose estimation within the environment. The wide range of applications for autonomous mobile robots (AMRs) and the characteristics of multiple industrial environments (indoor and outdoor) have led to the development of a flexible and robust robot software architecture that allows data from different sensors to be fused in real time. In this way, AMRs achieve greater localization precision in uncontrolled and unstructured environments. These complex environments feature a variety of dynamic and unpredictable elements, such as variable layouts, limited visibility, unstructured spaces, and uncertain terrain. This paper presents a multi-localization system for industrial mobile robots in complex and dynamic industrial scenarios, based on different localization technologies and methods that can interact together and simultaneously.
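A minimal sketch of one way pose estimates from several localization sources could be combined, assuming an inverse-variance weighted average over simple (x, y, yaw) estimates with illustrative source names and variances; the paper's multi-localization architecture is more elaborate and is not reproduced here.

```python
# Minimal sketch of combining 2D pose estimates (x, y, yaw) coming from
# several localization sources with an inverse-variance weighted average.
# The source names, variances and this simple fusion rule are assumptions
# used for illustration; the paper's system runs its localization methods
# concurrently and interactively rather than through this single formula.
import math
from dataclasses import dataclass

@dataclass
class PoseEstimate:
    x: float         # metres
    y: float         # metres
    yaw: float       # radians
    variance: float  # scalar uncertainty of this source (assumed isotropic)

def fuse_poses(estimates: list[PoseEstimate]) -> PoseEstimate:
    """Inverse-variance weighted fusion of several pose estimates."""
    weights = [1.0 / e.variance for e in estimates]
    total = sum(weights)
    x = sum(w * e.x for w, e in zip(weights, estimates)) / total
    y = sum(w * e.y for w, e in zip(weights, estimates)) / total
    # Average the yaw on the unit circle so angles near +/-pi do not cancel out.
    sin_yaw = sum(w * math.sin(e.yaw) for w, e in zip(weights, estimates)) / total
    cos_yaw = sum(w * math.cos(e.yaw) for w, e in zip(weights, estimates)) / total
    return PoseEstimate(x, y, math.atan2(sin_yaw, cos_yaw), 1.0 / total)

if __name__ == "__main__":
    lidar_map_match = PoseEstimate(2.03, 1.10, 0.78, variance=0.02)  # indoor map matching
    gnss_fix        = PoseEstimate(2.10, 1.05, 0.80, variance=0.05)  # outdoor positioning
    wheel_odometry  = PoseEstimate(1.95, 1.20, 0.75, variance=0.20)  # drift-prone
    print(fuse_poses([lidar_map_match, gnss_fix, wheel_odometry]))
```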