Publications

Publications by CRAS

2024

TEFu-Net: A time-aware late fusion architecture for robust multi-modal ego-motion estimation

Authors
Agostinho, L; Pereira, D; Hiolle, A; Pinto, A;

Publication
ROBOTICS AND AUTONOMOUS SYSTEMS

Abstract
Ego-motion estimation plays a critical role in autonomous driving systems by providing accurate and timely information about the vehicle's position and orientation. To achieve high levels of accuracy and robustness, it is essential to leverage a range of sensor modalities to account for highly dynamic and diverse scenes, and consequent sensor limitations. In this work, we introduce TEFu-Net, a Deep-Learning-based late fusion architecture that combines multiple ego-motion estimates from diverse data modalities, including stereo RGB, LiDAR point clouds and GNSS/IMU measurements. Our approach is non-parametric and scalable, making it adaptable to different sensor set configurations. By leveraging a Long Short-Term Memory (LSTM), TEFu-Net produces reliable and robust spatiotemporal ego-motion estimates. This capability allows it to filter out erroneous input measurements, ensuring the accuracy of the car's motion calculations over time. Extensive experiments show an average accuracy increase of 63% over TEFu-Net's input estimators and on-par results with the state-of-the-art in real-world driving scenarios. We also demonstrate that our solution can achieve accurate estimates under sensor or input failure. Therefore, TEFu-Net enhances the accuracy and robustness of ego-motion estimation in real-world driving scenarios, particularly in challenging conditions such as cluttered environments, tunnels, dense vegetation, and unstructured scenes. As a result of these enhancements, it bolsters the reliability of autonomous driving functions.
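For illustration, the following is a minimal sketch of the kind of LSTM-based late-fusion module the abstract describes; the layer sizes, the three per-modality inputs stacked per time step, and the 6-DoF output are assumptions made for the example, not the authors' implementation.

```python
# Hypothetical sketch of an LSTM-based late-fusion module for ego-motion,
# loosely following the abstract's description (not the authors' code).
import torch
import torch.nn as nn

class LateFusionLSTM(nn.Module):
    def __init__(self, n_modalities=3, pose_dim=6, hidden=128):
        super().__init__()
        # Each modality contributes one 6-DoF ego-motion estimate per time step.
        self.lstm = nn.LSTM(n_modalities * pose_dim, hidden, batch_first=True)
        self.head = nn.Linear(hidden, pose_dim)  # fused 6-DoF estimate

    def forward(self, estimates):
        # estimates: (batch, time, n_modalities * pose_dim)
        features, _ = self.lstm(estimates)
        return self.head(features)  # (batch, time, pose_dim)

# Example: fuse stereo-visual, LiDAR and GNSS/IMU odometry over 10 time steps.
fusion = LateFusionLSTM()
x = torch.randn(1, 10, 3 * 6)   # stacked per-modality estimates
fused_poses = fusion(x)         # (1, 10, 6)
```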

2024

UAV Visual and Thermographic Power Line Detection Using Deep Learning

Authors
Santos, T; Cunha, T; Dias, A; Moreira, AP; Almeida, J;

Publication
SENSORS

Abstract
Inspecting and maintaining power lines is essential for ensuring the safety, reliability, and efficiency of electrical infrastructure. This process involves regular assessment to identify hazards such as damaged wires, corrosion, or vegetation encroachment, followed by timely maintenance to prevent accidents and power outages. By conducting routine inspections and maintenance, utilities can comply with regulations, enhance operational efficiency, and extend the lifespan of power lines and equipment. Unmanned Aerial Vehicles (UAVs) can play a relevant role in this process by increasing efficiency through rapid coverage of large areas and access to difficult-to-reach locations, enhanced safety by minimizing risks to personnel in hazardous environments, and cost-effectiveness compared to traditional methods. UAVs equipped with sensors such as visual and thermographic cameras enable the accurate collection of high-resolution data, facilitating early detection of defects and other potential issues. To ensure the safety of the autonomous inspection process, UAVs must be capable of performing onboard processing, particularly for detection of power lines and obstacles. In this paper, we address the development of a deep learning approach with YOLOv8 for power line detection based on visual and thermographic images. The developed solution was validated with a UAV during a power line inspection mission, obtaining mAP@0.5 results of over 90.5% on visible images and over 96.9% on thermographic images.
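For readers unfamiliar with the tooling, the sketch below illustrates how a YOLOv8 detector is typically fine-tuned and evaluated with the Ultralytics API; the dataset YAML, weights, and hyperparameters are placeholders, not the configuration used in the paper.

```python
# Illustrative use of the Ultralytics YOLOv8 API for power line detection;
# dataset paths and hyperparameters are placeholders, not the authors' setup.
from ultralytics import YOLO

model = YOLO("yolov8n.pt")  # pretrained YOLOv8 nano weights

# Fine-tune on a hypothetical dataset of visible / thermographic frames
# described by a standard Ultralytics data YAML (images + bounding boxes).
model.train(data="powerlines.yaml", epochs=100, imgsz=640)

# Evaluate (reports mAP@0.5 among other metrics) and run inference.
metrics = model.val()
results = model.predict(source="inspection_frames/", conf=0.25)
```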

2024

LiDAR-Based Unmanned Aerial Vehicle Offshore Wind Blade Inspection and Modeling

Authors
Oliveira, A; Dias, A; Santos, T; Rodrigues, P; Martins, A; Almeida, J;

Publication
Drones

Abstract
The deployment of offshore wind turbines (WTs) has emerged as a pivotal strategy in the transition to renewable energy, offering significant potential for clean electricity generation. However, these structures' operation and maintenance (O&M) present unique challenges due to their remote locations and harsh marine environments. For these reasons, it is fundamental to promote the development of autonomous solutions to monitor the health condition of the structural components, preventing structural damage and accidents. This paper explores the application of Unmanned Aerial Vehicles (UAVs) in the inspection and maintenance of offshore wind turbines, introducing a new strategy for autonomous wind turbine inspection and a simulation environment for testing and training autonomous inspection techniques under a more realistic offshore scenario. Instead of relying on visual information to detect the WT parts during the inspection, this work proposes a three-dimensional (3D) light detection and ranging (LiDAR) method that estimates the wind turbine pose (position, orientation, and blade configuration) and autonomously controls the UAV for a close inspection maneuver. The first tests were carried out mainly in a simulation framework, combining different WT poses (orientations, blade positions, and wind turbine movements), followed by a mixed-reality test in which a real vehicle performed a full inspection of a virtual wind turbine.
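As a toy illustration of LiDAR-based geometric estimation, the sketch below fits a wind turbine tower axis to a 3D point cloud with PCA; it is a deliberate simplification of the kind of pose estimation described in the abstract, not the authors' method.

```python
# Toy illustration of LiDAR-based pose estimation: fit the wind-turbine tower
# axis from a 3D point cloud with PCA. A simplification for illustration only.
import numpy as np

def estimate_tower_axis(points: np.ndarray):
    """points: (N, 3) LiDAR returns belonging to the tower."""
    centroid = points.mean(axis=0)
    # The principal direction of the centered cloud approximates the tower axis.
    _, _, vt = np.linalg.svd(points - centroid, full_matrices=False)
    axis = vt[0]
    # Orient the axis upwards (positive z) for a consistent convention.
    if axis[2] < 0:
        axis = -axis
    return centroid, axis

# Synthetic vertical tower with small lateral noise.
z = np.linspace(0.0, 80.0, 500)
cloud = np.stack([0.1 * np.random.randn(500),
                  0.1 * np.random.randn(500), z], axis=1)
base, direction = estimate_tower_axis(cloud)
```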

2024

Robotic Data Recovery from Seabed with Optical High-Bandwidth Communication from a Deep-Sea Lander

Authors
Almeida, J; Soares, E; Almeida, C; Matias, B; Pereira, R; Sytnyk, D; Silva, P; Ferreira, A; Machado, D; Martins, P; Martins, A;

Publication
Oceans Conference Record (IEEE)

Abstract
This paper addresses the problem of high-bandwidth communication and data recovery from deep-sea semi-permanent robotic landers. These vehicles are suitable for long-term monitoring of underwater activities and for supporting the operation of other robotic assets in Operation & Maintenance (O&M) of offshore renewables. Limitations of current underwater communication solutions prevent the immediate transmission of the collected data to the surface, so the data are instead stored locally inside each lander. Therefore, data recovery often implies the interruption of the designated tasks so that the vehicle can return to the surface and transmit the collected data. Resorting to a short-range, high-bandwidth optical link, an alternative underwater strategy for flexible data exchange is presented. It involves the use of a satellite AUV that approaches each underwater node until an optical communication channel is established. At this point, high-bandwidth communication with the remote lander becomes available, offering the possibility to perform a variety of operations, including the download of previously recorded information, the visualisation of video streams from the lander's on-board cameras, or even remote motion control of the lander. All three operations were tested and validated with the experimental setup reported here. The experiments were performed in the Atlantic Ocean, at the Setúbal underwater canyon, reaching an operating depth of 350 m. Two autonomous robotic platforms were used in the experiments, namely the TURTLE3 lander and the EVA Hybrid Autonomous Underwater Vehicle. Since EVA kept a tethered fibre-optic connection to the Mar Profundo support vessel, it was possible to establish a full communication chain between a land-based control centre and the remote underwater nodes.

2024

Man-Machine Symbiosis UAV Integration for Military Search and Rescue Operations

Authors
Minhoto, V; Santos, T; Silva, LTE; Rodrigues, P; Arrais, A; Amaral, A; Dias, A; Almeida, J; Cunha, JPS;

Publication
ROBOT 2023: SIXTH IBERIAN ROBOTICS CONFERENCE, VOL 2

Abstract
Over the last few years, man-machine collaborative systems have become increasingly present in daily routines. In these systems, one operator usually controls the machine through explicit commands and assesses the information through a graphical user interface; direct and implicit interaction between the machine and the user does not exist. This work presents a man-machine symbiotic concept and system in which such implicit interaction is possible, targeting search and rescue scenarios. Based on measuring physiological variables (e.g. body movement or electrocardiogram) through wearable devices, the system is capable of computing the psycho-physiological state of the human and autonomously identifying abnormal situations (e.g. a fall or stress). This information is injected into the control loop of the machine, which can alter its behavior accordingly, enabling an implicit man-machine communication mechanism. A proof of concept of this system was tested at the ARTEX (ARmy Technological EXperimentation) exercise organized by the Portuguese Army, involving a military agent and a drone. During this event, the soldier was equipped with a kit of wearables that could monitor several physiological variables and automatically detect a fall during a mission. This information was continuously sent to the drone, which successfully identified this abnormal situation, triggering take-off and a situation-awareness fly-by flight pattern, and delivering a first-aid kit to the soldier in case he did not recover after a pre-determined time period. The results were very positive, proving the possibility and feasibility of a symbiotic system between humans and machines.
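The sketch below gives a hypothetical flavour of the implicit man-machine loop described above: a simple accelerometer-based fall detector whose output could trigger a drone behaviour. The thresholds and the drone command are invented for the example and do not correspond to the system reported in the paper.

```python
# Hypothetical sketch of an implicit man-machine trigger: a threshold-based
# fall detector on wearable accelerometer data. Values are placeholders.
import math
from collections import deque

IMPACT_G = 2.5      # acceleration spike suggesting impact (deviation from 1 g)
STILL_G = 0.3       # near-zero net motion after the spike (deviation from 1 g)
STILL_SAMPLES = 50  # consecutive low-motion samples to confirm a fall

def detect_fall(samples):
    """samples: iterable of (ax, ay, az) in g; returns True if a fall pattern is seen."""
    window = deque(maxlen=STILL_SAMPLES)
    impact_seen = False
    for ax, ay, az in samples:
        magnitude = abs(math.sqrt(ax * ax + ay * ay + az * az) - 1.0)
        if magnitude > IMPACT_G:
            impact_seen, window = True, deque(maxlen=STILL_SAMPLES)
        elif impact_seen:
            window.append(magnitude < STILL_G)
            if len(window) == STILL_SAMPLES and all(window):
                return True
    return False

# if detect_fall(wearable_stream):
#     drone.trigger_flyby_and_drop_first_aid_kit()  # placeholder drone command
```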

2024

The SAIL dataset of marine atmospheric electric field observations over the Atlantic Ocean

Authors
Barbosa, S; Dias, N; Almeida, C; Amaral, G; Ferreira, A; Camilo, A; Silva, E;

Publication

Abstract
A unique dataset of marine atmospheric electric field observations over the Atlantic Ocean is described. The data are relevant not only for atmospheric electricity studies, but more generally for studies of the Earth's atmosphere and climate variability, as well as for space-Earth interaction studies. In addition to the atmospheric electric field data, the dataset includes simultaneous measurements of other atmospheric variables, including gamma radiation, visibility, and solar radiation. These ancillary observations not only support interpretation and understanding of the atmospheric electric field data, but are also of interest in themselves. The entire framework, from data collection to the final derived datasets, has been documented to ensure traceability and reproducibility of the whole data curation chain. All the data, from raw measurements to final datasets, are preserved in data repositories with corresponding assigned DOIs. Final datasets are available from the Figshare repository (https://figshare.com/projects/SAIL_Data/178500) and computational notebooks containing the code used at every step of the data curation chain are available from the Zenodo repository (https://zenodo.org/communities/sail).
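As a convenience for readers, the sketch below lists the public items of the SAIL Figshare project programmatically; the endpoint layout assumes the standard Figshare v2 REST API, and only the project URL comes from the abstract.

```python
# Hedged sketch: listing the public items of the SAIL Figshare project via the
# Figshare v2 REST API (endpoint layout is an assumption, not from the paper).
import json
import urllib.request

PROJECT_ID = 178500  # from https://figshare.com/projects/SAIL_Data/178500
url = f"https://api.figshare.com/v2/projects/{PROJECT_ID}/articles"

with urllib.request.urlopen(url) as response:
    articles = json.load(response)

for article in articles:
    # Each entry carries the dataset title and its DOI, among other metadata.
    print(article.get("title"), article.get("doi"))
```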
