Publications

Publications by CRAS

2024

UAV Shore-to-Ship Parcel Delivery: Gust-Aware Trajectory Planning

Authors
Pensado, E; López, F; Jorge, H; Pinto, A;

Publication
IEEE TRANSACTIONS ON AEROSPACE AND ELECTRONIC SYSTEMS

Abstract
This article presents a real-time trajectory optimizer for shore-to-ship operations using Unmanned Aerial Vehicles (UAVs). This concept aims to improve the efficiency of the transportation system by using UAVs to carry out parcel deliveries to offshore ships. During these operations, UAVs would fly relatively close to manned vessels, posing significant risks to the crew in the event of an incident. Additionally, in these areas, UAVs are exposed to meteorological phenomena such as wind gusts, which may compromise flight stability and lead to collisions. Gusts are also difficult to predict, a risk that must be accounted for during operations. For these reasons, this work proposes a gust-aware multi-objective optimization solution for calculating fast and safe trajectories, considering the risk of flying in areas prone to the formation of intense gusts. Moreover, the system establishes a risk buffer around all vessels to ensure compliance with EASA (European Union Aviation Safety Agency) regulations. For this purpose, Automatic Identification System (AIS) data are used to determine the position and velocity of the different vessels, and trajectory calculations are periodically updated based on their motion. The system computes the minimum-cost trajectory between the ground base and a moving destination ship while respecting these risk-buffer constraints. The problem was solved through an Optimal Control formulation discretized on a dynamic graph with time-dependent costs and constraints. The solution was obtained using a Reaching Method that enables efficient, real-time computation.
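At its core, the planner searches a dynamic graph whose edge costs vary with the departure time (gust risk, moving-vessel buffers). The sketch below is a minimal time-dependent shortest-path search in Python that illustrates this idea only; the graph layout, the `cost_fn` interface, and the omission of the actual Reaching Method and of the AIS-driven re-planning are simplifying assumptions, not the authors' implementation.

```python
import heapq

def plan_on_dynamic_graph(edges, cost_fn, source, goal, t0=0.0):
    """Shortest-path search on a graph whose edge costs depend on the
    departure time, a simplified stand-in for the paper's discretized
    Optimal Control problem.

    edges[u]         -> list of (v, travel_time) pairs
    cost_fn(u, v, t) -> gust-risk-weighted cost of leaving u towards v at time t
    """
    best = {source: 0.0}
    parent = {source: None}
    frontier = [(0.0, t0, source)]           # (accumulated cost, arrival time, node)
    while frontier:
        cost, t, u = heapq.heappop(frontier)
        if u == goal:
            break
        if cost > best.get(u, float("inf")):
            continue                         # stale queue entry
        for v, travel_time in edges.get(u, []):
            new_cost = cost + cost_fn(u, v, t)
            if new_cost < best.get(v, float("inf")):
                best[v] = new_cost
                parent[v] = u
                heapq.heappush(frontier, (new_cost, t + travel_time, v))
    path, node = [], goal
    while node is not None:                  # walk back from goal to source
        path.append(node)
        node = parent.get(node)
    return path[::-1], best.get(goal, float("inf"))
```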

2024

Wave-motion compensation for USV–UAV cooperation: A model predictive controller approach

Authors
Martins, J; Pereira, P; Campilho, R; Pinto, A;

Publication
MESA 2024 - 20th International Conference on Mechatronic, Embedded Systems and Applications, Proceedings

Abstract
Due to the difficult access to the maritime environment, cooperation between robotic platforms operating in different domains provides numerous advantages for Operations and Maintenance (O&M) missions. The nest Uncrewed Surface Vehicle (USV) is equipped with a parallel platform that serves as a landing pad for Uncrewed Aerial Vehicle (UAV) landings in dynamic sea states. This work proposes a methodology for short-term forecasting of wave behaviour, using Fast Fourier Transforms (FFT) and a low-pass Butterworth filter to remove noise from the Inertial Measurement Unit (IMU) readings and an Auto-Regressive (AR) model for the forecast, showing good results within an almost 10-second window. These predictions are then used in a Model Predictive Control (MPC) approach to optimize the roll and pitch trajectory of the landing pad, increasing its horizontality and consistently mitigating around 80% of the wave-induced motion. ©2024 IEEE.
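As a rough illustration of the forecasting pipeline described above, the sketch below low-pass filters an IMU roll/pitch series with a Butterworth filter, fits an AR model by least squares, and extrapolates about 10 s ahead. The filter order, cutoff frequency, and AR order are illustrative assumptions rather than the values used in the paper; the FFT analysis and the MPC stage are omitted.

```python
import numpy as np
from scipy.signal import butter, filtfilt

def forecast_wave_motion(imu_angle, fs, order=20, horizon_s=10.0, cutoff_hz=1.0):
    """Low-pass filter an IMU roll/pitch series, fit an AR(order) model by
    least squares, and extrapolate ~10 s ahead. Parameter values are
    illustrative assumptions, not those reported in the paper."""
    # 4th-order Butterworth low-pass to suppress IMU noise (zero-phase)
    b, a = butter(4, cutoff_hz / (fs / 2), btype="low")
    x = filtfilt(b, a, np.asarray(imu_angle, dtype=float))

    # Fit AR coefficients: x[t] ~ sum_k c[k] * x[t-k-1]
    X = np.column_stack([x[order - k - 1:len(x) - k - 1] for k in range(order)])
    y = x[order:]
    coeffs, *_ = np.linalg.lstsq(X, y, rcond=None)

    # Recursive multi-step forecast over the requested horizon
    history = list(x[-order:])
    preds = []
    for _ in range(int(horizon_s * fs)):
        nxt = np.dot(coeffs, history[::-1][:order])
        preds.append(nxt)
        history.append(nxt)
    return np.array(preds)
```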

2024

A Multimodal Learning-based Approach for Autonomous Landing of UAV

Authors
Neves, S; Branco, M; Pereira, I; Claro, M; Pinto, M;

Publication
MESA 2024 - 20th International Conference on Mechatronic, Embedded Systems and Applications, Proceedings

Abstract
In the field of autonomous Unmanned Aerial Vehicle (UAV) landing, conventional approaches fall short in delivering both the required precision and the resilience against environmental disturbances. Learning-based algorithms can offer promising solutions by leveraging their ability to learn intelligent behaviour from data. On one hand, this paper introduces a novel multimodal transformer-based Deep Learning detector that provides reliable positioning for precise autonomous landing. It surpasses standard approaches by addressing individual sensor limitations, achieving high reliability even under diverse weather and sensor failure conditions. It was rigorously validated across varying environments, achieving optimal true positive rates and average precisions of up to 90%. On the other hand, a Reinforcement Learning (RL) decision-making model based on a Deep Q-Network (DQN) is proposed. Initially trained in simulation, its adaptive behaviour is successfully transferred and validated in a real outdoor scenario. Furthermore, this approach demonstrates rapid inference times of approximately 5 ms, validating its applicability on edge devices. ©2024 IEEE.
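For the decision-making stage, the snippet below sketches a minimal Deep Q-Network with an epsilon-greedy policy in PyTorch. The state and action definitions (relative offset to the detected marker plus UAV velocity; discrete setpoint corrections) and the layer sizes are illustrative assumptions, not the architecture reported in the paper.

```python
import torch
import torch.nn as nn

class LandingDQN(nn.Module):
    """Minimal DQN sketch for landing decision-making. The 6-dimensional
    state and the 6 discrete actions (hold, descend, move N/S/E/W) are
    hypothetical choices made for illustration."""
    def __init__(self, state_dim: int = 6, n_actions: int = 6):
        super().__init__()
        self.net = nn.Sequential(
            nn.Linear(state_dim, 64), nn.ReLU(),
            nn.Linear(64, 64), nn.ReLU(),
            nn.Linear(64, n_actions),      # one Q-value per discrete action
        )

    def forward(self, state: torch.Tensor) -> torch.Tensor:
        return self.net(state)

def select_action(q_net: LandingDQN, state: torch.Tensor, epsilon: float = 0.05) -> int:
    """Epsilon-greedy action selection over the Q-values."""
    n_actions = q_net.net[-1].out_features
    if torch.rand(1).item() < epsilon:
        return int(torch.randint(0, n_actions, (1,)).item())
    with torch.no_grad():
        return int(q_net(state).argmax().item())
```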

2024

A Multimodal Perception System for Precise Landing of UAVs in Offshore Environments

Authors
Claro, RM; Neves, FSP; Pinto, AMG;

Publication

Abstract
The integration of precise landing capabilities into UAVs is crucial for enabling autonomous operations, particularly in challenging environments such as offshore scenarios. This work proposes a heterogeneous perception system that incorporates a multimodal fiducial marker, designed to improve the accuracy and robustness of autonomous UAV landing in both daytime and nighttime operations. It presents ViTAL-TAPE, a visual transformer-based model that enhances the detection reliability of the landing target and overcomes changes in illumination conditions and viewpoint positions where traditional methods fail. ViTAL-TAPE is an end-to-end model that combines multimodal perceptual information, including photometric and radiometric data, to detect landing targets defined by a fiducial marker with 6 degrees of freedom. Extensive experiments have proved the ability of ViTAL-TAPE to detect fiducial markers with an error of 0.01 m. Moreover, experiments with the RAVEN UAV, designed to endure the challenging weather conditions of offshore scenarios, demonstrated that the autonomous landing technology proposed in this work achieved an accuracy of up to 0.1 m. This research also presents the first successful autonomous operation of a UAV in a commercial offshore wind farm with floating foundations installed in the Atlantic Ocean. These experiments showcased the system's accuracy, resilience, and robustness, resulting in a precise landing technology that extends the mission capabilities of UAVs and enables autonomous and Beyond Visual Line of Sight offshore operations.
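The learned ViTAL-TAPE detector itself cannot be reproduced from the abstract, but the geometric step that typically follows marker detection, recovering the 6-DoF pose of a fiducial marker from its detected corners, can be sketched with OpenCV's PnP solver as below. The corner ordering, marker size, and camera calibration inputs are assumptions made for illustration only.

```python
import cv2
import numpy as np

def marker_pose_6dof(corners_px, marker_size_m, camera_matrix, dist_coeffs):
    """Generic 6-DoF pose recovery from four detected marker corners via PnP.
    corners_px: (4, 2) pixel coordinates, assumed ordered TL, TR, BR, BL."""
    half = marker_size_m / 2.0
    # Marker corners in its own frame (z = 0 plane), same TL, TR, BR, BL order
    object_pts = np.array([[-half,  half, 0.0],
                           [ half,  half, 0.0],
                           [ half, -half, 0.0],
                           [-half, -half, 0.0]], dtype=np.float64)
    ok, rvec, tvec = cv2.solvePnP(object_pts,
                                  np.asarray(corners_px, dtype=np.float64),
                                  camera_matrix, dist_coeffs)
    if not ok:
        return None
    rot, _ = cv2.Rodrigues(rvec)       # 3x3 rotation matrix
    return rot, tvec                   # marker pose in the camera frame
```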

2024

TEFu-Net: A time-aware late fusion architecture for robust multi-modal ego-motion estimation

Authors
Agostinho, L; Pereira, D; Hiolle, A; Pinto, A;

Publication
ROBOTICS AND AUTONOMOUS SYSTEMS

Abstract
Ego-motion estimation plays a critical role in autonomous driving systems by providing accurate and timely information about the vehicle's position and orientation. To achieve high levels of accuracy and robustness, it is essential to leverage a range of sensor modalities to account for highly dynamic and diverse scenes, and consequent sensor limitations. In this work, we introduce TEFu-Net, a Deep-Learning-based late fusion architecture that combines multiple ego-motion estimates from diverse data modalities, including stereo RGB, LiDAR point clouds and GNSS/IMU measurements. Our approach is non-parametric and scalable, making it adaptable to different sensor set configurations. By leveraging a Long Short-Term Memory (LSTM), TEFu-Net produces reliable and robust spatiotemporal ego-motion estimates. This capability allows it to filter out erroneous input measurements, ensuring the accuracy of the car's motion calculations over time. Extensive experiments show an average accuracy increase of 63% over TEFu-Net's input estimators and on-par results with the state of the art in real-world driving scenarios. We also demonstrate that our solution can achieve accurate estimates under sensor or input failure. Therefore, TEFu-Net enhances the accuracy and robustness of ego-motion estimation in real-world driving scenarios, particularly in challenging conditions such as cluttered environments, tunnels, dense vegetation, and unstructured scenes. As a result of these enhancements, it bolsters the reliability of autonomous driving functions.
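The sketch below illustrates the general idea of an LSTM-based late-fusion head: per-modality ego-motion estimates are concatenated per time step and fused over a short temporal window. The three-modality setup, the 6-DoF pose increments, and the layer sizes are illustrative assumptions, not the published TEFu-Net architecture.

```python
import torch
import torch.nn as nn

class LateFusionLSTM(nn.Module):
    """Late-fusion sketch: concatenated ego-motion estimates from several
    modalities (e.g. stereo, LiDAR, GNSS/IMU), each a 6-DoF increment,
    fused over a temporal window by an LSTM."""
    def __init__(self, n_modalities: int = 3, pose_dim: int = 6, hidden: int = 128):
        super().__init__()
        self.lstm = nn.LSTM(input_size=n_modalities * pose_dim,
                            hidden_size=hidden, num_layers=2, batch_first=True)
        self.head = nn.Linear(hidden, pose_dim)   # fused 6-DoF increment

    def forward(self, estimates: torch.Tensor) -> torch.Tensor:
        # estimates: (batch, time, n_modalities * pose_dim)
        out, _ = self.lstm(estimates)
        return self.head(out[:, -1])              # fuse from the latest hidden state

# Example: fuse a 10-step window of three 6-DoF estimates
fused = LateFusionLSTM()(torch.randn(1, 10, 18))
```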

2024

UAV Visual and Thermographic Power Line Detection Using Deep Learning

Authors
Santos, T; Cunha, T; Dias, A; Moreira, AP; Almeida, J;

Publication
SENSORS

Abstract
Inspecting and maintaining power lines is essential for ensuring the safety, reliability, and efficiency of electrical infrastructure. This process involves regular assessments to identify hazards such as damaged wires, corrosion, or vegetation encroachment, followed by timely maintenance to prevent accidents and power outages. By conducting routine inspections and maintenance, utilities can comply with regulations, enhance operational efficiency, and extend the lifespan of power lines and equipment. Unmanned Aerial Vehicles (UAVs) can play a relevant role in this process by increasing efficiency through rapid coverage of large areas and access to difficult-to-reach locations, enhancing safety by minimizing risks to personnel in hazardous environments, and reducing costs compared to traditional methods. UAVs equipped with sensors such as visual and thermographic cameras enable the accurate collection of high-resolution data, facilitating early detection of defects and other potential issues. To ensure the safety of the autonomous inspection process, UAVs must be capable of performing onboard processing, particularly for the detection of power lines and obstacles. In this paper, we address the development of a deep learning approach with YOLOv8 for power line detection based on visual and thermographic images. The developed solution was validated with a UAV during a power line inspection mission, obtaining mAP@0.5 results of over 90.5% on visible images and over 96.9% on thermographic images.
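Running a trained YOLOv8 detector on an aerial frame with the ultralytics API is shown in the sketch below, covering the inference step only. The weight file and image path are hypothetical placeholders; the fine-tuning on the paper's visible and thermographic datasets is not reproduced here.

```python
from ultralytics import YOLO

# Hypothetical weights fine-tuned for power line detection on visible imagery;
# a second model would be loaded analogously for thermographic frames.
model = YOLO("powerline_visible.pt")

# Run inference on a single aerial frame (path is illustrative) and keep
# detections above a 0.5 confidence threshold.
results = model.predict("inspection_frame.jpg", conf=0.5)

for r in results:
    for box in r.boxes:
        cls_name = model.names[int(box.cls)]
        x1, y1, x2, y2 = box.xyxy[0].tolist()   # bounding box in pixel coordinates
        print(f"{cls_name}: conf={float(box.conf):.2f}, "
              f"box=({x1:.0f},{y1:.0f},{x2:.0f},{y2:.0f})")
```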
