
Publications by José Miguel Almeida

2024

UAV Visual and Thermographic Power Line Detection Using Deep Learning

Authors
Santos, T; Cunha, T; Dias, A; Moreira, AP; Almeida, J;

Publication
SENSORS

Abstract
Inspecting and maintaining power lines is essential for ensuring the safety, reliability, and efficiency of electrical infrastructure. This process involves regular assessment to identify hazards such as damaged wires, corrosion, or vegetation encroachment, followed by timely maintenance to prevent accidents and power outages. By conducting routine inspections and maintenance, utilities can comply with regulations, enhance operational efficiency, and extend the lifespan of power lines and equipment. Unmanned Aerial Vehicles (UAVs) can play a relevant role in this process by increasing efficiency through rapid coverage of large areas and access to difficult-to-reach locations, enhanced safety by minimizing risks to personnel in hazardous environments, and cost-effectiveness compared to traditional methods. UAVs equipped with sensors such as visual and thermographic cameras enable the accurate collection of high-resolution data, facilitating early detection of defects and other potential issues. To ensure the safety of the autonomous inspection process, UAVs must be capable of performing onboard processing, particularly for detection of power lines and obstacles. In this paper, we address the development of a deep learning approach with YOLOv8 for power line detection based on visual and thermographic images. The developed solution was validated with a UAV during a power line inspection mission, obtaining mAP@0.5 results of over 90.5% on visible images and over 96.9% on thermographic images.
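The paper reports a YOLOv8-based detector trained separately on visible and thermographic imagery. The following is an illustrative sketch only, not the authors' code: it shows the usual fine-tune/validate/infer cycle with the ultralytics package, where the dataset file, weight name, and image path are hypothetical placeholders.

```python
# Illustrative sketch (not the paper's implementation) of fine-tuning YOLOv8
# for power line detection with the ultralytics package.
from ultralytics import YOLO

# Start from a pretrained YOLOv8 model and fine-tune on a custom power line
# dataset described by a standard YOLO data.yaml file (hypothetical path).
model = YOLO("yolov8n.pt")
model.train(data="powerlines.yaml", epochs=100, imgsz=640)

# Validation: metrics.box.map50 corresponds to the mAP@0.5 figures quoted above.
metrics = model.val()
print(f"mAP@0.5: {metrics.box.map50:.3f}")

# Inference on a single visible or thermographic frame, e.g. onboard the UAV.
results = model("frame_thermal.png")
for r in results:
    for box, conf in zip(r.boxes.xyxy, r.boxes.conf):
        print(box.tolist(), float(conf))
```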

2024

LiDAR-Based Unmanned Aerial Vehicle Offshore Wind Blade Inspection and Modeling

Authors
Oliveira, A; Dias, A; Santos, T; Rodrigues, P; Martins, A; Almeida, J;

Publication
Drones

Abstract
The deployment of offshore wind turbines (WTs) has emerged as a pivotal strategy in the transition to renewable energy, offering significant potential for clean electricity generation. However, these structures’ operation and maintenance (O&M) present unique challenges due to their remote locations and harsh marine environments. For these reasons, it is fundamental to promote the development of autonomous solutions to monitor the health condition of the construction parts, preventing structural damage and accidents. This paper explores the application of Unmanned Aerial Vehicles (UAVs) in the inspection and maintenance of offshore wind turbines, introducing a new strategy for autonomous wind turbine inspection and a simulation environment for testing and training autonomous inspection techniques under a more realistic offshore scenario. Instead of relying on visual information to detect the WT parts during the inspection, this method proposes a three-dimensional (3D) light detection and ranging (LiDAR) method that estimates the wind turbine pose (position, orientation, and blade configuration) and autonomously controls the UAV for a close inspection maneuver. Initial tests were carried out mainly in a simulation framework, combining different WT poses (orientations, blade positions, and wind turbine movements), followed by a mixed-reality test in which a real vehicle performed a full inspection of a virtual wind turbine.
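As a rough illustration of the idea of recovering a turbine pose from LiDAR returns (not the paper's algorithm, which also estimates blade configuration), the sketch below estimates a coarse tower position and rotor-plane heading from an already segmented point cloud using plain NumPy; the segmentation step, hub height, and input cloud are assumed to be available.

```python
# Conceptual sketch only: coarse wind turbine pose from segmented LiDAR points.
import numpy as np

def estimate_wt_pose(points: np.ndarray, hub_height: float) -> dict:
    """points: (N, 3) LiDAR returns on the turbine, expressed in the world frame."""
    # Tower position: horizontal centroid of returns below the hub.
    tower = points[points[:, 2] < hub_height]
    x, y = tower[:, :2].mean(axis=0)

    # Rotor-plane heading: dominant horizontal direction of returns around hub
    # height, taken as the first principal component of their (x, y) coordinates.
    # The facing direction of the rotor is perpendicular to this axis.
    rotor = points[points[:, 2] >= hub_height]
    xy = rotor[:, :2] - rotor[:, :2].mean(axis=0)
    _, _, vt = np.linalg.svd(xy, full_matrices=False)
    yaw = float(np.arctan2(vt[0, 1], vt[0, 0]))

    return {"position": (float(x), float(y)), "rotor_plane_yaw": yaw}
```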

2024

Robotic Data Recovery from Seabed with Optical High-Bandwidth Communication from a Deep-Sea Lander

Authors
Almeida, J; Soares, E; Almeida, C; Matias, B; Pereira, R; Sytnyk, D; Silva, P; Ferreira, A; Machado, D; Martins, P; Martins, A;

Publication
Oceans Conference Record (IEEE)

Abstract
This paper addresses the problem of high-bandwidth communication and data recovery from deep-sea semi-permanent robotic landers. These vehicles are suitable for long-term monitoring of underwater activities and for supporting the operation of other robotic assets in Operation & Maintenance (O&M) of offshore renewables. Limitations of current underwater communication solutions prevent the immediate transmission of the collected data to the surface, which is instead stored locally inside each lander. Therefore, data recovery often implies the interruption of the designated tasks so that the vehicle can return to the surface and transmit the collected data. Resorting to a short-range, high-bandwidth optical link, an alternative underwater strategy for flexible data exchange is presented. It involves an AUV, acting as a satellite vehicle, approaching each underwater node until an optical communication channel is established. At this point, high-bandwidth communication with the remote lander becomes available, offering the possibility to perform a variety of operations, including the download of previously recorded information, the visualisation of video streams from the lander's on-board cameras, or even remote motion control of the lander. All three operations were tested and validated with the experimental setup reported here. The experiments were performed in the Atlantic Ocean, at the Setúbal underwater canyon, reaching an operating depth of 350 m. Two autonomous robotic platforms were used in the experiments, namely the TURTLE3 lander and the EVA Hybrid Autonomous Underwater Vehicle. Since EVA kept a tethered fibre-optic connection to the Mar Profundo support vessel, it was possible to establish a full communication chain between a land-based control centre and the remote underwater nodes. © 2024 IEEE.
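The recovery sequence described above (approach, establish the optical link, then transfer data) can be pictured with the conceptual sketch below. It is not the project's software: the link-quality call, the lander address, the simple GET protocol, and the file names are hypothetical placeholders, with the transfer itself shown as an ordinary TCP stream tunnelled through the optical modems.

```python
# Conceptual sketch of the AUV-side data-recovery sequence (assumptions noted above).
import socket
import time

LANDER_ADDR = ("192.168.10.2", 5000)   # lander-side file server (assumed address)
LINK_OK_THRESHOLD = 0.8                # normalised optical link quality (assumed)

def link_quality() -> float:
    """Placeholder for a modem driver call reporting optical channel quality."""
    raise NotImplementedError

def recover_data(filename: str, out_path: str) -> None:
    # 1. Approach phase: hold position and wait until the short-range optical
    #    channel is usable.
    while link_quality() < LINK_OK_THRESHOLD:
        time.sleep(1.0)

    # 2. Download phase: request the stored file and stream it to local storage.
    with socket.create_connection(LANDER_ADDR, timeout=10) as sock, \
         open(out_path, "wb") as out:
        sock.sendall(f"GET {filename}\n".encode())
        while chunk := sock.recv(65536):
            out.write(chunk)
```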

2024

Acoustic Imaging Learning-Based Approaches for Marine Litter Detection and Classification

Authors
Guedes, PA; Silva, HM; Wang, S; Martins, A; Almeida, J; Silva, E;

Publication
Journal of Marine Science and Engineering

Abstract
This paper introduces an advanced acoustic imaging system leveraging multibeam water column data at various frequencies to detect and classify marine litter. This study encompasses (i) the acquisition of test tank data for diverse types of marine litter at multiple acoustic frequencies; (ii) the creation of a comprehensive acoustic image dataset with meticulous labelling and formatting; (iii) the implementation of sophisticated classification algorithms, namely support vector machine (SVM) and convolutional neural network (CNN), alongside cutting-edge detection algorithms based on transfer learning, including single-shot multibox detector (SSD) and You Only Look Once (YOLO), specifically YOLOv8. The findings reveal discrimination between different classes of marine litter across the implemented algorithms for both detection and classification. Furthermore, cross-frequency studies were conducted to assess model generalisation, evaluating the performance of models trained on one acoustic frequency when tested with acoustic images based on different frequencies. This approach underscores the potential of multibeam data in the detection and classification of marine litter in the water column, paving the way for developing novel research methods in real-life environments.
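As a minimal sketch of the SVM baseline mentioned in the abstract (not the authors' pipeline), the snippet below trains an RBF-kernel SVM on flattened acoustic images with scikit-learn; the arrays X (acoustic intensity images) and y (litter class labels) are assumed to come from the labelled multibeam dataset.

```python
# Illustrative SVM baseline for acoustic image classification (assumed data).
import numpy as np
from sklearn.model_selection import train_test_split
from sklearn.svm import SVC
from sklearn.metrics import classification_report

def train_svm_baseline(X: np.ndarray, y: np.ndarray) -> SVC:
    """X: (N, H, W) acoustic intensity images, y: (N,) integer class labels."""
    # Flatten each acoustic image into a feature vector for the SVM.
    X_flat = X.reshape(len(X), -1)
    X_tr, X_te, y_tr, y_te = train_test_split(
        X_flat, y, test_size=0.2, stratify=y, random_state=0)

    clf = SVC(kernel="rbf", C=1.0)
    clf.fit(X_tr, y_tr)
    print(classification_report(y_te, clf.predict(X_te)))
    return clf
```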
