2022
Authors
Pereira, PNDAD; Campilho, RDSG; Pinto, AMG;
Publication
MACHINES
Abstract
A major effort is being put into the production of green energy as a countermeasure to climate change and a step towards sustainability. The energy industry is therefore investing in offshore wind energy, using wind turbines with fixed and floating platforms. This technology can benefit greatly from intervention autonomous underwater vehicles (AUVs) that assist in the maintenance and control of underwater structures. A wireless charging system can extend the time the AUV remains underwater by allowing it to charge its batteries at a docking station. The present work details the development process of a housing component for a wireless charging system to be implemented in an AUV, referred to as the wireless charger housing (WCH), from the concept stage to final physical verification and operation. The wireless charging system prepared in this research aims to extend vehicle mission longevity by enabling battery charging at a docking station, without the vehicle having to return to the surface. The product was designed following a design for excellence (DfX) and modular design philosophy, implementing visual scorecards to measure the success of specific design aspects. The Ashby method was applied for an adequate choice of materials. The structural performance of the prototypes was validated via linear static finite element analysis (FEA), and the prototypes were further physically verified in a hyperbaric chamber. Results showed that the application of FEA, together with well-defined design goals, enables WCH optimisation while ensuring up to 75% power efficiency. This methodology produced a system capable of transmitting energy for underwater robotic applications.
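As a brief illustration of the Ashby material-selection step mentioned in the abstract, the Python sketch below ranks candidate housing materials by a performance index. The abstract does not state which materials, properties, or index were actually used; the specific strength index and all property values here are illustrative assumptions, not the paper's data.

# Hedged sketch of Ashby material selection: rank candidates by a
# performance index. M = sigma_f / rho (specific strength) is one common
# example; the actual WCH constraint and index may differ. All property
# values below are rough, illustrative assumptions.

candidates = {
    # material: (failure strength [MPa], density [kg/m^3]) -- assumed values
    "Aluminium 6061-T6": (276, 2700),
    "POM (acetal)": (70, 1410),
    "PEEK": (100, 1320),
    "316L stainless steel": (290, 8000),
}

def ashby_index(strength_mpa: float, density: float) -> float:
    """Specific strength index M = sigma_f / rho (higher is better)."""
    return strength_mpa * 1e6 / density

ranked = sorted(candidates.items(),
                key=lambda kv: ashby_index(*kv[1]), reverse=True)
for name, (sigma, rho) in ranked:
    print(f"{name:22s} M = {ashby_index(sigma, rho):9.0f} Pa·m^3/kg")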
2022
Authors
Agostinho, LR; Ricardo, NM; Pereira, MI; Hiolle, A; Pinto, AM;
Publication
IEEE ACCESS
Abstract
The expansion of autonomous driving operations requires the research and development of accurate and reliable self-localization approaches. These include visual odometry methods, whose accuracy is potentially superior to that of GNSS-based techniques and which also work in signal-denied areas. This paper presents an in-depth review of state-of-the-art visual and point cloud odometry methods, along with a direct performance comparison of some of these techniques in the autonomous driving context. The evaluated methods include camera, LiDAR, and multi-modal approaches, featuring knowledge- and learning-based algorithms, which are compared from a common perspective. This set is subjected to a series of tests on public road driving datasets, from which the performance of these techniques is benchmarked and quantitatively measured. Furthermore, we closely discuss their effectiveness against challenging conditions such as pronounced lighting variations, open spaces, and the presence of dynamic objects in the scene. The research demonstrates the increased accuracy of point cloud-based methods, which surpass visual techniques by roughly 33.14% in trajectory error. This survey also identifies a performance stagnation in state-of-the-art methodologies, especially in complex conditions. We also examine how multi-modal architectures can circumvent individual sensor limitations. This aligns with the benchmarking results, where the multi-modal algorithms exhibit greater consistency across all scenarios, outperforming the best LiDAR method (CT-ICP) by 5.68% in translational drift. Additionally, we address how current AI advances constitute a way to overcome the current development plateau.
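To make the trajectory-error comparison above concrete, here is a minimal Python sketch of one common odometry metric, absolute trajectory error (ATE) RMSE. It assumes time-synchronised, already-aligned trajectories (e.g. after a Horn/Umeyama rigid alignment); the paper's exact benchmarking protocol and drift metrics may differ, and the toy trajectory is fabricated for demonstration.

# Minimal sketch: ATE RMSE between estimated and ground-truth positions.
import numpy as np

def ate_rmse(est_xyz: np.ndarray, gt_xyz: np.ndarray) -> float:
    """RMS of per-pose Euclidean position errors (both arrays are N x 3)."""
    errors = np.linalg.norm(est_xyz - gt_xyz, axis=1)
    return float(np.sqrt(np.mean(errors ** 2)))

# Toy 3-pose example (fabricated values, for illustration only):
gt = np.array([[0.0, 0.0, 0.0], [1.0, 0.0, 0.0], [2.0, 0.0, 0.0]])
est = gt + np.array([[0.0, 0.05, 0.0], [0.1, 0.0, 0.0], [0.0, 0.0, 0.1]])
print(f"ATE RMSE: {ate_rmse(est, gt):.3f} m")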
2022
Authors
Duarte, DF; Pereira, MI; Pinto, AM;
Publication
MARINE TECHNOLOGY SOCIETY JOURNAL
Abstract
Recently, research concerning the navigation of autonomous surface vehicles (ASVs) has been increasing. However, a large-scale implementation of these vessels is still held back by several challenges, such as multi-object tracking. Attaining accurate object detection plays a key role in achieving successful tracking. This article presents the development of a detection model with an image-based Convolutional Neural Network trained through transfer learning, a deep learning technique. To train, test, and validate the detector module, data were collected with the SENSE ASV by sailing through two nearby ports, Leixoes and Viana do Castelo, and recording video frames through its on-board cameras, along with Light Detection and Ranging (LiDAR), GPS, and Inertial Measurement Unit data. Images were extracted from the collected data, composing a manually annotated dataset with nine classes of different vessels, complemented with data from other open-source maritime datasets. The developed model achieved a class mAP@[.5:.95] (mean average precision) of 89.5% and a clear improvement in boat detection compared to a multi-purpose state-of-the-art detector, YOLO-v4, with a 22.9% and 44.3% increase in the mAP at an Intersection over Union threshold of 50% and in the mAP@[.5:.95], respectively. The model was integrated into a detection and tracking system, being able to continuously detect nearby vessels and provide sufficient information for simple navigation tasks.
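As a pointer to the evaluation metric reported above, the Python sketch below shows the Intersection over Union (IoU) computation that underlies mAP@[.5:.95]: a detection counts as a true positive at a given threshold when its IoU with a ground-truth box reaches that threshold. Full mAP additionally requires confidence-ranked precision/recall curves per class, which are omitted here; boxes and thresholds are illustrative.

# IoU between two axis-aligned boxes given as (x1, y1, x2, y2).
def iou(box_a, box_b) -> float:
    ax1, ay1, ax2, ay2 = box_a
    bx1, by1, bx2, by2 = box_b
    ix1, iy1 = max(ax1, bx1), max(ay1, by1)
    ix2, iy2 = min(ax2, bx2), min(ay2, by2)
    inter = max(0.0, ix2 - ix1) * max(0.0, iy2 - iy1)
    union = ((ax2 - ax1) * (ay2 - ay1)
             + (bx2 - bx1) * (by2 - by1) - inter)
    return inter / union if union > 0 else 0.0

# COCO-style thresholds 0.50:0.05:0.95 averaged by mAP@[.5:.95]:
thresholds = [0.5 + 0.05 * i for i in range(10)]
print(iou((0, 0, 10, 10), (5, 0, 15, 10)))  # -> 0.333...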
2022
Authors
Neves, F; F. Reis, M; Andrade, G; Aguiar, AP; Pinto, AM;
Publication
Abstract
2022
Authors
Teixeira, B; Lima, AP; Pinho, C; Viegas, D; Dias, N; Silva, H; Almeida, J;
Publication
2022 OCEANS HAMPTON ROADS
Abstract
The Feedfirst Intelligent Monitoring System is a novel tool for intelligent monitoring of fish nurseries in aquaculture scenarios, focusing on three essential tasks: water quality control, biomass estimation, and automated feeding. The system is based on machine vision techniques for detecting the fish larvae population size, while larvae biomass is estimated through size measurement. We also show that the perception-actuation loop in automated fish tanks can be closed by using the vision system output to influence feeding procedures. The proposed solution was tested in a real tank in an aquaculture setting, with real-time performance and logging capabilities.
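As a rough illustration of the kind of machine-vision building block such a counting and sizing pipeline might use, here is a contour-based Python/OpenCV sketch. The abstract does not describe the actual Feedfirst algorithms; the input file name, thresholding choice, and area cut-off below are all assumptions for demonstration.

# Hedged sketch: count dark larvae blobs on a bright tank background and
# measure their pixel areas as a crude size proxy. Requires opencv-python;
# 'frame.png' and the parameters are illustrative assumptions.
import cv2

frame = cv2.imread("frame.png", cv2.IMREAD_GRAYSCALE)
_, mask = cv2.threshold(frame, 0, 255,
                        cv2.THRESH_BINARY_INV + cv2.THRESH_OTSU)
contours, _ = cv2.findContours(mask, cv2.RETR_EXTERNAL,
                               cv2.CHAIN_APPROX_SIMPLE)

# Discard tiny noise blobs, then report count and mean blob area:
min_area_px = 20  # assumed noise floor
areas = [cv2.contourArea(c) for c in contours
         if cv2.contourArea(c) >= min_area_px]
print(f"larvae count: {len(areas)}")
if areas:
    print(f"mean area (px^2): {sum(areas) / len(areas):.1f}")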