2024
Authors
Minhoto, V; Santos, T; Silva, LTE; Rodrigues, P; Arrais, A; Amaral, A; Dias, A; Almeida, J; Cunha, JPS;
Publication
ROBOT 2023: SIXTH IBERIAN ROBOTICS CONFERENCE, VOL 2
Abstract
Over the last few years, man-machine collaborative systems have become increasingly present in daily routines. In these systems, one operator usually controls the machine through explicit commands and assesses the information through a graphical user interface. Direct and implicit interaction between the machine and the user does not exist. This work presents a man-machine symbiotic concept and system in which such implicit interaction is possible, targeting search and rescue scenarios. Based on measuring physiological variables (e.g. body movement or electrocardiogram) through wearable devices, this system is capable of computing the psycho-physiological state of the human and autonomously identifying abnormal situations (e.g. a fall or stress). This information is injected into the control loop of the machine, which can alter its behavior accordingly, enabling an implicit man-machine communication mechanism. A proof of concept of this system was tested at the ARTEX (ARmy Technological EXperimentation) exercise organized by the Portuguese Army, involving a military agent and a drone. During this event, the soldier was equipped with a kit of wearables that could monitor several physiological variables and automatically detect a fall during a mission. This information was continuously sent to the drone, which successfully identified this abnormal situation, triggering the take-off and a situation-awareness fly-by flight pattern and delivering a first-aid kit to the soldier in case he did not recover after a pre-determined time period. The results were very positive, proving the possibility and feasibility of a symbiotic system between humans and machines.
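As an illustration of the kind of onboard processing this abstract describes, the following minimal Python sketch flags a fall from a stream of 3-axis accelerometer samples by looking for a free-fall dip followed by an impact spike. The thresholds, window length, and function names are assumptions made for this example and are not taken from the paper's wearable kit.

```python
import math
from collections import deque

# Hypothetical thresholds for this sketch (accelerations in g).
FREE_FALL_G = 0.4   # magnitude drop suggesting free fall
IMPACT_G = 2.5      # magnitude spike suggesting impact
WINDOW = 50         # samples inspected after a free-fall candidate

def detect_fall(samples):
    """Return True if a free-fall phase is followed by an impact spike.

    `samples` is an iterable of (ax, ay, az) accelerometer readings.
    """
    mags = deque(maxlen=WINDOW)
    free_fall_seen = False
    for ax, ay, az in samples:
        mag = math.sqrt(ax * ax + ay * ay + az * az)
        mags.append(mag)
        if mag < FREE_FALL_G:
            free_fall_seen = True
            mags.clear()
        elif free_fall_seen and mag > IMPACT_G:
            return True
        elif free_fall_seen and len(mags) == WINDOW:
            free_fall_seen = False  # no impact within the window; reset
    return False
```

A real deployment, as the abstract outlines, would fuse such an event with the other physiological signals (e.g. ECG) before injecting the detected state into the machine's control loop.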
2024
Authors
Morais, R; Martins, JJ; Lima, P; Dias, A; Martins, A; Almeida, J; Silva, E;
Publication
OCEANS 2024 - SINGAPORE
Abstract
Solar energy will contribute to global economic growth, and worldwide photovoltaic (PV) solar energy production keeps increasing. One of the outstanding energy achievements of the last decade has been the development of floating photovoltaic panels. These panels differ from conventional (terrestrial) panels in that they occupy aquatic areas, which is more environmentally friendly, while land areas are saved for other applications, such as construction or agriculture. Developing autonomous inspection systems using unmanned aerial vehicles (UAVs) represents a significant step forward in solar PV technology. Given the frequently remote and difficult-to-access locations, traditional inspection methods are no longer practical or suitable. Responding to these challenges, an innovative inspection framework was developed to autonomously inspect offshore photovoltaic plants with a Vertical Takeoff and Landing (VTOL) UAV. This work explores two different methods of autonomous aerial inspection, each adapted to specific scenarios, thus increasing the adaptability of the inspection process. During the flight, the aerial images are evaluated in real time for the autonomous detection of the photovoltaic modules and of possible faults. This mechanism is crucial for making decisions and taking immediate corrective action. An offshore simulation environment was developed to validate the implemented system.
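One simple way to illustrate the real-time fault screening mentioned above is a thermographic hotspot check: pixels of a PV module that are markedly hotter than the rest of the panel often indicate defective cells. The Python/OpenCV sketch below is an assumed illustration only; the intensity threshold, the function name, and the use of a precomputed panel mask are not taken from the paper.

```python
import cv2
import numpy as np

def find_hotspots(thermal_frame_8bit, panel_mask, delta_threshold=30):
    """Flag pixels significantly hotter than the median of the panel region.

    `thermal_frame_8bit` is a single-channel uint8 thermographic image and
    `panel_mask` a binary mask of the detected PV module. The 30-count
    threshold on the 8-bit intensity scale is an assumption for this sketch.
    """
    panel_pixels = thermal_frame_8bit[panel_mask > 0]
    if panel_pixels.size == 0:
        return np.zeros_like(thermal_frame_8bit)
    median = np.median(panel_pixels)
    hot = (thermal_frame_8bit.astype(np.int16) - int(median)) > delta_threshold
    hot = np.logical_and(hot, panel_mask > 0).astype(np.uint8) * 255
    # Morphological opening removes isolated noisy pixels.
    kernel = np.ones((3, 3), np.uint8)
    return cv2.morphologyEx(hot, cv2.MORPH_OPEN, kernel)
```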
2024
Authors
Santos, T; Cunha, T; Dias, A; Moreira, AP; Almeida, J;
Publication
SENSORS
Abstract
Inspecting and maintaining power lines is essential for ensuring the safety, reliability, and efficiency of electrical infrastructure. This process involves regular assessment to identify hazards such as damaged wires, corrosion, or vegetation encroachment, followed by timely maintenance to prevent accidents and power outages. By conducting routine inspections and maintenance, utilities can comply with regulations, enhance operational efficiency, and extend the lifespan of power lines and equipment. Unmanned Aerial Vehicles (UAVs) can play a relevant role in this process by increasing efficiency through rapid coverage of large areas and access to difficult-to-reach locations, enhancing safety by minimizing risks to personnel in hazardous environments, and reducing costs compared to traditional methods. UAVs equipped with sensors such as visual and thermographic cameras enable the accurate collection of high-resolution data, facilitating the early detection of defects and other potential issues. To ensure the safety of the autonomous inspection process, UAVs must be capable of performing onboard processing, particularly for the detection of power lines and obstacles. In this paper, we address the development of a deep learning approach with YOLOv8 for power line detection based on visual and thermographic images. The developed solution was validated with a UAV during a power line inspection mission, obtaining mAP@0.5 results of over 90.5% on visible images and over 96.9% on thermographic images.
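Since the abstract names YOLOv8 as the detector, a minimal inference sketch with the Ultralytics Python API is shown below. The weights file, image path, and confidence threshold are placeholders for this example; the paper's trained models for visible and thermographic imagery are not distributed with the abstract.

```python
from ultralytics import YOLO  # Ultralytics YOLOv8 package
import cv2

# Placeholder weights file for a detector fine-tuned on power-line images.
model = YOLO("powerline_yolov8n.pt")

frame = cv2.imread("inspection_frame.jpg")
results = model.predict(frame, conf=0.5, imgsz=640)

# Draw the detected power-line bounding boxes onto the frame.
for box in results[0].boxes:
    x1, y1, x2, y2 = map(int, box.xyxy[0].tolist())
    cv2.rectangle(frame, (x1, y1), (x2, y2), (0, 255, 0), 2)
cv2.imwrite("inspection_frame_detections.jpg", frame)
```

For onboard use, the same call can run on successive camera frames, with the mAP@0.5 figures above suggesting the detector is reliable on both visible and thermographic inputs.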
2024
Authors
Oliveira, A; Dias, A; Santos, T; Rodrigues, P; Martins, A; Almeida, J;
Publication
DRONES
Abstract
The deployment of offshore wind turbines (WTs) has emerged as a pivotal strategy in the transition to renewable energy, offering significant potential for clean electricity generation. However, the operation and maintenance (O&M) of these structures present unique challenges due to their remote locations and harsh marine environments. For these reasons, it is fundamental to promote the development of autonomous solutions to monitor the health condition of the construction parts, preventing structural damage and accidents. This paper explores the application of Unmanned Aerial Vehicles (UAVs) in the inspection and maintenance of offshore wind turbines, introducing a new strategy for autonomous wind turbine inspection and a simulation environment for testing and training autonomous inspection techniques under a more realistic offshore scenario. Instead of relying on visual information to detect the WT parts during the inspection, the proposed approach uses three-dimensional (3D) light detection and ranging (LiDAR) data to estimate the wind turbine pose (position, orientation, and blade configuration) and autonomously control the UAV for a close inspection maneuver. The first tests were carried out mainly in a simulation framework, combining different WT poses (different orientations, blade positions, and wind turbine movements), followed by a mixed-reality test in which a real vehicle performed a full inspection of a virtual wind turbine.
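To give a flavor of how a turbine pose can be derived from LiDAR data, the Python sketch below estimates the tower position and axis from a single point cloud using a principal-component fit. This is a simplified, assumed illustration only; the paper's method additionally recovers the blade configuration and drives the UAV controller, neither of which is shown here.

```python
import numpy as np

def estimate_tower_axis(points, min_height=5.0):
    """Return (centroid, unit axis) of a tower-like structure in a LiDAR scan.

    `points` is an (N, 3) array in the UAV's local frame; `min_height` is an
    assumed cut-off used to drop sea-surface and platform returns.
    """
    tower = points[points[:, 2] > min_height]
    if tower.shape[0] < 3:
        raise ValueError("not enough points above the height cut-off")
    centroid = tower.mean(axis=0)
    centered = tower - centroid
    # The dominant principal component of a slender tower approximates its axis.
    _, _, vt = np.linalg.svd(centered, full_matrices=False)
    axis = vt[0]
    if axis[2] < 0:  # orient the axis upward
        axis = -axis
    return centroid, axis / np.linalg.norm(axis)
```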