2024
Authors
Minhoto, V; Santos, T; Silva, LTE; Rodrigues, P; Arrais, A; Amaral, A; Dias, A; Almeida, J; Cunha, JPS;
Publication
ROBOT 2023: SIXTH IBERIAN ROBOTICS CONFERENCE, VOL 2
Abstract
Over the last few years, man-machine collaborative systems have become increasingly present in daily routines. In these systems, one operator usually controls the machine through explicit commands and assesses the information through a graphical user interface. Direct and implicit interaction between the machine and the user does not exist. This work presents a man-machine symbiotic concept and system where such implicit interaction is possible, targeting search and rescue scenarios. By measuring physiological variables (e.g., body movement or electrocardiogram) through wearable devices, this system can compute the psycho-physiological state of the human and autonomously identify abnormal situations (e.g., a fall or stress). This information is injected into the control loop of the machine, which can alter its behavior accordingly, enabling an implicit man-machine communication mechanism. A proof of concept of this system was tested at the ARTEX (ARmy Technological EXperimentation) exercise organized by the Portuguese Army, involving a military agent and a drone. During this event, the soldier was equipped with a kit of wearables that could monitor several physiological variables and automatically detect a fall during a mission. This information was continuously sent to the drone, which successfully identified this abnormal situation, triggering the take-off and a situation-awareness fly-by flight pattern and delivering a first-aid kit to the soldier if he did not recover after a pre-determined time period. The results were very positive, proving the possibility and feasibility of a symbiotic system between humans and machines.
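The fall detection described in the abstract can be illustrated with a minimal sketch: a sharp spike in accelerometer magnitude followed by near-stationary readings. The thresholds and window size below are hypothetical illustrations, not the paper's actual detector, which fuses several physiological signals.

```python
import math

def detect_fall(samples, impact_g=2.5, rest_g=1.1, rest_window=5):
    """Flag a fall: an impact spike followed by near-stationary readings.

    samples: list of (ax, ay, az) accelerometer readings in g.
    Thresholds are illustrative; a real system would be tuned and would
    fuse other wearable signals (e.g., ECG) as the paper describes.
    """
    mags = [math.sqrt(ax * ax + ay * ay + az * az) for ax, ay, az in samples]
    for i, m in enumerate(mags):
        if m >= impact_g:  # candidate impact
            after = mags[i + 1:i + 1 + rest_window]
            # Post-impact stillness: magnitude stays close to 1 g (gravity only)
            if len(after) == rest_window and all(a <= rest_g for a in after):
                return True
    return False

# Normal walking: magnitudes hover around 1 g, no alarm
walking = [(0.1, 0.2, 0.98)] * 10
# Fall: impact spike, then lying still
fall = [(0.1, 0.2, 0.98)] * 3 + [(2.0, 1.5, 1.8)] + [(0.0, 0.05, 1.0)] * 6
print(detect_fall(walking))  # False
print(detect_fall(fall))     # True
```

In the deployed system, a positive detection like this is what gets injected into the drone's control loop to trigger the fly-by behavior.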
2024
Authors
Morais, R; Martins, JJ; Lima, P; Dias, A; Martins, A; Almeida, J; Silva, E;
Publication
OCEANS 2024 - SINGAPORE
Abstract
Solar energy will contribute to global economic growth, increasing worldwide photovoltaic (PV) solar energy production. One of the outstanding energy achievements of the last decade has been the development of floating photovoltaic panels. These panels differ from conventional (terrestrial) panels because they occupy space in a more environmentally friendly way, i.e., aquatic areas, while land areas are saved for other applications, such as construction or agriculture. Developing autonomous inspection systems using unmanned aerial vehicles (UAVs) represents a significant step forward in solar PV technology. Given the frequently remote and difficult-to-access locations, traditional inspection methods are no longer practical or suitable. Responding to these challenges, an innovative inspection framework was developed to autonomously inspect offshore photovoltaic plants with a Vertical Takeoff and Landing (VTOL) UAV. This work explores two different methods of autonomous aerial inspection, each adapted to specific scenarios, thus increasing the adaptability of the inspection process. During the flight, the aerial images are evaluated in real time for the autonomous detection of the photovoltaic modules and of possible faults. This mechanism is crucial for making decisions and taking immediate corrective action. An offshore simulation environment was developed to validate the implemented system.
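A common fault cue in PV inspection is a hot spot in the thermographic image. As a minimal sketch of that idea (the `delta` threshold and per-cell grid are illustrative assumptions, not the paper's detector, which runs on full aerial images):

```python
def find_hotspots(thermal, delta=15.0):
    """Flag PV module cells whose temperature exceeds the panel mean by delta degrees C.

    thermal: 2D list of per-cell temperatures from a thermographic image.
    Illustrative threshold rule; a fielded detector would use a learned model.
    """
    cells = [t for row in thermal for t in row]
    mean = sum(cells) / len(cells)
    return [(r, c) for r, row in enumerate(thermal)
                   for c, t in enumerate(row) if t - mean > delta]

panel = [
    [34.0, 35.0, 33.5],
    [34.5, 61.0, 34.0],   # 61 C cell: likely hot spot (e.g., a cracked cell)
    [33.0, 34.5, 35.0],
]
print(find_hotspots(panel))  # [(1, 1)]
```

Flagged cells like these are what would drive the "immediate corrective action" decisions made during the flight.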
2024
Authors
Santos, T; Cunha, T; Dias, A; Moreira, AP; Almeida, J;
Publication
SENSORS
Abstract
Inspecting and maintaining power lines is essential for ensuring the safety, reliability, and efficiency of electrical infrastructure. This process involves regular assessment to identify hazards such as damaged wires, corrosion, or vegetation encroachment, followed by timely maintenance to prevent accidents and power outages. By conducting routine inspections and maintenance, utilities can comply with regulations, enhance operational efficiency, and extend the lifespan of power lines and equipment. Unmanned Aerial Vehicles (UAVs) can play a relevant role in this process by increasing efficiency through rapid coverage of large areas and access to difficult-to-reach locations, enhancing safety by minimizing risks to personnel in hazardous environments, and offering cost-effectiveness compared to traditional methods. UAVs equipped with sensors such as visual and thermographic cameras enable the accurate collection of high-resolution data, facilitating early detection of defects and other potential issues. To ensure the safety of the autonomous inspection process, UAVs must be capable of performing onboard processing, particularly for the detection of power lines and obstacles. In this paper, we address the development of a deep learning approach with YOLOv8 for power line detection based on visual and thermographic images. The developed solution was validated with a UAV during a power line inspection mission, obtaining mAP@0.5 results of over 90.5% on visible images and over 96.9% on thermographic images.
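The mAP@0.5 metric reported above counts a detection as correct when its intersection-over-union (IoU) with a ground-truth box reaches 0.5. A minimal sketch of that matching step (greedy matching on toy boxes; the full metric also averages precision over confidence thresholds and classes):

```python
def iou(a, b):
    """Intersection-over-union of two axis-aligned boxes (x1, y1, x2, y2)."""
    ix1, iy1 = max(a[0], b[0]), max(a[1], b[1])
    ix2, iy2 = min(a[2], b[2]), min(a[3], b[3])
    inter = max(0, ix2 - ix1) * max(0, iy2 - iy1)
    area_a = (a[2] - a[0]) * (a[3] - a[1])
    area_b = (b[2] - b[0]) * (b[3] - b[1])
    return inter / (area_a + area_b - inter)

def matches_at_05(preds, gts):
    """Greedily match predictions to ground truth at the IoU 0.5 threshold."""
    used, tp = set(), 0
    for p in preds:
        best, best_i = 0.0, None
        for i, g in enumerate(gts):
            if i in used:
                continue
            v = iou(p, g)
            if v > best:
                best, best_i = v, i
        if best >= 0.5 and best_i is not None:
            used.add(best_i)
            tp += 1
    return tp

gts = [(10, 10, 50, 50), (60, 60, 100, 100)]
preds = [(12, 11, 49, 52), (90, 90, 130, 130)]  # one good hit, one poor overlap
print(matches_at_05(preds, gts))  # 1
```

This IoU-threshold matching is the building block behind the 90.5% (visible) and 96.9% (thermographic) figures quoted in the abstract.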
2024
Authors
Oliveira, A; Dias, A; Santos, T; Rodrigues, P; Martins, A; Almeida, J;
Publication
DRONES
Abstract
The deployment of offshore wind turbines (WTs) has emerged as a pivotal strategy in the transition to renewable energy, offering significant potential for clean electricity generation. However, these structures' operation and maintenance (O&M) present unique challenges due to their remote locations and harsh marine environments. For these reasons, it is fundamental to promote the development of autonomous solutions to monitor the health condition of the structural components, preventing structural damage and accidents. This paper explores the application of Unmanned Aerial Vehicles (UAVs) in the inspection and maintenance of offshore wind turbines, introducing a new strategy for autonomous wind turbine inspection and a simulation environment for testing and training autonomous inspection techniques under a more realistic offshore scenario. Instead of relying on visual information to detect the WT parts during the inspection, this method uses three-dimensional (3D) light detection and ranging (LiDAR) to estimate the wind turbine pose (position, orientation, and blade configuration) and autonomously control the UAV for a close inspection maneuver. The first tests were carried out mainly in a simulation framework, combining different WT poses, including different orientations, blade positions, and wind turbine movements, and, finally, a mixed-reality test, where a real vehicle performed a full inspection of a virtual wind turbine.
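Estimating an orientation from a LiDAR point cloud often reduces to finding the dominant direction of a point set. As a toy 2D stand-in for the paper's 3D pose estimation (the closed-form covariance-eigenvector angle below is a standard technique, not the authors' specific pipeline):

```python
import math

def principal_angle(points):
    """Dominant direction (radians) of a 2D point set via its covariance matrix.

    Applied to, e.g., the projected points of one blade, the same idea yields
    a blade orientation estimate; the paper works with full 3D LiDAR scans.
    """
    n = len(points)
    mx = sum(p[0] for p in points) / n
    my = sum(p[1] for p in points) / n
    sxx = sum((p[0] - mx) ** 2 for p in points) / n
    syy = sum((p[1] - my) ** 2 for p in points) / n
    sxy = sum((p[0] - mx) * (p[1] - my) for p in points) / n
    # Closed-form principal-eigenvector angle of the 2x2 covariance matrix
    return 0.5 * math.atan2(2 * sxy, sxx - syy)

# Noiseless points along a 45-degree line
pts = [(i, i) for i in range(10)]
print(round(math.degrees(principal_angle(pts)), 1))  # 45.0
```

In the inspection loop, a pose estimate like this is what lets the controller keep the UAV at a fixed standoff from the blade it is scanning.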
2024
Authors
Dias, A; Martins, JJ; Antunes, J; Moura, A; Almeida, J;
Publication
2024 7TH IBERIAN ROBOTICS CONFERENCE, ROBOT 2024
Abstract
This paper presents the Unmanned Aerial Vehicle (UAV) MANTIS, developed for indoor inventory management in large-scale warehouses. MANTIS integrates a visual-inertial odometry (VIO) system for precise localization, thus allowing indoor navigation in complex environments. The mechanical design was optimized for stability and maneuverability in confined spaces, incorporating a lightweight frame and an efficient propulsion system. The UAV is equipped with an array of sensors, including a 2D LiDAR, six cameras, and two IMUs, which ensures accurate data collection. The VIO system integrates visual data with inertial measurements to maintain robust, drift-free localization. A behavior tree (BT) framework handles the mission plan assigned to the vehicle, keeping it flexible and adaptive in response to dynamic warehouse conditions. To validate the accuracy and reliability of the VIO system, we conducted a series of tests using an OptiTrack motion capture system as a ground truth reference. Comparative analysis between the VIO and OptiTrack data demonstrates the efficacy of the VIO system in maintaining accurate localization. The results prove that MANTIS, with the required payload sensors, is a viable solution for efficient and autonomous inventory management.
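A behavior tree composes reactive mission logic from small reusable nodes. A minimal sketch of the idea (the node set and the battery-check mission below are hypothetical illustrations, not the MANTIS mission planner):

```python
class Sequence:
    """Ticks children in order; fails on the first failure."""
    def __init__(self, *children): self.children = children
    def tick(self, state):
        return all(c.tick(state) for c in self.children)

class Fallback:
    """Ticks children in order; succeeds on the first success."""
    def __init__(self, *children): self.children = children
    def tick(self, state):
        return any(c.tick(state) for c in self.children)

class Condition:
    """Leaf node: evaluates a predicate on the shared state."""
    def __init__(self, fn): self.fn = fn
    def tick(self, state): return self.fn(state)

class Action:
    """Leaf node: records the action it would execute, then succeeds."""
    def __init__(self, name): self.name = name
    def tick(self, state):
        state.setdefault("log", []).append(self.name)
        return True

# Hypothetical warehouse mission: scan the aisle if the battery allows,
# otherwise fall back to returning to the charging dock.
mission = Fallback(
    Sequence(Condition(lambda s: s["battery"] > 0.3), Action("scan_aisle")),
    Action("return_to_dock"),
)
state = {"battery": 0.2}
mission.tick(state)
print(state["log"])  # ['return_to_dock']
```

Because each tick re-evaluates the conditions, the same tree adapts on the fly when warehouse conditions (battery, blocked aisles) change between ticks.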
2024
Authors
Martins, JJ; Amaral, A; Dias, A;
Publication
2024 7TH IBERIAN ROBOTICS CONFERENCE, ROBOT 2024
Abstract
Unmanned Aerial Vehicle (UAV) applications, particularly for indoor tasks such as inventory management, infrastructure inspection, and emergency response, are becoming increasingly complex, with dynamic environments and their different elements. During operation, the vehicle's response depends on various decisions regarding its surroundings and the task goal. Reinforcement learning techniques can solve this decision problem, helping build more reactive, adaptive, and efficient navigation operations. This paper presents a framework to simulate the navigation of a UAV in an operational environment, training and testing it with reinforcement learning models before deployment on the real drone. With the support of the 3D simulator Gazebo and the Robot Operating System (ROS) framework, we developed a training environment that can be either simple and fast or more complex and dynamic, closely matching the real-world scenario. The multi-environment simulation runs in parallel with the Deep Reinforcement Learning (DRL) algorithm to provide feedback for the training. TD3, DDPG, PPO, and PPO+LSTM agents were trained to validate the framework's training, testing, and deployment in an indoor scenario.
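The observe-act-reward-update loop at the core of such training can be shown with a much smaller stand-in: tabular Q-learning on a toy grid, where the "UAV" must reach the far corner. This is an illustrative sketch only; the paper trains TD3/DDPG/PPO agents against Gazebo, not this toy environment.

```python
import random

def train_q(episodes=3000, size=4, alpha=0.5, gamma=0.9, eps=0.2, seed=0):
    """Tabular Q-learning on a size x size grid: reach the far corner from (0, 0).

    Same loop shape as a DRL setup: observe state, pick an action
    (epsilon-greedy), receive a reward, update the value estimate.
    """
    rng = random.Random(seed)
    moves = [(0, 1), (0, -1), (1, 0), (-1, 0)]  # right, left, down, up
    goal = (size - 1, size - 1)
    q = {}
    def qv(s, a): return q.get((s, a), 0.0)
    for _ in range(episodes):
        s = (0, 0)
        for _ in range(60):
            a = rng.randrange(4) if rng.random() < eps \
                else max(range(4), key=lambda x: qv(s, x))
            dr, dc = moves[a]
            ns = (min(max(s[0] + dr, 0), size - 1),
                  min(max(s[1] + dc, 0), size - 1))
            r = 1.0 if ns == goal else -0.01  # small step penalty
            # Standard Q-learning update toward the bootstrapped target
            q[(s, a)] = qv(s, a) + alpha * (
                r + gamma * max(qv(ns, b) for b in range(4)) - qv(s, a))
            s = ns
            if s == goal:
                break
    return q

def greedy_steps(q, size=4, limit=20):
    """Roll out the learned greedy policy; return steps to goal, or None."""
    moves = [(0, 1), (0, -1), (1, 0), (-1, 0)]
    s, goal = (0, 0), (size - 1, size - 1)
    for step in range(1, limit + 1):
        a = max(range(4), key=lambda x: q.get((s, x), 0.0))
        dr, dc = moves[a]
        s = (min(max(s[0] + dr, 0), size - 1),
             min(max(s[1] + dc, 0), size - 1))
        if s == goal:
            return step
    return None

q = train_q()
print(greedy_steps(q))  # reaches the goal; the shortest path is 6 moves
```

Swapping the grid for a Gazebo/ROS environment and the table for a neural policy gives the structure the paper's framework provides around TD3, DDPG, and PPO.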