Details
Name
João Pedro Pires
Role
Researcher
Since
11th April 2023
Nationality
Portugal
Centre
Robotics and Autonomous Systems
Contacts
+351228340554
joao.p.pires@inesctec.pt
2024
Authors
Morais, R; Martins, JJ; Lima, P; Dias, A; Martins, A; Almeida, J; Silva, E;
Publication
OCEANS 2024 - SINGAPORE
Abstract
Solar energy contributes to global economic growth, and worldwide photovoltaic (PV) solar energy production continues to increase. One of the outstanding energy achievements of the last decade has been the development of floating photovoltaic panels. These panels differ from conventional (terrestrial) panels because they occupy space in a more environmentally friendly way, i.e., aquatic areas, while land areas are saved for other applications, such as construction or agriculture. Developing autonomous inspection systems using unmanned aerial vehicles (UAVs) represents a significant step forward in solar PV technology. Given the frequently remote and difficult-to-access locations, traditional inspection methods are no longer practical or suitable. Responding to these challenges, an innovative inspection framework was developed to autonomously inspect offshore photovoltaic plants with a Vertical Takeoff and Landing (VTOL) UAV. This work explores two different methods of autonomous aerial inspection, each adapted to specific scenarios, thus increasing the adaptability of the inspection process. During the flight, the aerial images are evaluated in real time for the autonomous detection of the photovoltaic modules and of possible faults. This mechanism is crucial for making decisions and taking immediate corrective action. An offshore simulation environment was developed to validate the implemented system.
2024
Authors
Dias, A; Martins, J; Antunes, J; Moura, A; Almeida, J;
Publication
2024 7th Iberian Robotics Conference, ROBOT 2024
Abstract
This paper presents the Unmanned Aerial Vehicle (UAV) MANTIS, developed for indoor inventory management in large-scale warehouses. MANTIS integrates a visual-inertial odometry (VIO) system for precise localization, thus allowing indoor navigation in complex environments. The mechanical design was optimized for stability and maneuverability in confined spaces, incorporating a lightweight frame and an efficient propulsion system. The UAV is equipped with an array of sensors, including a 2D LiDAR, six cameras, and two IMUs, which ensures accurate data collection. The VIO system fuses visual data with inertial measurements to maintain robust, drift-free localization. A behavior tree (BT) framework implements the UAV mission planner assigned to the vehicle, allowing flexible and adaptive responses to dynamic warehouse conditions. To validate the accuracy and reliability of the VIO system, we conducted a series of tests using an OptiTrack motion capture system as a ground-truth reference. Comparative analysis between the VIO and OptiTrack data demonstrates the efficacy of the VIO system in maintaining accurate localization. The results prove that MANTIS, with the required payload sensors, is a viable solution for efficient and autonomous inventory management. © 2024 IEEE.
2024
Authors
Martins, J; Amaral, A; Dias, A;
Publication
2024 7th Iberian Robotics Conference, ROBOT 2024
Abstract
Unmanned Aerial Vehicle (UAV) applications, particularly indoor tasks such as inventory management, infrastructure inspection, and emergency response, are becoming increasingly complex due to dynamic environments and their varied elements. During operation, the vehicle's response depends on a series of decisions regarding its surroundings and the task goal. Reinforcement Learning techniques can solve this decision problem, helping to build more reactive, adaptive, and efficient navigation operations. This paper presents a framework to simulate the navigation of a UAV in an operational environment, training and testing it with reinforcement learning models for further deployment on the real drone. With the support of the 3D simulator Gazebo and the Robot Operating System (ROS) framework, we developed a training environment that can be kept simple and fast, or made more complex and dynamic to closely match the real-world scenario. The multi-environment simulation runs in parallel with the Deep Reinforcement Learning (DRL) algorithm to provide feedback for the training. TD3, DDPG, PPO, and PPO+LSTM models were trained to validate the framework's training, testing, and deployment in an indoor scenario. © 2024 IEEE.
2023
Authors
Martins, JJ; Silva, M; Santos, F;
Publication
ROBOT2022: FIFTH IBERIAN ROBOTICS CONFERENCE: ADVANCES IN ROBOTICS, VOL 1
Abstract
To produce more food and tackle labor scarcity, agriculture needs safer robots for repetitive and unsafe tasks (such as spraying). The interaction between humans and robots presents challenges in ensuring a certifiably safe human-robot collaboration and a reliable system that does not damage goods or plants, in an environment that is mostly dynamic due to constant change. A well-known solution to this problem is the implementation of real-time collision avoidance systems. This paper presents a global overview of state-of-the-art methods implemented in the agricultural environment that ensure human-robot collaboration in accordance with recognised industry standards. To complement this overview, the paper addresses the gaps and possible specifications that need to be clarified in future standards, taking into consideration the human-machine safety requirements for agricultural autonomous mobile robots.
2023
Authors
Moura, A; Antunes, J; Martins, JJ; Dias, A; Martins, A; Almeida, JM; Silva, E;
Publication
OCEANS 2023 - LIMERICK
Abstract
The use of autonomous vehicles in maritime operations is a technological challenge. In the particular case of unmanned aerial vehicles (UAVs), their applications range from inspection and surveillance of offshore power plants and marine life observation to search and rescue missions. Manually landing UAVs onboard water vessels can be very challenging due to the limited space onboard and wave agitation. This paper proposes an autonomous solution for landing commercial multicopter UAVs with onboard cameras on water vessels, based on the detection of a custom landing platform with computer vision techniques. The autonomous landing behavior was tested in real conditions, using a research vessel at sea, where the UAV was able to detect, locate, and safely land on top of the developed landing platform.