About

André Silva Pinto de Aguiar received an MSc in Electrical Engineering, with a specialization in robotics, from the Faculty of Engineering of the University of Porto in 2019, and a PhD in Electrical Engineering from the University of Trás-os-Montes and Alto Douro (UTAD) in 2023. He is a researcher at the Centre for Robotics in Industry and Intelligent Systems of INESC TEC in Porto. His current research interests focus on robot navigation, Simultaneous Localization and Mapping, Computer Vision, point cloud processing, and Deep Learning. André Silva Aguiar is the author of more than 20 indexed articles and has participated in more than 10 national and European projects in the field of agricultural robotics. He currently leads Orioos, a project awarded by EUSPA as a winner of the myEUspace competition.

Details

  • Name

    André Silva Aguiar
  • Role

    Assistant Researcher
  • Since

    1st September 2019
  • Nationality

    Portuguese
  • Contacts

    +351220413317
    andre.s.aguiar@inesctec.pt
Publications

2024

Fusion of Time-of-Flight Based Sensors with Monocular Cameras for a Robotic Person Follower

Authors
Sarmento, J; dos Santos, FN; Aguiar, AS; Filipe, V; Valente, A;

Publication
JOURNAL OF INTELLIGENT & ROBOTIC SYSTEMS

Abstract
Human-robot collaboration (HRC) is becoming increasingly important in advanced production systems, such as those used in industry and agriculture. This type of collaboration can contribute to productivity increases by reducing physical strain on humans, which can lead to fewer injuries and improved morale. One crucial aspect of HRC is the ability of the robot to follow a specific human operator safely. To address this challenge, a novel methodology is proposed that employs monocular vision and ultra-wideband (UWB) transceivers to determine the relative position of a human target with respect to the robot. UWB transceivers are capable of tracking humans but exhibit a significant angular error. To reduce this error, monocular cameras with Deep Learning object detection are used to detect humans. The reduction in angular error is achieved through sensor fusion, combining the outputs of both sensors using a histogram-based filter. This filter projects and intersects the measurements from both sources onto a 2D grid. By combining UWB and monocular vision, a remarkable 66.67% reduction in angular error compared to UWB localization alone is achieved. This approach demonstrates an average processing time of 0.0183 s and an average localization error of 0.14 m when tracking a person walking at an average speed of 0.21 m/s. This novel algorithm holds promise for enabling efficient and safe human-robot collaboration, providing a valuable contribution to the field of robotics.
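The grid-based intersection described in the abstract can be sketched as follows. This is a minimal illustration, not the paper's implementation: the grid size, resolution, Gaussian sensor models, and noise parameters (`sigma_r`, `sigma_b`) are all assumptions made for the example.

```python
import numpy as np

def fuse_uwb_camera(uwb_range, cam_bearing, grid_size=4.0, res=0.05,
                    sigma_r=0.10, sigma_b=0.30):
    """Intersect a UWB range measurement (metres) with a camera bearing
    (radians) on a robot-centred 2D grid and return the fused estimate."""
    n = int(grid_size / res)
    xs = np.linspace(-grid_size / 2, grid_size / 2, n)
    X, Y = np.meshgrid(xs, xs)
    r = np.hypot(X, Y)      # range of each grid cell from the robot
    b = np.arctan2(Y, X)    # bearing of each grid cell
    # Gaussian likelihood of each cell under each sensor model
    p_uwb = np.exp(-0.5 * ((r - uwb_range) / sigma_r) ** 2)
    p_cam = np.exp(-0.5 * ((b - cam_bearing) / sigma_b) ** 2)
    p = p_uwb * p_cam       # cell-wise intersection of both measurements
    i, j = np.unravel_index(np.argmax(p), p.shape)
    return X[i, j], Y[i, j]
```

The product of the two likelihood fields keeps only cells consistent with both the UWB range annulus and the camera bearing wedge, which is what reduces the angular error of UWB alone.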

2023

Nano Aerial Vehicles for Tree Pollination

Authors
Pinheiro, I; Aguiar, A; Figueiredo, A; Pinho, T; Valente, A; Santos, F;

Publication
APPLIED SCIENCES-BASEL

Abstract
Currently, Unmanned Aerial Vehicles (UAVs) are considered in the development of various applications in agriculture, which has led to the expansion of the agricultural UAV market. However, Nano Aerial Vehicles (NAVs) are still underutilised in agriculture. NAVs are characterised by a maximum wing length of 15 centimetres and a weight of less than 50 g. Due to their physical characteristics, NAVs have the advantage of being able to approach and perform tasks with more precision than conventional UAVs, making them suitable for precision agriculture. This work aims to contribute to an open-source solution known as Nano Aerial Bee (NAB) to enable further research and development on the use of NAVs in an agricultural context. The purpose of NAB is to mimic and assist bees in the context of pollination. We designed this open-source solution by taking into account the existing state-of-the-art solutions and the requirements of pollination activities. This paper presents the relevant background and work carried out in this area by analysing papers on the topic of NAVs. The development of this prototype is rather complex given the interactions between the different hardware components and the need to achieve autonomous flight capable of pollination. We adequately describe and discuss these challenges in this work. Besides the open-source NAB solution, we train three different versions of YOLO (YOLOv5, YOLOv7, and YOLOR) on an original dataset (Flower Detection Dataset) containing 206 images of a group of eight flowers and a public dataset (TensorFlow Flower Dataset), which must be annotated (TensorFlow Flower Detection Dataset). The results of the models trained on the Flower Detection Dataset are shown to be satisfactory, with YOLOv7 and YOLOR achieving the best performance, with 98% precision, 99% recall, and 98% F1 score. The performance of these models is evaluated using the TensorFlow Flower Detection Dataset to test their robustness. The three YOLO models are also trained on the TensorFlow Flower Detection Dataset to better understand the results. In this case, YOLOR is shown to obtain the most promising results, with 84% precision, 80% recall, and 82% F1 score. The results obtained using the Flower Detection Dataset are used for NAB guidance, detecting the flower's relative position in an image, which defines the NAB execution command.
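The precision, recall, and F1 figures reported above follow the standard detection definitions, which can be sketched from raw counts. The counts in the usage example are illustrative, not the paper's actual confusion values:

```python
def detection_metrics(tp, fp, fn):
    """Precision, recall and F1 score from true-positive, false-positive
    and false-negative detection counts."""
    precision = tp / (tp + fp)
    recall = tp / (tp + fn)
    f1 = 2 * precision * recall / (precision + recall)
    return precision, recall, f1
```

For example, 98 true positives with 2 false positives and 1 false negative yields precision 0.98 and recall about 0.99, with F1 (their harmonic mean) around 0.98, matching the rounded figures a detector like YOLOv7 might report.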

2022

Localization and Mapping on Agriculture Based on Point-Feature Extraction and Semiplanes Segmentation From 3D LiDAR Data

Authors
Aguiar, AS; dos Santos, FN; Sobreira, H; Boaventura Cunha, J; Sousa, AJ;

Publication
FRONTIERS IN ROBOTICS AND AI

Abstract
Developing ground robots for agriculture is a demanding task. Robots should be capable of performing tasks like spraying, harvesting, or monitoring. However, the absence of structure in the agricultural scenes challenges the implementation of localization and mapping algorithms. Thus, the research and development of localization techniques are essential to boost agricultural robotics. To address this issue, we propose an algorithm called VineSLAM suitable for localization and mapping in agriculture. This approach uses both point- and semiplane-features extracted from 3D LiDAR data to map the environment and localize the robot using a novel Particle Filter that considers both feature modalities. The numeric stability of the algorithm was tested using simulated data. The proposed methodology proved to be suitable to localize a robot using only three orthogonal semiplanes. Moreover, the entire VineSLAM pipeline was compared against a state-of-the-art approach considering three real-world experiments in a woody-crop vineyard. Results show that our approach can localize the robot with precision even in long and symmetric vineyard corridors outperforming the state-of-the-art algorithm in this context.
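The idea of weighting particles by both feature modalities can be sketched as a single measurement update. This is a heavily simplified toy (one point feature, one vertical semiplane modelled as the line x = plane_x, Gaussian error models, and all noise parameters assumed), not VineSLAM's actual formulation:

```python
import numpy as np

def pf_update(particles, weights, obs_pt, map_pt, plane_x, obs_plane_dist,
              sigma_pt=0.2, sigma_pl=0.1):
    """One particle-filter measurement update that fuses a 2D point
    feature (observed in the robot frame, matched to map_pt) and a
    semiplane feature (measured distance to the plane x = plane_x)."""
    new_w = np.array(weights, dtype=float)
    for i, (x, y, th) in enumerate(particles):
        c, s = np.cos(th), np.sin(th)
        # project the observed point into the map frame using this pose
        gx = x + c * obs_pt[0] - s * obs_pt[1]
        gy = y + s * obs_pt[0] + c * obs_pt[1]
        err_pt = np.hypot(gx - map_pt[0], gy - map_pt[1])
        # compare the measured plane distance with this pose's prediction
        err_pl = abs((plane_x - x) - obs_plane_dist)
        new_w[i] *= (np.exp(-0.5 * (err_pt / sigma_pt) ** 2) *
                     np.exp(-0.5 * (err_pl / sigma_pl) ** 2))
    return new_w / new_w.sum()
```

A particle whose pose explains both the point observation and the plane distance keeps its weight, while poses consistent with only one modality are suppressed; this is what lets three orthogonal semiplanes pin down the pose.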

2022

FollowMe - A Pedestrian Following Algorithm for Agricultural Logistic Robots

Authors
Sarmento, J; Dos Santos, FN; Aguiar, AS; Sobreira, H; Regueiro, CV; Valente, A;

Publication
2022 IEEE INTERNATIONAL CONFERENCE ON AUTONOMOUS ROBOT SYSTEMS AND COMPETITIONS (ICARSC)

Abstract
In Industry 4.0 and Agriculture 4.0, there are logistics areas where robots can play an important role, for example by following a person at a certain distance. These robots can transport heavy tools or simply help collect certain items, such as harvested fruits. The use of Ultra Wide Band (UWB) transceivers as range sensors is becoming very common in the field of robotics, e.g., for localising goods and machines. Since UWB technology has very accurate time resolution, it is advantageous for techniques such as Time Of Arrival (TOA), which can estimate distance by measuring the time between message frames. In this work, UWB transceivers are used as range sensors to track pedestrians/operators, and we propose two algorithms for relative localization between a person and the robot. Both algorithms use a similar 2-dimensional occupancy grid, but differ in filtering. The first is based on an Extended Kalman Filter (EKF) that fuses the range sensor with odometry. The second is based on a Histogram Filter that calculates the pedestrian position by discretizing the state space into well-defined regions. Finally, a controller is implemented to autonomously command the robot. Both approaches are tested and compared on a real differential drive robot. Both proposed solutions are able to follow a pedestrian at speeds of 0.1 m/s, and are promising solutions to complement other solutions based on cameras and LiDAR.
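The final controller stage can be sketched as a simple proportional follower fed by the estimated range and bearing to the pedestrian. The gains, target distance, and speed limits below are illustrative assumptions (only the 0.1 m/s following speed comes from the abstract):

```python
def follow_controller(dist, bearing, target_dist=1.5, k_v=0.5, k_w=1.0,
                      v_max=0.1, w_max=0.5):
    """Proportional controller for a differential drive robot: linear
    speed from the range error (keeping target_dist to the pedestrian),
    angular speed from the bearing error, both saturated."""
    v = max(-v_max, min(v_max, k_v * (dist - target_dist)))
    w = max(-w_max, min(w_max, k_w * bearing))
    return v, w
```

When the person is far ahead, the linear command saturates at `v_max` (0.1 m/s here); when they are at the target distance but off-axis, only the angular command acts, turning the robot toward them.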

2022

Path Planning with Hybrid Maps for processing and memory usage optimisation

Authors
Santos, LC; Santos, FN; Aguiar, AS; Valente, A; Costa, P;

Publication
2022 IEEE INTERNATIONAL CONFERENCE ON AUTONOMOUS ROBOT SYSTEMS AND COMPETITIONS (ICARSC)

Abstract
Robotics will play an essential role in agriculture. Deploying agricultural robots on the farm is still a challenging task due to the terrain's irregularity and size. Optimal path planning solutions may fail in larger terrains due to memory requirements as the search space increases. This work presents a novel open-source solution called AgRob Topologic Path Planner, which is capable of performing path planning operations using a hybrid map with topological and metric representations. A local A* algorithm pre-plans local paths in local metric maps, saving them into the topological structure. Then, a graph-based A* performs a global search in the topological map, using the saved local paths to provide the full trajectory. Our results demonstrate that this solution can handle large maps (5 hectares) using just 0.002% of the search space required by a previous solution.
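The global stage of this hybrid scheme can be sketched as A* over a topological graph whose edge costs are the lengths of the pre-planned local metric paths. This is a generic A* sketch under those assumptions, not the AgRob planner itself; the graph, node coordinates, and straight-line heuristic are all illustrative:

```python
import heapq

def topo_astar(graph, coords, start, goal):
    """A* over a topological graph. graph maps node -> [(neighbour,
    cost)], where cost is the length of a pre-planned local metric path;
    coords gives node positions for the straight-line heuristic."""
    def h(n):
        (x1, y1), (x2, y2) = coords[n], coords[goal]
        return ((x1 - x2) ** 2 + (y1 - y2) ** 2) ** 0.5
    frontier = [(h(start), 0.0, start, [start])]
    best = {start: 0.0}
    while frontier:
        f, g, node, path = heapq.heappop(frontier)
        if node == goal:
            return path, g                       # sequence of topological nodes
        for nxt, cost in graph.get(node, []):
            ng = g + cost
            if ng < best.get(nxt, float("inf")):
                best[nxt] = ng
                heapq.heappush(frontier, (ng + h(nxt), ng, nxt, path + [nxt]))
    return None, float("inf")
```

Because the search runs over the small topological graph rather than the full metric grid, the explored state space is a tiny fraction of a grid-based planner's, which is the memory saving the abstract reports.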