2019
Authors
Mendes, JM; dos Santos, FN; Ferraz, NA; do Couto, PM; dos Santos, RM;
Publication
JOURNAL OF INTELLIGENT & ROBOTIC SYSTEMS
Abstract
Deploying ground robots in steep slope vineyards is a complex challenge. The Global Positioning System (GPS) signal is not always available or accurate, so a reliable localization approach that detects natural features of this environment is required. This paper presents an improved version of a visual detector for Vineyards Trunks and Masts (ViTruDe) and a robot able to perform pruning actions in steep slope vineyards (AgRob V16). In addition, it presents an augmented dataset for benchmarking other localization and mapping algorithms. In our experiments, ViTruDe accuracy is higher than 95%. Under a simulated runtime test, the accuracy lies between 27% and 96%, depending on the ViTruDe parametrization. This approach can feed a localization system when the GPS signal is absent. The ViTruDe detector also considers economic constraints and enables the development of cost-effective robots. The augmented training and test datasets are publicly available for future research work.
2019
Authors
Aguiar, A; Sousa, A; dos Santos, FN; Oliveira, M;
Publication
2019 19TH IEEE INTERNATIONAL CONFERENCE ON AUTONOMOUS ROBOT SYSTEMS AND COMPETITIONS (ICARSC 2019)
Abstract
Developing ground robots for crop monitoring and harvesting in steep slope vineyards is a complex challenge due to two main reasons: the harsh terrain conditions and the unstable localization accuracy obtained with the Global Navigation Satellite System (GNSS). In this context, a reliable localization system requires accurate information that is redundant with respect to GNSS and wheel-odometry-based systems. To pursue this goal, we benchmark three well-known Visual Odometry methods on two datasets. Two of them are feature-based Visual Odometry algorithms, Libviso2 and SVO 2.0; the third is an appearance-based Visual Odometry algorithm, DSO. In monocular Visual Odometry, two main problems appear: pure rotations and scale estimation. In this paper, we focus on the first issue. To do so, we propose a Kalman Filter that fuses a single gyroscope with the output pose of monocular Visual Odometry, while continuously estimating the gyroscope bias. In this approach we propose a non-linear noise variation that ensures that the bias estimation is not affected by the rotations produced by Visual Odometry. We compare and discuss the three original methods and the three methods extended with the proposed Kalman Filter. For the tests, two public datasets are used: the KITTI dataset and another built in-house. Results show that the additional Kalman Filter significantly improves Visual Odometry performance during rotation movements.
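The filter described above fuses gyroscope integration with the yaw reported by monocular Visual Odometry while tracking the gyroscope bias. The sketch below is a minimal, hypothetical illustration of that idea in Python, not the authors' implementation: the state holds yaw and bias, and the measurement noise is inflated non-linearly with the magnitude of the VO rotation so that large VO rotations do not pull the bias estimate (the inflation factor and all noise values are assumed for illustration only).

```python
# Illustrative sketch (not the paper's implementation): a 1-D Kalman Filter that
# fuses gyroscope yaw-rate integration with the yaw from monocular Visual
# Odometry, while estimating the gyroscope bias as a second state.
import numpy as np

class GyroVoYawFilter:
    def __init__(self, q_yaw=1e-4, q_bias=1e-6, r_vo=1e-2):
        self.x = np.zeros(2)               # state: [yaw, gyro_bias]
        self.P = np.eye(2) * 1e-3          # state covariance
        self.Q = np.diag([q_yaw, q_bias])  # process noise
        self.r_vo = r_vo                   # nominal VO measurement noise

    def predict(self, gyro_rate, dt):
        # yaw <- yaw + (rate - bias) * dt ; bias modelled as a random walk
        F = np.array([[1.0, -dt],
                      [0.0, 1.0]])
        self.x = np.array([self.x[0] + (gyro_rate - self.x[1]) * dt, self.x[1]])
        self.P = F @ self.P @ F.T + self.Q * dt

    def update(self, vo_yaw, vo_rotation_magnitude):
        # Inflate the measurement noise when VO reports a large rotation, so the
        # bias estimate is not corrupted by rotation-induced VO errors
        # (the 50.0 factor is a hypothetical choice).
        R = self.r_vo * (1.0 + 50.0 * vo_rotation_magnitude ** 2)
        H = np.array([[1.0, 0.0]])
        y = vo_yaw - H @ self.x                  # innovation
        S = H @ self.P @ H.T + R                 # innovation covariance
        K = (self.P @ H.T) / S                   # Kalman gain (2x1)
        self.x = self.x + (K * y).ravel()
        self.P = (np.eye(2) - K @ H) @ self.P
        return self.x[0]                         # fused yaw estimate
```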
2019
Authors
Azevedo, F; Shinde, P; Santos, L; Mendes, J; Santos, FN; Mendonca, H;
Publication
2019 19TH IEEE INTERNATIONAL CONFERENCE ON AUTONOMOUS ROBOT SYSTEMS AND COMPETITIONS (ICARSC 2019)
Abstract
Developing ground robots for crop monitoring and harvesting in steep slope vineyards is a complex challenge due to two main reasons: the harsh terrain conditions and the unstable localization accuracy obtained with the Global Navigation Satellite System (GNSS). In this context, a reliable localization system requires an accurate detector for a high density of natural/artificial features. In previous works, we presented a novel visual detector for Vineyards Trunks and Masts (ViTruDe) with high levels of detection accuracy. However, its implementation on the most common processing units, central processing units (CPU), using a standard programming language (C/C++), is unable to reach the processing efficiency required for real-time operation. In this work, we explored the parallelization capabilities of processing units such as graphics processing units (GPU) in order to accelerate the processing time of ViTruDe. This work gives a general perspective on how to parallelize a generic problem in a GPU-based solution, while exploring its efficiency when applied to the problem at hand. The ViTruDe detector for GPU was developed considering the constraints of a cost-effective robot that carries out crop monitoring tasks in steep slope vineyard environments. We compared the proposed GPU implementation of ViTruDe, using the Compute Unified Device Architecture (CUDA), against the CPU implementation, and the achieved solution is over eighty times faster than its CPU counterpart. The training and test data are made public for future research work. This approach is a contribution towards an accurate and reliable GNSS-free localization system.
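The parallelization pattern the abstract refers to is the classic one-thread-per-pixel mapping of image work onto the GPU. As an illustration only, the sketch below shows that pattern in Python with Numba's CUDA backend, using a generic grayscale conversion as a stand-in for the ViTruDe feature computation; the authors' actual detector was implemented in C/C++ and CUDA.

```python
# Generic per-pixel GPU kernel (illustration only, not the ViTruDe detector):
# each CUDA thread processes exactly one pixel, which is what makes per-pixel
# image operations a good fit for GPU acceleration.
import math
import numpy as np
from numba import cuda

@cuda.jit
def grayscale_kernel(rgb, gray):
    x, y = cuda.grid(2)                        # absolute thread coordinates
    if x < gray.shape[0] and y < gray.shape[1]:
        gray[x, y] = (0.299 * rgb[x, y, 0]
                      + 0.587 * rgb[x, y, 1]
                      + 0.114 * rgb[x, y, 2])

def to_grayscale_gpu(rgb):
    gray = np.zeros(rgb.shape[:2], dtype=np.float32)
    d_rgb, d_gray = cuda.to_device(rgb), cuda.to_device(gray)
    threads = (16, 16)                         # threads per block
    blocks = (math.ceil(gray.shape[0] / threads[0]),
              math.ceil(gray.shape[1] / threads[1]))
    grayscale_kernel[blocks, threads](d_rgb, d_gray)
    return d_gray.copy_to_host()
```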
2019
Authors
Santos, L; Santos, FN; Magalhaes, S; Costa, P; Reis, R;
Publication
2019 19TH IEEE INTERNATIONAL CONFERENCE ON AUTONOMOUS ROBOT SYSTEMS AND COMPETITIONS (ICARSC 2019)
Abstract
Robotic platforms are being developed for precision agriculture to execute repetitive and long-term tasks. Autonomous monitoring, pruning, spraying and harvesting are some of these agricultural tasks, and they require an advanced path planning system that is aware of the maximum robot capabilities (mobile platform and arms), terrain slopes and plant/fruit positions. State-of-the-art path planning systems have two limitations: they are not optimized for large regions, and the planning is not aware of the agricultural task requirements. This work presents two solutions to overcome these limitations. It uses the VGR2TO (Vineyard Grid Map to Topological) approach to extract a topological map from a 2D grid map, reducing both the memory needed by the path planning algorithm and the path search space. In addition, it introduces an extension to the chosen algorithm, A*, to ensure a safe path and a maximum distance from the vine trees, enabling robotic operations on the trees and their fruits.
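The A* extension mentioned above constrains the planned path by safety and by reachability of the vine trees. The sketch below shows one hypothetical way to layer such constraints on top of plain grid A* in Python; VGR2TO, the distance map and the SAFE_DIST/MAX_DIST thresholds are assumptions for illustration, not the paper's implementation.

```python
# Grid A* with a per-cell distance-to-vine constraint (illustrative sketch):
# cells closer than SAFE_DIST (collision risk) or farther than MAX_DIST
# (plant out of the arm's reach) are excluded from the search.
import heapq

SAFE_DIST, MAX_DIST = 0.4, 1.5   # metres, hypothetical values

def a_star(grid, dist_to_vine, start, goal):
    """grid: 2D list, 0 = free, 1 = obstacle; dist_to_vine: same shape, metres."""
    def ok(c):
        r, q = c
        return (0 <= r < len(grid) and 0 <= q < len(grid[0])
                and grid[r][q] == 0
                and SAFE_DIST <= dist_to_vine[r][q] <= MAX_DIST)

    def h(c):  # Manhattan heuristic, admissible for 4-connected moves
        return abs(c[0] - goal[0]) + abs(c[1] - goal[1])

    open_set = [(h(start), 0, start, None)]   # (f, g, cell, parent)
    came_from, g_cost = {}, {start: 0}
    while open_set:
        _, g, cur, parent = heapq.heappop(open_set)
        if cur in came_from:
            continue
        came_from[cur] = parent
        if cur == goal:                        # rebuild the path backwards
            path = []
            while cur is not None:
                path.append(cur)
                cur = came_from[cur]
            return path[::-1]
        for dr, dq in ((1, 0), (-1, 0), (0, 1), (0, -1)):
            nxt = (cur[0] + dr, cur[1] + dq)
            if ok(nxt) and g + 1 < g_cost.get(nxt, float("inf")):
                g_cost[nxt] = g + 1
                heapq.heappush(open_set, (g + 1 + h(nxt), g + 1, nxt, cur))
    return None                                # goal unreachable under constraints
```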
2019
Authors
Martins, RC; Magalhães, S; Jorge, P; Barroso, T; Santos, F;
Publication
Lecture Notes in Computer Science (including subseries Lecture Notes in Artificial Intelligence and Lecture Notes in Bioinformatics)
Abstract
Metabolomics is paramount for precision agriculture. Knowing the metabolic state of the vine and its implications for grape quality is of utmost importance for the viticulture and wine industry. The MetBots system is a metabolomics precision agriculture platform for automated monitoring of vineyards, providing geo-referenced metabolic images that are correlated and interpreted by an artificial intelligence self-learning system to aid precise viticultural practices. Results can further be used to analyze the plant metabolic response through genome-scale models. In this research, we introduce the system's main components: (i) robotic platform; (ii) autonomous navigation; (iii) sampling arm manipulation; (iv) spectroscopy systems; and (v) non-invasive, real-time metabolic hyperspectral imaging monitoring of vineyards. The full potential of the MetBots system is revealed when metabolic data and images are analyzed by big data AI and systems biology vine plant models, establishing a new age of molecular biology precision agriculture. © Springer Nature Switzerland AG 2019.
2019
Authors
Mendes, JM; Filipe, VM; dos Santos, FN; dos Santos, RM;
Publication
PROGRESS IN ARTIFICIAL INTELLIGENCE, EPIA 2019, PT I
Abstract
In order to determine the physiological state of a plant, it is necessary to monitor it throughout the developmental period. One of the main parameters to monitor is the Leaf Area Index (LAI). The objective of this work was the development of a non-destructive methodology for LAI estimation in wine growing. The method is based on stereo images, which allow a 3D representation of the bard (vine row) to be obtained in order to facilitate the segmentation process, since performing this process based only on color components is practically impossible due to the high complexity of the application environment. In addition, the Normalized Difference Vegetation Index (NDVI) is used to distinguish the regions of trunks and leaves. As a low-cost and non-invasive method, it is a promising solution for LAI estimation, making it possible to monitor productivity changes and the impact of climatic conditions on vine growth. © Springer Nature Switzerland AG 2019.
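The trunk/leaf separation relies on the standard index NDVI = (NIR - Red) / (NIR + Red), which is high for photosynthetically active leaf tissue and low for woody trunks. A minimal sketch of that computation is given below; the 0.4 threshold is a hypothetical value and not taken from the paper.

```python
# Illustrative sketch (not the paper's pipeline): per-pixel NDVI and a simple
# threshold to separate leaf regions (high NDVI) from trunk regions (low NDVI).
import numpy as np

def ndvi(nir, red, eps=1e-6):
    nir = nir.astype(np.float32)
    red = red.astype(np.float32)
    return (nir - red) / (nir + red + eps)   # eps avoids division by zero

def leaf_mask(nir, red, threshold=0.4):      # threshold is a hypothetical value
    return ndvi(nir, red) > threshold        # True where vegetation (leaves)
```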