
Publications by Joaquim João Sousa

2020

Automatic Grapevine Trunk Detection on UAV-Based Point Cloud

Authors
Jurado, JM; Padua, L; Feito, FR; Sousa, JJ;

Publication
REMOTE SENSING

Abstract
The optimisation of vineyard management requires efficient and automated methods able to identify individual plants. In the last few years, Unmanned Aerial Vehicles (UAVs) have become one of the main sources of remote sensing information for Precision Viticulture (PV) applications. In fact, high-resolution UAV-based imagery offers a unique capability for modelling plant structure, making possible the recognition of significant geometrical features in photogrammetric point clouds. Despite the proliferation of innovative technologies in viticulture, the identification of individual grapevines still relies on image-based segmentation techniques. In that way, grapevine and non-grapevine features are separated and individual plants are estimated, usually by assuming a fixed distance between them. In this study, an automatic method for grapevine trunk detection, using 3D point cloud data, is presented. The proposed method focuses on the recognition of key geometrical parameters to ensure that every plant is present in the 3D model. The method was tested in different commercial vineyards and, to push it to its limit, also in a vineyard characterised by several missing plants along the vine rows, irregular distances between plants, and trunks occluded by dense vegetation in some areas. The proposed method represents a disruption in relation to the state of the art, being able to identify individual trunks, posts and missing plants based on the interpretation and analysis of a 3D point cloud. Moreover, a validation process was carried out, showing that the method performs very well, especially when applied to 3D point clouds generated in phases in which the leaves are not yet very dense (January to May). However, if correct flight parametrizations are set, the method remains effective throughout the entire vegetative cycle.
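The abstract describes a geometry-driven pipeline; the paper's actual algorithm is not reproduced here, but the following minimal Python sketch illustrates one common strategy for flagging trunk-like objects in a photogrammetric point cloud: slice a thin horizontal band just above the ground and cluster the points in the XY plane, keeping compact clusters as trunk or post candidates. The function name, thresholds and the assumption of a flat, known ground level are all hypothetical.

```python
# Illustrative sketch (not the authors' method): detect trunk-like candidates
# by slicing a point cloud just above ground level and clustering in XY.
# Thresholds (slice height, cluster radius) are hypothetical example values.
import numpy as np
from sklearn.cluster import DBSCAN

def trunk_candidates(points, ground_z=0.0, slice_min=0.1, slice_max=0.5,
                     eps=0.05, min_points=20, max_radius=0.15):
    """points: (N, 3) array of x, y, z coordinates in metres."""
    z = points[:, 2] - ground_z
    band = points[(z > slice_min) & (z < slice_max)]         # thin horizontal slice
    if len(band) == 0:
        return []
    labels = DBSCAN(eps=eps, min_samples=min_points).fit_predict(band[:, :2])
    candidates = []
    for lab in set(labels) - {-1}:                            # -1 = DBSCAN noise
        cluster = band[labels == lab, :2]
        centre = cluster.mean(axis=0)
        radius = np.linalg.norm(cluster - centre, axis=1).max()
        if radius <= max_radius:                              # compact footprint ~ trunk or post
            candidates.append((centre[0], centre[1], radius))
    return candidates
```

In a real vineyard scene the ground is not flat, so the slice would be taken relative to a digital terrain model rather than a constant ground height.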

2020

VisWebDrone: A Web Application for UAV Photogrammetry Based on Open-Source Software

Authors
Guimaraes, N; Padua, L; Adao, T; Hruska, J; Peres, E; Sousa, JJ;

Publication
ISPRS INTERNATIONAL JOURNAL OF GEO-INFORMATION

Abstract
Currently, the use of free and open-source software is increasing. The flexibility, availability, and maturity of this software can be a key driver for developing useful and interesting solutions. In general, open-source solutions solve specific tasks and can replace commercial solutions, which are often very expensive. This is even more noticeable in areas requiring the analysis and manipulation/visualization of large volumes of data. Considering that there is a major gap in the development of web applications for photogrammetric processing based on open-source technologies that offer quality results, the application presented in this article is intended to explore this niche. Thus, a solution for photogrammetric processing is presented, based on the integration of the MicMac, GeoServer, Leaflet, and Potree software. The implemented architecture, focusing on open-source software for data processing and for graphical manipulation, visualization, measuring, and analysis, is presented in detail. To assess the results produced by the proposed web application, a case study is presented, using imagery acquired by an unmanned aerial vehicle over two different areas.
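The stack named in the abstract (MicMac for processing, GeoServer for publishing, Leaflet and Potree for visualisation) can be exercised from any WMS client. As a minimal illustrative sketch, and not part of the VisWebDrone code, the Python snippet below retrieves an orthomosaic published on a GeoServer instance through a standard WMS GetMap request; the endpoint URL, workspace and layer names, and bounding box are hypothetical.

```python
# Illustrative sketch (hypothetical endpoint and layer names): fetch an
# orthomosaic tile published by GeoServer via a standard WMS GetMap request.
import requests

GEOSERVER_WMS = "http://localhost:8080/geoserver/wms"    # hypothetical local GeoServer

params = {
    "service": "WMS",
    "version": "1.1.1",
    "request": "GetMap",
    "layers": "viswebdrone:orthomosaic",                  # hypothetical workspace:layer
    "bbox": "-7.75,41.28,-7.74,41.29",                    # lon/lat extent (example values)
    "srs": "EPSG:4326",
    "width": 512,
    "height": 512,
    "format": "image/png",
}

response = requests.get(GEOSERVER_WMS, params=params, timeout=30)
response.raise_for_status()
with open("orthomosaic_tile.png", "wb") as f:
    f.write(response.content)
```

In the described architecture, the same layer would be consumed directly by Leaflet in the browser rather than downloaded as a file.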

2020

MYSENSE-WEBGIS: A GRAPHICAL MAP LAYERING-BASED DECISION SUPPORT TOOL FOR AGRICULTURE

Authors
Adao, T; Soares, A; Padua, L; Guimaraes, N; Pinho, T; Sousa, JJ; Morais, R; Peres, E;

Publication
IGARSS 2020 - 2020 IEEE INTERNATIONAL GEOSCIENCE AND REMOTE SENSING SYMPOSIUM

Abstract
Developed with a focus on agricultural sustainability, mySense is a comprehensive close-range, sensor-based data management environment designed to improve precision farming practices. It integrates discussion platforms for quick problem solving through expert support and a computational intelligence layer for multipurpose applications (e.g. vine variety discrimination, plant disease detection and identification). Addressing the need to keep track of agricultural crops not only through close-range sensing but also from a macro perspective, mySense was complemented with functionalities that unlock macro-monitoring features, through the implementation of a Web-based Geographical Information System (WebGIS) planned as a sidekick application that provides agriculture professionals with visual decision support tools over remotely sensed data. This paper presents and discusses its specification and implementation.
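The abstract describes a map layering-based WebGIS. The snippet below is an illustrative sketch, unrelated to mySense's actual implementation, of how a vegetation-index raster could be overlaid on a web base map as a toggleable layer using the folium Python library; the file name, bounds and map centre are hypothetical.

```python
# Illustrative sketch (not the mySense implementation): overlay a vegetation-index
# raster on an interactive web map as a toggleable layer. Paths, bounds and the
# map centre are hypothetical example values.
import folium

m = folium.Map(location=[41.29, -7.74], zoom_start=15, tiles="OpenStreetMap")

# Georeferenced PNG exported from a remote sensing pipeline (hypothetical file).
folium.raster_layers.ImageOverlay(
    image="ndvi_map.png",
    bounds=[[41.28, -7.75], [41.30, -7.73]],   # [[south, west], [north, east]]
    opacity=0.6,
    name="NDVI",
).add_to(m)

folium.LayerControl().add_to(m)                # lets the user toggle the NDVI layer
m.save("webgis_demo.html")                     # open in a browser to inspect
```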

2020

ESTIMATION OF LEAF AREA INDEX IN CHESTNUT TREES USING MULTISPECTRAL DATA FROM AN UNMANNED AERIAL VEHICLE

Authors
Padua, L; Marques, P; Martins, L; Sousa, A; Peres, E; Sousa, JJ;

Publication
IGARSS 2020 - 2020 IEEE INTERNATIONAL GEOSCIENCE AND REMOTE SENSING SYMPOSIUM

Abstract
Individual tree segmentation is a challenging task due to the labour-intensive and time-consuming work required. Remote sensing data acquired from sensors coupled to unmanned aerial vehicles (UAVs) constitute a viable alternative, providing quicker data acquisition and covering broader areas in a shorter period of time. This study aims to use UAV-based multispectral imagery to automatically identify individual trees in a chestnut stand. Tree parameters were estimated, allowing their characterization. The leaf area index (LAI) was measured and correlated with the estimated parameters. A good correlation was found for NDVI (R² = 0.76), while this relationship was less evident for tree crown area and tree height. Thus, our results indicate that the use of UAV-based multispectral imagery is a quick and reliable way to determine canopy structural parameters and LAI of chestnut trees.
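For readers unfamiliar with the workflow, the short Python sketch below shows the kind of computation the abstract refers to: deriving NDVI from red and near-infrared reflectance and fitting a simple linear model against field-measured LAI. The numbers are synthetic placeholders, not data from the study.

```python
# Illustrative sketch (not the authors' processing chain): compute per-tree
# mean NDVI from red and near-infrared reflectance and fit a linear model
# against field-measured LAI. All values below are synthetic placeholders.
import numpy as np

def ndvi(nir, red):
    """Normalised Difference Vegetation Index for reflectance arrays."""
    return (nir - red) / (nir + red + 1e-9)   # small epsilon avoids division by zero

# Hypothetical per-tree mean NDVI and field-measured LAI values.
mean_ndvi = np.array([0.62, 0.71, 0.55, 0.80, 0.67])
lai       = np.array([2.1, 2.9, 1.8, 3.4, 2.5])

# Least-squares fit LAI ~ a * NDVI + b and the coefficient of determination.
a, b = np.polyfit(mean_ndvi, lai, 1)
pred = a * mean_ndvi + b
ss_res = np.sum((lai - pred) ** 2)
ss_tot = np.sum((lai - lai.mean()) ** 2)
r2 = 1.0 - ss_res / ss_tot
print(f"LAI ≈ {a:.2f} * NDVI + {b:.2f}  (R² = {r2:.2f})")
```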

2020

TARGET INFLUENCE ON GROUND CONTROL POINTS (GCPs) IDENTIFICATION IN AERIAL IMAGES

Authors
Hruska, J; Padua, L; Adao, T; Peres, E; Martinho, J; Sousa, JJ;

Publication
IGARSS 2020 - 2020 IEEE INTERNATIONAL GEOSCIENCE AND REMOTE SENSING SYMPOSIUM

Abstract
Unmanned aerial vehicles (UAVs) are nowadays used as a standard tool to derive very high-resolution geospatial data. However, UAV payload limitations impose the use of less reliable hardware, affecting georeferencing precision. In the literature, numerous studies can be found investigating the parameters that influence the quality of UAV-based products. Even if new photogrammetry methods could, in theory, avoid the use of ground control points (GCPs), they still play a key role in assuring quality products. Nevertheless, usually only the number and distribution of GCPs are taken into account, since both affect the geometric accuracy of the final products. In order to improve the understanding of the actual influence of GCPs, in this study we evaluate how different physical characteristics affect GCP identification in aerial images. The results demonstrate that the GCPs' color, material, size and shape, among other characteristics, may influence their precise identification in aerial imagery.

2020

VINEYARD CLASSIFICATION USING MACHINE LEARNING TECHNIQUES APPLIED TO RGB-UAV IMAGERY

Authors
Padua, L; Adao, T; Hruska, J; Guimaraes, N; Marques, P; Peres, E; Sousa, JJ;

Publication
IGARSS 2020 - 2020 IEEE INTERNATIONAL GEOSCIENCE AND REMOTE SENSING SYMPOSIUM

Abstract
In this study, machine learning methods were applied to RGB data obtained by an unmanned aerial vehicle (UAV) to assess their effectiveness in vineyard classification. The very high-resolution UAV-based imagery was subjected to photogrammetric processing, allowing the generation of different outcomes: orthophoto mosaic, crop surface model and five vegetation indices. The orthophoto mosaic was used in an object-based image analysis approach to group pixels with similar values into objects. Three machine learning techniques, namely support vector machine (SVM), random forest (RF) and artificial neural network (ANN), were applied to classify the data into four classes: grapevine, shadow, soil and other vegetation. The data were divided into 22% (n = 240, 60 per class) for training purposes and 78% (n = 850) for testing purposes. The mean values of the objects from each feature were used to create a dataset for prediction. The results demonstrated that both the RF and ANN models showed a good performance, yet the RF classifier achieved better results.
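The classification step described above maps naturally onto standard scikit-learn tooling. The sketch below is an illustrative approximation, not the study's code: synthetic per-object mean features are split roughly 22%/78% for training and testing and classified into four classes with a random forest.

```python
# Illustrative sketch (not the study's exact pipeline): classify image objects
# described by their mean feature values (RGB bands, CSM height, vegetation
# indices) into four classes with a random forest. All data here are synthetic.
import numpy as np
from sklearn.ensemble import RandomForestClassifier
from sklearn.model_selection import train_test_split
from sklearn.metrics import accuracy_score

rng = np.random.default_rng(0)
n_objects, n_features = 1090, 9          # e.g. 3 RGB bands + CSM height + 5 indices
X = rng.random((n_objects, n_features))  # mean feature values per object (synthetic)
y = rng.integers(0, 4, n_objects)        # 0 grapevine, 1 shadow, 2 soil, 3 other vegetation

# Roughly 22% of the objects for training and the remainder for testing,
# mirroring the split described in the abstract.
X_train, X_test, y_train, y_test = train_test_split(
    X, y, train_size=0.22, stratify=y, random_state=0)

clf = RandomForestClassifier(n_estimators=200, random_state=0).fit(X_train, y_train)
print("overall accuracy:", accuracy_score(y_test, clf.predict(X_test)))
```

With real object features, the same script would also report per-class accuracy, since grapevine versus shadow confusion is the error of practical interest.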
