2022
Authors
Oliveira, M; Pedrosa, E; de Aguiar, AP; Rato, DFPD; dos Santos, FN; Dias, P; Santos, V;
Publication
EXPERT SYSTEMS WITH APPLICATIONS
Abstract
The fusion of data from different sensors often requires that an accurate geometric transformation between the sensors is known. The procedure by which these transformations are estimated is known as sensor calibration. The vast majority of calibration approaches focus on specific pairwise combinations of sensor modalities and are unsuitable for calibrating robotic systems containing multiple sensors of varied modalities. This paper presents a novel calibration methodology applicable to multi-sensor, multi-modal robotic systems. The approach formulates the calibration as an extended optimization problem in which the poses of the calibration patterns are also estimated. It makes use of a topological representation of the coordinate frames in the system in order to recalculate the poses of the sensors throughout the optimization. Sensor poses are retrieved from the combination of geometric transformations which are atomic, in the sense that they are indivisible. As such, we refer to this approach as ATOM - Atomic Transformations Optimization Method. This makes the approach applicable to different calibration problems, such as sensor to sensor, sensor in motion, or sensor to coordinate frame. Additionally, the proposed approach provides advanced functionalities, integrated into ROS, designed to support the several stages of a complete calibration procedure. Results covering several robotic platforms and a large spectrum of calibration problems show that the methodology is indeed general and achieves calibrations as accurate as those provided by state-of-the-art methods designed to operate only for specific pairwise combinations of modalities.
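A minimal sketch of the idea described in the abstract, not the authors' ATOM implementation: calibration is posed as a joint least-squares problem in which a sensor pose and the calibration-pattern pose are both free parameters, with one reference frame held fixed. All frame names, poses, and noise levels are illustrative assumptions.

```python
# Illustrative sketch only: joint optimization of a sensor pose and a pattern pose,
# in the spirit of estimating pattern poses alongside the sensor transforms.
import numpy as np
from scipy.optimize import least_squares
from scipy.spatial.transform import Rotation as R

rng = np.random.default_rng(0)

def apply_pose(pose, pts):
    """Apply a 6-DoF pose (tx, ty, tz, rotation vector) to Nx3 points."""
    t, rvec = pose[:3], pose[3:]
    return R.from_rotvec(rvec).apply(pts) + t

def invert_pose(pose, pts):
    """Map points from the parent frame into the frame described by `pose`."""
    t, rvec = pose[:3], pose[3:]
    return R.from_rotvec(rvec).inv().apply(pts - t)

# Calibration pattern corners expressed in the pattern frame.
pattern = np.array([[x, y, 0.0] for x in np.linspace(0, 0.4, 5)
                                 for y in np.linspace(0, 0.3, 4)])

true_pattern_pose = np.array([0.6, 0.1, 1.2, 0.0, 0.2, 0.1])    # pattern -> base
true_sensor_pose  = np.array([0.1, -0.3, 0.4, 0.05, 0.0, 0.3])  # sensor B -> base

# Simulated detections: sensor A is the fixed reference frame, sensor B is unknown.
pts_base = apply_pose(true_pattern_pose, pattern)
obs_A = pts_base + rng.normal(scale=1e-3, size=pts_base.shape)
obs_B = invert_pose(true_sensor_pose, pts_base) + rng.normal(scale=1e-3, size=pts_base.shape)

def residuals(x):
    sensor_b, pattern_pose = x[:6], x[6:]        # both estimated jointly
    pred = apply_pose(pattern_pose, pattern)     # pattern corners in the base frame
    return np.concatenate([(pred - obs_A).ravel(),                         # reference sensor
                           (pred - apply_pose(sensor_b, obs_B)).ravel()])  # sensor B

sol = least_squares(residuals, x0=np.zeros(12))
print("estimated sensor B -> base pose:", np.round(sol.x[:6], 3))
```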
2022
Authors
Reis Pereira, M; Tosin, R; Martins, R; dos Santos, FN; Tavares, F; Cunha, M;
Publication
PLANTS-BASEL
Abstract
Pseudomonas syringae pv. actinidiae (Psa) has been responsible for numerous epidemics of bacterial canker of kiwi (BCK), resulting in high losses in kiwi production worldwide. Current diagnostic approaches for this disease usually depend on visible signs of the infection (disease symptoms) being present. Since these symptoms frequently manifest in the middle to late stages of the infection process, the effectiveness of phytosanitary measures can be compromised. Hyperspectral spectroscopy has the potential to be an effective, non-invasive, rapid, cost-effective, high-throughput approach for improving BCK diagnostics. This study aimed to investigate the potential of hyperspectral UV-VIS reflectance for in situ, non-destructive discrimination of bacterial canker on kiwi leaves. Spectral reflectance (325-1075 nm) of twenty plants was obtained with a handheld spectroradiometer in two commercial kiwi orchards located in Portugal over 15 weeks, totaling 504 spectral measurements. Several modeling approaches based on continuous hyperspectral data or on specific wavelengths chosen by different feature selection algorithms were tested to discriminate BCK on leaves. Spectral separability of asymptomatic and symptomatic leaves was observed in all multivariate and machine learning models, including the FDA, GLM, PLS, and SVM methods. The combination of a stepwise forward variable selection approach with a support vector machine algorithm using a radial kernel and class weights was selected as the final model. Its overall accuracy was 85%, with a 0.70 kappa score and 0.84 F-measure. These results were coherent with leaves classified as asymptomatic or symptomatic by visual inspection. Overall, the findings reported herein support the implementation of spectral point measurements acquired in situ for crop disease diagnosis.
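The sketch below illustrates the model family selected in the abstract (an RBF-kernel SVM with class weights classifying leaf spectra); it is not the authors' pipeline, and the spectra, band indices, and injected class shift are synthetic stand-ins.

```python
# Illustrative sketch: asymptomatic vs. symptomatic leaf discrimination with an
# RBF-kernel SVM and balanced class weights on (synthetic) reflectance spectra.
import numpy as np
from sklearn.model_selection import train_test_split
from sklearn.pipeline import make_pipeline
from sklearn.preprocessing import StandardScaler
from sklearn.svm import SVC
from sklearn.metrics import accuracy_score, cohen_kappa_score, f1_score

rng = np.random.default_rng(1)
n_samples, n_bands = 504, 751             # e.g. 325-1075 nm at 1 nm steps
X = rng.normal(size=(n_samples, n_bands))
y = rng.integers(0, 2, size=n_samples)    # 0 = asymptomatic, 1 = symptomatic
X[y == 1, 300:350] += 0.8                 # fake reflectance shift for the infected class

X_tr, X_te, y_tr, y_te = train_test_split(X, y, test_size=0.3, stratify=y, random_state=0)

model = make_pipeline(StandardScaler(),
                      SVC(kernel="rbf", class_weight="balanced", C=1.0, gamma="scale"))
model.fit(X_tr, y_tr)
pred = model.predict(X_te)

print(f"accuracy: {accuracy_score(y_te, pred):.2f}")
print(f"kappa:    {cohen_kappa_score(y_te, pred):.2f}")
print(f"F1:       {f1_score(y_te, pred):.2f}")
```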
2022
Authors
Barroso, TG; Ribeiro, L; Gregorio, H; Monteiro Silva, F; dos Santos, FN; Martins, RC;
Publication
CHEMOSENSORS
Abstract
Total white blood cell count is an important diagnostic parameter in both human and veterinary medicine. The state-of-the-art approach is flow cytometry combined with light scattering or impedance measurements. Point-of-care spectroscopy has the advantages of miniaturization, low sample requirements, and real-time hemogram analysis. However, white blood cells are present in low proportions, while red blood cells and bilirubin dominate the spectral information, complicating their detection in blood. We performed a feasibility study for the direct detection of white blood cell counts in canine blood by visible-near infrared spectroscopy for veterinary applications, benchmarking current chemometrics techniques (similarity, global and local partial least squares, artificial neural networks, and least-squares support vector machines) against self-learning artificial intelligence, introducing data augmentation to overcome the hurdle of knowledge representativity. White blood cell count information is present in the recorded spectra, allowing significant discrimination and equivalence between hemogram and spectral principal component scores. Chemometrics methods correlate white blood cell counts with spectral features, but with lower accuracy. Self-Learning Artificial Intelligence has the highest correlation (0.8478) and a small standard error of 6.92 x 10^9 cells/L, corresponding to a mean absolute percentage error of 25.37%. This allows accurate assessment of white blood cell counts within the reference interval (5.6 to 17.8 x 10^9 cells/L) and above. This research is an important step toward a miniaturized spectral point-of-care hemogram analyzer.
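As a minimal sketch of one of the benchmarked chemometrics baselines (partial least squares regression of cell counts on spectra), the example below uses synthetic placeholder spectra and counts; it is not the study's code, and the number of latent components is an assumption.

```python
# Illustrative sketch: PLS regression of white blood cell counts from Vis-NIR spectra,
# reporting Pearson correlation and mean absolute percentage error.
import numpy as np
from sklearn.cross_decomposition import PLSRegression
from sklearn.model_selection import train_test_split
from sklearn.metrics import mean_absolute_percentage_error

rng = np.random.default_rng(2)
n_samples, n_bands = 100, 400
X = rng.normal(size=(n_samples, n_bands))      # stand-in Vis-NIR spectra
y = 5.6 + 12.2 * rng.random(n_samples)         # counts in 1e9 cells/L (reference range)
X[:, 50:60] += 0.05 * y[:, None]               # weak embedded count-related signal

X_tr, X_te, y_tr, y_te = train_test_split(X, y, test_size=0.3, random_state=0)
pls = PLSRegression(n_components=10).fit(X_tr, y_tr)
pred = pls.predict(X_te).ravel()

r = np.corrcoef(y_te, pred)[0, 1]
print(f"Pearson correlation: {r:.3f}")
print(f"MAPE: {100 * mean_absolute_percentage_error(y_te, pred):.1f}%")
```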
2022
Authors
da Silva, DQ; dos Santos, FN; Filipe, V; Sousa, AJ; Oliveira, PM;
Publication
ROBOTICS
Abstract
Object identification, such as tree trunk detection, is fundamental for forest robotics. Intelligent vision systems are of paramount importance for improving robotic perception, thus enhancing the autonomy of forest robots. To that purpose, this paper presents three contributions: an open dataset of 5325 annotated forest images; a tree trunk detection Edge AI benchmark of 13 deep learning models evaluated on four edge devices (CPU, TPU, GPU, and VPU); and a tree trunk mapping experiment using an OAK-D as the sensing device. The results showed that YOLOR was the most reliable trunk detector, achieving a maximum F1 score of around 90% while maintaining high scores at different confidence levels; in terms of inference time, YOLOv4 Tiny was the fastest model, attaining 1.93 ms on the GPU. YOLOv7 Tiny presented the best trade-off between detection accuracy and speed, with average inference times under 4 ms on the GPU across different input resolutions while achieving an F1 score similar to that of YOLOR. This work will enable the development of advanced artificial vision systems for robotics in forestry monitoring operations.
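The following sketch shows one way to score a trunk detector with the F1 metric across confidence thresholds using greedy IoU matching, the kind of evaluation the benchmark reports; it is not the paper's benchmark harness, and the boxes, scores, and thresholds are made-up examples.

```python
# Illustrative sketch: F1 at several confidence thresholds with greedy IoU matching.
def iou(a, b):
    """IoU of two (x1, y1, x2, y2) boxes."""
    ix1, iy1 = max(a[0], b[0]), max(a[1], b[1])
    ix2, iy2 = min(a[2], b[2]), min(a[3], b[3])
    inter = max(0.0, ix2 - ix1) * max(0.0, iy2 - iy1)
    area = lambda r: (r[2] - r[0]) * (r[3] - r[1])
    return inter / (area(a) + area(b) - inter + 1e-9)

def f1_at_threshold(dets, gts, conf_thr, iou_thr=0.5):
    """dets: list of (box, confidence); gts: list of ground-truth boxes."""
    kept = sorted([d for d in dets if d[1] >= conf_thr], key=lambda d: -d[1])
    matched, tp = set(), 0
    for box, _ in kept:
        best, best_iou = None, iou_thr
        for i, gt in enumerate(gts):
            if i not in matched and iou(box, gt) >= best_iou:
                best, best_iou = i, iou(box, gt)
        if best is not None:
            matched.add(best)
            tp += 1
    fp, fn = len(kept) - tp, len(gts) - tp
    p, r = tp / (tp + fp + 1e-9), tp / (tp + fn + 1e-9)
    return 2 * p * r / (p + r + 1e-9)

gts = [(10, 10, 60, 200), (120, 15, 170, 210)]
dets = [((12, 8, 58, 198), 0.92), ((118, 20, 172, 205), 0.71), ((300, 40, 340, 180), 0.35)]
for thr in (0.25, 0.5, 0.75):
    print(f"confidence >= {thr:.2f}: F1 = {f1_at_threshold(dets, gts, thr):.2f}")
```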
2023
Authors
Pinheiro, I; Aguiar, A; Figueiredo, A; Pinho, T; Valente, A; Santos, F;
Publication
APPLIED SCIENCES-BASEL
Abstract
Currently, Unmanned Aerial Vehicles (UAVs) are considered in the development of various applications in agriculture, which has led to the expansion of the agricultural UAV market. However, Nano Aerial Vehicles (NAVs) are still underutilised in agriculture. NAVs are characterised by a maximum wing length of 15 centimetres and a weight of less than 50 g. Due to their physical characteristics, NAVs have the advantage of being able to approach targets and perform tasks with more precision than conventional UAVs, making them suitable for precision agriculture. This work aims to contribute an open-source solution known as the Nano Aerial Bee (NAB) to enable further research and development on the use of NAVs in an agricultural context. The purpose of the NAB is to mimic and assist bees in the context of pollination. We designed this open-source solution by taking into account the existing state-of-the-art solutions and the requirements of pollination activities. This paper presents the relevant background and the work carried out in this area by analysing papers on the topic of NAVs. The development of this prototype is rather complex given the interactions between the different hardware components and the need to achieve autonomous flight capable of pollination. We describe and discuss these challenges in this work. Besides the open-source NAB solution, we train three different versions of YOLO (YOLOv5, YOLOv7, and YOLOR) on an original dataset (Flower Detection Dataset) containing 206 images of a group of eight flowers and on a public dataset (TensorFlow Flower Dataset), which had to be annotated (yielding the TensorFlow Flower Detection Dataset). The results of the models trained on the Flower Detection Dataset are satisfactory, with YOLOv7 and YOLOR achieving the best performance, with 98% precision, 99% recall, and a 98% F1 score. The performance of these models is evaluated on the TensorFlow Flower Detection Dataset to test their robustness. The three YOLO models are also trained on the TensorFlow Flower Detection Dataset to better understand the results. In this case, YOLOR obtains the most promising results, with 84% precision, 80% recall, and an 82% F1 score. The detections obtained with the Flower Detection Dataset models are used for NAB guidance, providing the flower's relative position in the image, which defines the command the NAB executes.
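A hedged sketch of the guidance step described at the end of the abstract: converting a detected flower bounding box into a relative-position cue for the vehicle. This is not the NAB firmware; the box format, image size, and the "approach" heuristic are illustrative assumptions.

```python
# Illustrative sketch: map a flower detection to a relative-position guidance cue.
from dataclasses import dataclass

@dataclass
class GuidanceCue:
    horizontal: float   # -1 (far left) .. +1 (far right) of the image centre
    vertical: float     # -1 (top) .. +1 (bottom) of the image centre
    approach: float     # ~1 when the flower is small/far, ~0 when it fills the frame

def flower_to_cue(box, img_w, img_h):
    """box = (x1, y1, x2, y2) in pixels for the highest-confidence flower."""
    x1, y1, x2, y2 = box
    cx, cy = (x1 + x2) / 2.0, (y1 + y2) / 2.0
    fill = ((x2 - x1) * (y2 - y1)) / float(img_w * img_h)
    return GuidanceCue(horizontal=2.0 * cx / img_w - 1.0,
                       vertical=2.0 * cy / img_h - 1.0,
                       approach=max(0.0, 1.0 - fill))

# Example: a detection roughly centred but still small in a 640x480 frame.
cue = flower_to_cue((300, 220, 340, 260), 640, 480)
print(cue)   # small horizontal/vertical offsets, approach close to 1 -> move forward
```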
2023
Authors
Martins, JJ; Silva, M; Santos, F;
Publication
ROBOT2022: FIFTH IBERIAN ROBOTICS CONFERENCE: ADVANCES IN ROBOTICS, VOL 1
Abstract
To produce more food and tackle labor scarcity, agriculture needs safer robots for repetitive and unsafe tasks (such as spraying). Human-robot interaction presents several challenges: ensuring certifiably safe collaboration between humans and robots and a reliable system that does not damage goods or plants, in an environment that is mostly dynamic due to constant change. A well-known solution to this problem is the implementation of real-time collision avoidance systems. This paper presents a global overview of state-of-the-art methods implemented in agricultural environments that ensure human-robot collaboration in accordance with recognised industry standards. To complement this overview, the gaps and possible specifications that need to be clarified in future standards are addressed, taking into consideration the human-machine safety requirements for agricultural autonomous mobile robots.
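As a minimal sketch of the real-time collision avoidance concept the survey refers to (not a formula taken from any standard or from the paper): a separation-based speed limiter that ramps the commanded speed down as a detected person gets closer. The distances and speeds are assumed example values.

```python
# Illustrative sketch: separation-based speed limiting with a protective stop.
def allowed_speed(person_distance_m, stop_distance_m=1.0, slow_distance_m=3.0,
                  max_speed_mps=1.5):
    """Return the commanded speed given the closest detected person distance."""
    if person_distance_m <= stop_distance_m:
        return 0.0                        # protective stop
    if person_distance_m >= slow_distance_m:
        return max_speed_mps              # free-field speed
    # Linear ramp between the stop and slow-down distances.
    frac = (person_distance_m - stop_distance_m) / (slow_distance_m - stop_distance_m)
    return max_speed_mps * frac

for d in (0.5, 1.5, 2.5, 4.0):
    print(f"person at {d:.1f} m -> commanded speed {allowed_speed(d):.2f} m/s")
```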