2017
Authors
Matos, A; Silva, E; Almeida, J; Martins, A; Ferreira, H; Ferreira, B; Alves, J; Dias, A; Fioravanti, S; Bertin, D; Lobo, V;
Publication
Search and Rescue Robotics - From Theory to Practice
Abstract
2016
Authors
Marques, MM; Parreira, R; Lobo, V; Martins, A; Matos, A; Cruz, N; Almeida, JM; Alves, JC; Silva, E; Bedkowski, J; Majek, K; Pelka, M; Musialik, P; Ferreira, H; Dias, A; Ferreira, B; Amaral, G; Figueiredo, A; Almeida, R; Silva, F; Serrano, D; Moreno, G; De Cubber, G; Balta, H; Beglerovic, H;
Publication
OCEANS 2016 - SHANGHAI
Abstract
Today, there is a gap in our perception of the landscape that needs to be filled. It is important to increase coverage and temporal and spatial resolution in order to close this gap, as well as to reduce the cost of the human resources that usually carry out such tasks. Unmanned autonomous vehicles, with their inherent autonomy and reduced need for human and communication resources, can provide additional capabilities and an innovative solution to this problem. This paper presents and describes the participation of the ICARUS team at euRathlon 2015 and the importance of this type of event performed with multiple unmanned systems.
2017
Authors
Doroftei, D; Cubber, GD; Wagemans, R; Matos, A; Silva, E; Lobo, V; Cardoso, G; Chintamani, K; Govindaraj, S; Gancet, J; Serrano, D;
Publication
Search and Rescue Robotics - From Theory to Practice
Abstract
2017
Authors
Pinto, AM; Costa, PG; Correia, MV; Matos, AC; Moreira, AP;
Publication
ROBOTICS AND AUTONOMOUS SYSTEMS
Abstract
Recent advances in visual motion detection and interpretation have made possible the rise of new robotic systems for autonomous and active surveillance. In this line of research, the current work discusses motion perception by proposing a novel technique that analyzes dense flow fields and distinguishes several regions with distinct motion models. The method is called Wise Optical Flow Clustering (WOFC) and extracts the moving objects by performing two consecutive operations: evaluating and resetting. Motion properties of the flow field are retrieved and described in the evaluation phase, which provides high-level information about the spatial segmentation of the flow field. During the resetting operation, these properties are combined and used to feed a guided segmentation approach. The WOFC requires information about the number of motion models; therefore, this paper introduces a model selection method based on a Bayesian approach that balances the model's fitness and complexity. It combines the correlation of a histogram-based analysis with the decay ratio of the normalized entropy criterion. This approach interprets the flow field and gives an estimate of the number of moving objects. Experiments conducted in a realistic environment have shown that the WOFC presents several advantages that meet the requirements of common robotic and surveillance applications: it is computationally efficient and provides pixel-wise segmentation, in contrast to other state-of-the-art methods.
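The core idea of grouping a dense flow field into regions with distinct motion models can be illustrated with a toy sketch. This is not the WOFC algorithm itself: it replaces the paper's guided segmentation and Bayesian model selection with plain k-means on per-pixel flow vectors, and the flow field, image size, and noise level are all invented for the example. It only shows what a pixel-wise segmentation of a flow field by motion similarity looks like.

```python
import numpy as np
from scipy.cluster.vq import kmeans2

# Toy illustration (not WOFC): each pixel carries a 2-D optical-flow
# vector; pixels are grouped by flow similarity into motion models.
rng = np.random.default_rng(0)
h, w = 32, 32
flow = np.zeros((h, w, 2))
flow[:, :16] = [1.0, 0.0]                   # left half moves right
flow[:, 16:] = [0.0, -1.0]                  # right half moves down
flow += rng.normal(0.0, 0.05, flow.shape)   # small sensor noise

vectors = flow.reshape(-1, 2)
# Assume the model-selection step reported two motion models.
centroids, labels = kmeans2(vectors, 2, seed=1, minit="++")
segmentation = labels.reshape(h, w)         # pixel-wise label map

# Each half of the image should end up with a single, uniform label.
left_uniform = np.all(segmentation[:, :16] == segmentation[0, 0])
right_uniform = np.all(segmentation[:, 16:] == segmentation[0, 16])
print(left_uniform and right_uniform)
```

In the actual method, the number of clusters would come from the Bayesian model-selection step rather than being fixed in advance, and the segmentation would be guided by the evaluated motion properties rather than raw vector distance.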
2016
Authors
de Pinho, MD; Foroozandeh, Z; Matos, A;
Publication
2016 IEEE 55TH CONFERENCE ON DECISION AND CONTROL (CDC)
Abstract
Here we propose a simplified model for the path planning of an Autonomous Underwater Vehicle (AUV) in a horizontal plane when ocean currents are considered. The model includes kinematic equations and a simple dynamic equation. Our problem of interest is a minimum-time problem with state constraints in which the control appears linearly. This problem is solved numerically using the direct method. We extract various tests from the Maximum Principle that are then used to validate the numerical solution. In contrast to much of the literature, we apply the Maximum Principle as defined in [9].
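The direct method mentioned in the abstract transcribes the continuous optimal control problem into a finite-dimensional nonlinear program. A minimal sketch of that idea, for a purely kinematic minimum-time problem with a constant current: the vehicle speed, current, start, goal, and discretization are all hypothetical, and the simple dynamics and state constraints of the paper's model are omitted.

```python
import numpy as np
from scipy.optimize import minimize

# Direct-transcription sketch of a minimum-time horizontal-plane AUV
# problem with a constant ocean current (hypothetical parameters).
V = 1.0                          # speed through the water (m/s)
CURRENT = np.array([0.2, 0.0])   # constant current (m/s)
START = np.array([0.0, 0.0])
GOAL = np.array([10.0, 0.0])
N = 20                           # discretization intervals

def endpoint(z):
    """Decision vector z = [T, theta_1..theta_N]: final time, headings."""
    T, thetas = z[0], z[1:]
    dt = T / N
    pos = START.copy()
    for th in thetas:
        # Kinematics: ground velocity = water-relative velocity + current.
        pos = pos + (V * np.array([np.cos(th), np.sin(th)]) + CURRENT) * dt
    return pos

res = minimize(lambda z: z[0],                       # minimize final time T
               np.concatenate([[12.0], np.zeros(N)]),  # initial guess
               method="SLSQP",
               constraints=[{"type": "eq",
                             "fun": lambda z: endpoint(z) - GOAL}],
               bounds=[(0.1, 100.0)] + [(-np.pi, np.pi)] * N)

# Analytically, with the current aiding the vehicle along the straight
# line to the goal, the optimum is 10 / (1.0 + 0.2) ≈ 8.33 s.
print(round(res.x[0], 2))
```

The paper's approach additionally validates such a numerical solution against necessary conditions derived from the Maximum Principle, which this sketch does not do.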
2017
Authors
Cubber, GD; Doroftei, D; Rudin, K; Berns, K; Matos, A; Serrano, D; Sanchez, J; Govindaraj, S; Bedkowski, J; Roda, R; Silva, E; Ourevitch, S;
Publication
Search and Rescue Robotics - From Theory to Practice
Abstract