
Publications by José Miguel Almeida

2013

Distributed Active Traction Control System Applied to the RoboCup Middle Size League

Authors
Almeida, J; Dias, A; Martins, A; Sequeira, J; Silva, E;

Publication
INTERNATIONAL JOURNAL OF ADVANCED ROBOTIC SYSTEMS

Abstract
This work addresses the problem of traction control in mobile wheeled robots, in the particular case of the RoboCup Middle Size League (MSL). The slip control problem is formulated using simple friction models for the ISePorto Team robots, which have a differential wheel configuration. Traction was also characterized experimentally in the MSL scenario for relevant game events. This work proposes a hierarchical traction control architecture which relies on local slip detection and control at each wheel, with relevant information being relayed to a higher level responsible for global robot motion control. A dedicated single-axis embedded control hardware subsystem was developed, allowing complex local control, high-frequency current sensing and odometric information processing. This local axis control board is integrated in a distributed system using CAN bus communications. The slip observer was implemented in the axis control hardware nodes integrated in the ISePorto robots and was used to control and detect loss of traction. An external vision system was used to perform a qualitative analysis of the slip detection, and observer performance results are presented.
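As an illustration of the kind of per-wheel slip detection such a hierarchical architecture relies on, the sketch below compares wheel odometry against an estimated contact-point speed for a differential-drive robot. It is a minimal sketch under assumed values; the wheel radius, threshold and function names are illustrative, not taken from the paper.

```python
# Illustrative per-wheel slip detection for a differential-drive robot.
# Assumption: each local axis node knows its wheel radius, the measured wheel
# angular velocity (from the encoder) and an estimate of the true linear
# speed of the wheel contact point (e.g. from the global motion estimate).

WHEEL_RADIUS = 0.04        # metres (illustrative value)
SLIP_THRESHOLD = 0.15      # slip ratio above which traction loss is flagged

def slip_ratio(wheel_omega: float, contact_speed: float) -> float:
    """Longitudinal slip ratio: positive when the wheel spins faster than
    the ground speed (traction slip), negative when it lags (braking)."""
    wheel_speed = wheel_omega * WHEEL_RADIUS
    denom = max(abs(wheel_speed), abs(contact_speed), 1e-3)
    return (wheel_speed - contact_speed) / denom

def detect_slip(wheel_omega: float, contact_speed: float) -> bool:
    """Local decision that would be relayed to the higher-level controller."""
    return abs(slip_ratio(wheel_omega, contact_speed)) > SLIP_THRESHOLD
```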

2013

Field experiments for marine casualty detection with autonomous surface vehicles

Authors
Martins, A; Dias, A; Almeida, J; Ferreira, H; Almeida, C; Amaral, G; Machado, D; Sousa, J; Pereira, P; Matos, A; Lobo, V; Silva, E;

Publication
2013 OCEANS - SAN DIEGO

Abstract
In this paper we present a set of field tests for the detection of humans in the water with an unmanned surface vehicle using infrared and color cameras. These experiments aimed to contribute to the development of victim target tracking and obstacle avoidance for unmanned surface vehicles operating in marine search and rescue missions. This research is part of the work conducted in the European FP7 research project Icarus, which aims to develop robotic tools for large scale rescue operations. The tests consisted of using the ROAZ unmanned surface vehicle, equipped with a precision GPS system for localization and both visible spectrum and IR cameras, to detect the target. In the experimental setup, the test human target was deployed in the water wearing a life vest and a diver suit (thus having a lower temperature signature over the body, except for the hands and head) and was equipped with a GPS logger. Multiple target approaches were performed in order to test the system under different relative sun incidence angles. The experimental setup, detection method and preliminary results from the field trials performed in the summer of 2013 in Sesimbra, Portugal and in La Spezia, Italy are presented in this work.
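The abstract does not detail the detection method, so the following is only a minimal illustrative sketch of one simple way to flag warm regions (the exposed head and hands of a person in a diver suit) in an 8-bit thermal frame, using OpenCV thresholding and contour filtering; the threshold and area values are assumptions, not the authors' algorithm.

```python
import cv2
import numpy as np

def detect_warm_blobs(ir_frame: np.ndarray,
                      intensity_threshold: int = 200,
                      min_area: int = 30) -> list:
    """Return bounding boxes of warm regions in a single-channel 8-bit IR frame.

    Illustrative only: a person in a diver suit shows up mainly through the
    exposed head and hands, so we look for small bright blobs (OpenCV 4.x)."""
    _, mask = cv2.threshold(ir_frame, intensity_threshold, 255, cv2.THRESH_BINARY)
    contours, _ = cv2.findContours(mask, cv2.RETR_EXTERNAL,
                                   cv2.CHAIN_APPROX_SIMPLE)
    boxes = []
    for c in contours:
        if cv2.contourArea(c) >= min_area:
            boxes.append(cv2.boundingRect(c))   # (x, y, w, h)
    return boxes
```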

2013

Groundtruth system for underwater benchmarking

Authors
Martins, A; Dias, A; Silva, H; Almeida, J; Goncalves, P; Lopes, F; Faria, A; Ribeiro, J; Silva, E;

Publication
2013 OCEANS - SAN DIEGO

Abstract
In this paper a vision-based groundtruth system for underwater applications is presented. The proposed system acts as an external perception and localization validation mechanism for underwater trials in the INESC TEC/ISEP underwater robotics test tank. It comprises a stereo camera pair with external synchronization and an image processing and data recording host computer. The cameras are mounted on a rigid baseline and calibrated using scenario key points. Two target detection algorithms were tested and their results are discussed: one is based on template matching techniques, allowing the tracking of arbitrary targets without particular markers, and the other on color segmentation, with the target vehicle equipped with light markers. An example trajectory of a small ROV moving in the tank is also presented.
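For the first of the two detection approaches, a minimal sketch of generic template matching with OpenCV is shown below; it assumes a previously cropped target template and an illustrative score threshold, and is not the paper's specific implementation.

```python
import cv2
import numpy as np

def track_template(frame: np.ndarray, template: np.ndarray,
                   score_threshold: float = 0.7):
    """Locate `template` in `frame` by normalized cross-correlation.

    Returns (x, y, score) for the centre of the best match, or None when
    the correlation score falls below the (illustrative) threshold."""
    result = cv2.matchTemplate(frame, template, cv2.TM_CCOEFF_NORMED)
    _, max_val, _, max_loc = cv2.minMaxLoc(result)
    if max_val < score_threshold:
        return None
    h, w = template.shape[:2]
    return (max_loc[0] + w // 2, max_loc[1] + h // 2, max_val)
```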

2013

Multi-Robot Cooperative Stereo for Outdoor Scenarios

Authors
Dias, A; Almeida, J; Silva, E; Lima, P;

Publication
PROCEEDINGS OF THE 2013 13TH INTERNATIONAL CONFERENCE ON AUTONOMOUS ROBOT SYSTEMS (ROBOTICA)

Abstract
In this paper, we propose a cooperative perception framework for multi-robot real-time 3D high dynamic target estimation in outdoor scenarios, based on the monocular camera available on each robot. The relative position and orientation between robots establish a flexible and dynamic stereo baseline. Overlapping views subject to geometric constraints emerge from the stereo formulation, which allows us to obtain a decentralized cooperative perception layer. Epipolar constraints related to the global frame are applied both to image feature matching and to feature search and detection optimization in the image processing of robots with low computational capabilities. In contrast to classic stereo, the proposed framework considers all sources of uncertainty (in localization, attitude and image detection from both robots) in the determination of the object's best 3D localization and its uncertainty. The proposed framework can later be integrated in a decentralized data fusion (DDF) multi-target tracking approach, where it can help reduce rumor propagation, data association and track initialization issues. We demonstrate the advantages of this approach in a real outdoor scenario, by comparing standalone rigid-baseline stereo target tracking with the proposed multi-robot cooperative stereo between a micro aerial vehicle (MAV) and an autonomous ground vehicle (AGV).
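A brief sketch of how an epipolar constraint can restrict feature search between the two robots' cameras is given below. It uses the standard fundamental-matrix construction from a known relative pose and camera intrinsics; the variable names and inputs are illustrative, and the paper's own uncertainty-aware formulation is not reproduced here.

```python
import numpy as np

def skew(t: np.ndarray) -> np.ndarray:
    """Skew-symmetric matrix such that skew(t) @ x == np.cross(t, x)."""
    return np.array([[0.0, -t[2], t[1]],
                     [t[2], 0.0, -t[0]],
                     [-t[1], t[0], 0.0]])

def fundamental_from_pose(R: np.ndarray, t: np.ndarray,
                          K1: np.ndarray, K2: np.ndarray) -> np.ndarray:
    """Fundamental matrix for two cameras related by x2 = R @ x1 + t."""
    E = skew(t) @ R                                  # essential matrix
    return np.linalg.inv(K2).T @ E @ np.linalg.inv(K1)

def epipolar_line(F: np.ndarray, pixel1) -> np.ndarray:
    """Line l = (a, b, c) in image 2 satisfying a*u + b*v + c = 0, on which
    the match of pixel1 must lie. Restricting the search to a band around
    this line is what reduces matching cost on low-power robots."""
    u, v = pixel1
    return F @ np.array([u, v, 1.0])
```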

2017

PLineD: Vision-based Power Lines Detection for Unmanned Aerial Vehicles

Authors
Santos, T; Moreira, M; Almeida, J; Dias, A; Martins, A; Dinis, J; Formiga, J; Silva, E;

Publication
2017 IEEE INTERNATIONAL CONFERENCE ON AUTONOMOUS ROBOT SYSTEMS AND COMPETITIONS (ICARSC)

Abstract
It is commonly accepted that one of the most important factors for assuring the high performance of an electrical network is surveillance and the corresponding preventive maintenance. TSOs and DSOs have long incorporated grid surveillance, including aerial power line inspection, into their maintenance plans. Those inspections started with human patrols, including structure climbing when needed, and were later replaced by helicopters carrying powerful sensors and specialised technicians. More recently, Unmanned Aerial Vehicle (UAV) technology has been adopted, given its numerous advantages. This paper addresses the problem of improving the real-time perception capabilities of UAVs, endowing them with capabilities for safe and robust autonomous and semi-autonomous operation. It presents a new vision-based power line detection algorithm, denoted PLineD, able to improve detection robustness even in the presence of background noise in the image. The algorithm is tested on real outdoor images from a dataset with multiple backgrounds and weather conditions. The experimental results demonstrate that the proposed approach is effective and can be implemented in a real-time image processing pipeline.
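The abstract does not describe PLineD's steps, so the sketch below is only a classical line-segment baseline (edge map plus probabilistic Hough transform) of the kind such detectors are usually compared against; it is not PLineD itself, and all parameter values are assumed.

```python
import cv2
import numpy as np

def detect_line_segments(bgr_frame: np.ndarray):
    """Classical baseline for power-line-like segments: Canny edges followed
    by a probabilistic Hough transform. Parameters are illustrative and
    would need tuning per background and weather condition."""
    gray = cv2.cvtColor(bgr_frame, cv2.COLOR_BGR2GRAY)
    gray = cv2.GaussianBlur(gray, (5, 5), 0)          # suppress texture noise
    edges = cv2.Canny(gray, 50, 150)
    segments = cv2.HoughLinesP(edges, rho=1, theta=np.pi / 180,
                               threshold=80, minLineLength=100, maxLineGap=10)
    return [] if segments is None else [s[0] for s in segments]  # (x1, y1, x2, y2)
```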

2014

Simulation Environment for Multi-robot Cooperative 3D Target Perception

Authors
Dias, A; Almeida, J; Dias, N; Lima, P; Silva, E;

Publication
SIMULATION, MODELING, AND PROGRAMMING FOR AUTONOMOUS ROBOTS (SIMPAR 2014)

Abstract
Field experiments with a team of heterogeneous robots require human and hardware resources that cannot be mobilized in a straightforward manner. Therefore, simulation environments are viewed by the robotics community as a powerful tool that can be used as an intermediate step to evaluate and validate developments prior to their integration in real robots. This paper evaluates a novel multi-robot heterogeneous cooperative perception framework based on monocular measurements under the MORSE robotic simulation environment. The simulations are performed in an outdoor environment using a team of Micro Aerial Vehicles (MAV) and an Unmanned Ground Vehicle (UGV) performing distributed cooperative perception based on monocular measurements. The goal is to estimate the 3D target position.
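Since the stated goal is 3D target position estimation from monocular measurements on several robots, the sketch below shows one standard way to fuse such measurements: least-squares intersection of the bearing rays from each robot. It is a minimal sketch, not the paper's estimator, and it ignores the measurement uncertainties the framework propagates.

```python
import numpy as np

def triangulate_rays(centers, directions) -> np.ndarray:
    """Least-squares intersection of bearing rays.

    centers:    list of 3D robot/camera positions in the world frame
    directions: list of 3D ray directions towards the target (need not be unit)
    Returns the point minimizing the sum of squared distances to all rays
    (the system is singular if all rays are parallel)."""
    A = np.zeros((3, 3))
    b = np.zeros(3)
    for c, d in zip(centers, directions):
        d = np.asarray(d, dtype=float)
        d /= np.linalg.norm(d)
        P = np.eye(3) - np.outer(d, d)   # projector orthogonal to the ray
        A += P
        b += P @ np.asarray(c, dtype=float)
    return np.linalg.solve(A, b)
```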
