2016
Authors
Sousa, A; Mendes, P; Sousa, L; Salavessa, E;
Publication
REHABEND
Abstract
2016
Authors
Reis, LP; Moreira, AP; Lima, PU; Montano, L; Muñoz Martinez, V;
Publication
Advances in Intelligent Systems and Computing
Abstract
2016
Authors
Reis, LP; Moreira, AP; Lima, PU; Montano, L; Muñoz Martínez, VF;
Publication
ROBOT (1)
Abstract
2016
Authors
Paulo Garrido; Filomena Soares; António Paulo Moreira;
Publication
Abstract
2016
Authors
Mendes, J; dos Santos, FN; Ferraz, N; Couto, P; Morais, R;
Publication
2016 IEEE INTERNATIONAL CONFERENCE ON AUTONOMOUS ROBOT SYSTEMS AND COMPETITIONS (ICARSC 2016)
Abstract
Developing ground robots for crop monitoring and harvesting in steep slope vineyards is a complex challenge for two main reasons: the harsh terrain conditions and the unstable localization accuracy obtained from Global Positioning Systems (GPS). In this context, a reliable localization system requires a high density of natural/artificial features and an accurate detector. This paper presents a novel visual detector for Vineyard Trunks and Masts (ViTruDe). The ViTruDe detector was developed considering the constraints of a cost-effective robot carrying out crop monitoring tasks in steep slope vineyard environments. The results obtained with real data show an accuracy higher than 95% for all tested configurations. The training and test data are made public for future research work. This approach is a contribution toward an accurate and reliable localization system that is GPS-free.
2016
Authors
Benrachou, DE; dos Santos, FN; Boulebtateche, B; Bensaoula, S;
Publication
2016 IEEE INTERNATIONAL CONFERENCE ON AUTONOMOUS ROBOT SYSTEMS AND COMPETITIONS (ICARSC 2016)
Abstract
Humans are increasingly cooperating with machinery/robots in a growing number of domains and under uncontrolled conditions. When people interact with machinery, they are exposed to distraction/fatigue, which can lead to dangerous situations. Evaluating individuals' attention and fatigue levels is highly needed in such situations: it is an important measurement for preventing human-machine interaction when concentration falls to critical levels. This paper proposes a real-time vision-based approach for eye localization and head motion estimation (EyeLHM). The proposed method is evaluated on three databases: the GI4E face database, the extended Yale-B database, and the GI4E head pose database. High detection rates are achieved on the GI4E head pose and face databases, 97.35% and 87.19% respectively. The EyeLHM approach is optimized for deployment on low-cost computers, such as Raspberry Pi or UDOO boards.