2013
Authors
Neto, P; Moreira, AP;
Publication
Communications in Computer and Information Science
Abstract
2016
Authors
Mendes, N; Neto, P; Safeea, M; Moreira, AP;
Publication
ROBOT 2015: SECOND IBERIAN ROBOTICS CONFERENCE: ADVANCES IN ROBOTICS, VOL 2
Abstract
A solution for intuitive robot command and fast robot programming is presented for the assembly of pins in car doors. Static and dynamic gestures are used to instruct an industrial robot in the execution of the assembly task. An artificial neural network (ANN) was used to recognize twelve static gestures, and a hidden Markov model (HMM) architecture was used to recognize ten dynamic gestures. The results of these two architectures are compared with those obtained from a third architecture based on a support vector machine (SVM). The ANN and HMM architectures achieve recognition rates of 96% and 94% for static and dynamic gestures, respectively. The SVM architecture performs better, achieving recognition rates of 97% and 96% for static and dynamic gestures, respectively.
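The paper itself contains no code; the following is a minimal sketch of how the SVM-based static-gesture classifier it compares against could be set up in Python with scikit-learn. The feature dimensionality, sample counts, and hyperparameters are assumptions for illustration, and random data stands in for real sensor features.

```python
# Minimal sketch of an SVM classifier for twelve static gestures.
# Feature extraction is not specified in the abstract; random vectors
# stand in for real sensor features here.
import numpy as np
from sklearn.svm import SVC
from sklearn.model_selection import train_test_split
from sklearn.preprocessing import StandardScaler
from sklearn.pipeline import make_pipeline

rng = np.random.default_rng(0)
X = rng.normal(size=(600, 20))        # 600 samples, 20 features (assumed)
y = rng.integers(0, 12, size=600)     # labels for twelve static gestures

X_train, X_test, y_train, y_test = train_test_split(
    X, y, test_size=0.25, random_state=0, stratify=y)

# RBF-kernel SVM with feature standardization; hyperparameters are guesses.
clf = make_pipeline(StandardScaler(), SVC(kernel="rbf", C=10.0, gamma="scale"))
clf.fit(X_train, y_train)
print("recognition rate:", clf.score(X_test, y_test))
```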
2013
Authors
Neto, P; Pereira, D; Pires, JN; Moreira, AP;
Publication
2013 IEEE INTERNATIONAL CONFERENCE ON ROBOTICS AND AUTOMATION (ICRA)
Abstract
New and more natural human-robot interfaces are of crucial interest to the evolution of robotics. This paper addresses continuous and real-time hand gesture spotting, i.e., gesture segmentation plus gesture recognition. Gesture patterns are recognized using artificial neural networks (ANNs) specifically adapted to the process of controlling an industrial robot. Since in continuous gesture recognition communicative gestures appear interspersed with non-communicative movements, we propose a new architecture with two ANNs in series to recognize both kinds of gesture. A data glove is used as the interface technology. Experimental results demonstrate that the proposed solution achieves high recognition rates (over 99% for a library of ten gestures and over 96% for a library of thirty gestures), short training and learning times, and a good capacity to generalize from particular situations.
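As a rough illustration of the two-ANNs-in-series idea described above, the sketch below uses a first network to spot whether a data-glove feature window is a communicative gesture at all, and only then passes it to a second network that identifies which gesture it is. All names, feature sizes, and network shapes are assumptions, and random data stands in for glove readings.

```python
# Sketch of two ANNs in series: a spotter (gesture vs. non-gesture)
# followed by a recognizer (which of ten gestures). Synthetic data only.
import numpy as np
from sklearn.neural_network import MLPClassifier

rng = np.random.default_rng(1)
X = rng.normal(size=(2000, 22))              # 22 glove features (assumed)
is_gesture = rng.integers(0, 2, size=2000)   # 1 = communicative gesture
gesture_id = rng.integers(0, 10, size=2000)  # ten-gesture library

spotter = MLPClassifier(hidden_layer_sizes=(30,), max_iter=500, random_state=0)
spotter.fit(X, is_gesture)

mask = is_gesture == 1                        # recognizer sees gestures only
recognizer = MLPClassifier(hidden_layer_sizes=(50,), max_iter=500, random_state=0)
recognizer.fit(X[mask], gesture_id[mask])

def spot_and_recognize(window):
    """Return a gesture id, or None for non-communicative motion."""
    if spotter.predict(window.reshape(1, -1))[0] == 1:
        return int(recognizer.predict(window.reshape(1, -1))[0])
    return None

print(spot_and_recognize(X[0]))
```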
2016
Authors
Neto, P; Paulo Moreira, AP;
Publication
INTERNATIONAL JOURNAL OF ADVANCED MANUFACTURING TECHNOLOGY
Abstract
2016
Authors
Ferreira, M; Costa, P; Rocha, L; Paulo Moreira, AP;
Publication
INTERNATIONAL JOURNAL OF ADVANCED MANUFACTURING TECHNOLOGY
Abstract
This contribution presents a new system for fast and intuitive industrial robot reprogramming. It is based on a luminous marker built with high-intensity LEDs, which are captured by a set of industrial cameras. Using stereoscopy, the marker supplies 6-DoF human wrist tracking with both position and orientation data. The marker can be attached to any working tool, providing a way to capture human skills without further intrusion into the task. The acquisition technique makes the tracking very robust to lighting conditions, so no environment preparation is needed. The robot is automatically programmed from the demonstrated task, which delivers complete abstraction from programming concepts. The system performs in real time and is low-cost, starting with a single pair of industrial cameras, though more can be used for improved effectiveness and accuracy. The real-time feature means that the robot is ready to perform as soon as the demonstration is over, with no reprogramming overhead. There is also no interference with the task itself: the marker is attached to the work tool and the tracking is contactless, so the human operator can perform naturally. The test bed is a real industrial environment, a spray-painting application. A prototype has been developed and installed, and is currently in operation. The tests show that the proposed system enables transferring to the machine the human ability of manipulating a spray gun.
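The stereoscopy step at the core of the tracker can be illustrated with a short sketch: given two calibrated cameras and the pixel coordinates of the same LED in both images, linear triangulation recovers its 3-D position (the full 6-DoF pose would come from triangulating several LEDs on the marker). The projection matrices and pixel coordinates below are made-up placeholders, not values from the paper.

```python
# Sketch of stereo triangulation for one LED of the luminous marker.
# Real projection matrices come from camera calibration; these are toys.
import numpy as np
import cv2

# 3x4 projection matrices P = K [R | t]; identity intrinsics for simplicity.
P1 = np.hstack([np.eye(3), np.zeros((3, 1))])
P2 = np.hstack([np.eye(3), np.array([[-0.2], [0.0], [0.0]])])

# Pixel coordinates of the same LED in each image (illustrative values).
pt1 = np.array([[320.0], [240.0]])
pt2 = np.array([[300.0], [240.0]])

# cv2.triangulatePoints returns homogeneous coordinates (4x1).
X_h = cv2.triangulatePoints(P1, P2, pt1, pt2)
X = (X_h[:3] / X_h[3]).ravel()
print("LED position in the first camera frame:", X)
```

The wrist orientation would then follow from the relative geometry of three or more such triangulated LED positions, e.g., via a rigid-body fit.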