2015
Authors
Santos, J; Costa, P; Rocha, LF; Moreira, AP; Veiga, G;
Publication
2015 IEEE INTERNATIONAL CONFERENCE ON INDUSTRIAL TECHNOLOGY (ICIT)
Abstract
In this paper the authors present a new path planning approach for a multi-robot transportation system in an industrial case scenario. The proposed method is based on the A* heuristic search over a cell decomposition of the environment, to which a time component was added - the Time Enhanced A*, or simply TEA*. To assess the flexibility and efficiency of the proposed algorithm, a set of experiments was performed in a simulated industrial environment. During the trials, the proposed algorithm showed a high capability of preventing and dealing with deadlocks in the transportation system.
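The abstract gives no implementation details, so the following is only a minimal sketch of an A* search extended with a discrete time dimension on a grid (space-time search in the spirit of TEA*), assuming unit-time moves, a Manhattan heuristic, and a reservation table for cells occupied by other vehicles. All names and parameters are illustrative and not taken from the paper.

```python
import heapq

def tea_star(free_cells, start, goal, reserved, max_time=200):
    """Minimal space-time A* sketch (illustrative only).

    free_cells : set of (x, y) traversable grid cells
    start, goal: (x, y) cells
    reserved   : set of ((x, y), t) cells occupied by other vehicles at time t
    Returns a list of ((x, y), t) states from start to goal, or None.
    """
    def h(c):  # Manhattan distance heuristic (admissible for unit-cost moves)
        return abs(c[0] - goal[0]) + abs(c[1] - goal[1])

    start_state = (start, 0)
    open_heap = [(h(start), start_state)]
    came_from, g_cost = {}, {start_state: 0}

    while open_heap:
        _, (cell, t) = heapq.heappop(open_heap)
        if cell == goal:
            path, s = [(cell, t)], (cell, t)
            while s in came_from:
                s = came_from[s]
                path.append(s)
            return path[::-1]
        if t >= max_time:
            continue
        x, y = cell
        # 4-connected moves plus waiting in place; the wait action is what
        # lets the planner resolve conflicts instead of deadlocking
        for nxt in ((x + 1, y), (x - 1, y), (x, y + 1), (x, y - 1), (x, y)):
            if nxt not in free_cells or (nxt, t + 1) in reserved:
                continue
            state = (nxt, t + 1)
            new_g = g_cost[(cell, t)] + 1
            if new_g < g_cost.get(state, float("inf")):
                g_cost[state] = new_g
                came_from[state] = (cell, t)
                heapq.heappush(open_heap, (new_g + h(nxt), state))
    return None
```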
2016
Authors
Santos, J; Costa, P; Rocha, L; Vivaldini, K; Paulo Moreira, AP; Veiga, G;
Publication
ROBOT 2015: SECOND IBERIAN ROBOTICS CONFERENCE: ADVANCES IN ROBOTICS, VOL 2
Abstract
Traffic control is one of the fundamental problems in the management of an Automated Guided Vehicle (AGV) system. Its main objectives are to ensure efficient conflict-free routes and to avoid or resolve system deadlocks. In this sense, and as an extension of our previous work, this paper focuses on exploring the capabilities of the Time Enhanced A* (TEA*) to dynamically control a fleet of AGVs responsible for executing a predetermined set of tasks in an automatic warehouse case scenario. During the trials the proposed algorithm, besides showing a high capability of preventing and dealing with deadlocks, also generated collision-free trajectories with high efficiency. Moreover, an alternative method from the state of the art was selected in order to validate and compare the TEA* results.
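As with the previous entry, the abstract does not fix an implementation. One common way to use a space-time planner for a fleet, sketched below, is prioritized planning: vehicles are routed one at a time and each computed path is added to the reservation table seen by the vehicles planned afterwards. The sketch assumes the illustrative `tea_star` function above and is not the paper's traffic controller.

```python
def plan_fleet(free_cells, missions, max_time=200):
    """Prioritized planning sketch: route one AGV at a time with tea_star,
    reserving its timed path so later AGVs treat it as a moving obstacle.

    missions: list of (start, goal) cells, one per AGV, in priority order.
    Returns a list of timed paths (None for AGVs that could not be routed).
    """
    reserved, paths = set(), []
    for start, goal in missions:
        path = tea_star(free_cells, start, goal, reserved, max_time)
        paths.append(path)
        if path is None:
            continue
        for cell, t in path:
            reserved.add((cell, t))
        # keep the vehicle's final cell reserved once it parks at the goal
        last_cell, last_t = path[-1]
        for t in range(last_t, max_time + 1):
            reserved.add((last_cell, t))
    return paths
```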
2014
Authors
Rocha, LF; Veiga, G; Ferreira, M; Paulo Moreira, AP; Santos, V;
Publication
2014 IEEE INTERNATIONAL CONFERENCE ON AUTONOMOUS ROBOT SYSTEMS AND COMPETITIONS (ICARSC)
Abstract
Nowadays, entering the highly competitive international market has become a key strategy for the survival and sustained growth of enterprises in the Portuguese textile and footwear industrial sector. To face the new requirements, companies need to understand that technological innovation is a key issue. In this scenario, the research presented in this paper focuses on the development of a robot-based conveyor-line pick-and-place solution for the automatic collection of shoe lasts. The solution developed consists of extracting the 3D model of the shoe last support transported on the conveyor line and aligning it, using the Iterative Closest Point (ICP) algorithm, with a previously recorded template model. A camera-laser triangulation system was the approach selected to extract the 3D model. With a correct estimate of the position and orientation of the footwear on the conveyor, it becomes possible to execute the pick-and-place task using an industrial manipulator. The practical implication of this work is that it contributes to improving the efficiency of footwear production lines, so that demands can be met in shorter periods of time and with high quality standards. This work was developed in partnership with the Portuguese company CEI by ZIPOR.
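The abstract names the Iterative Closest Point algorithm for the alignment step but gives no code. Below is a minimal point-to-point ICP sketch in Python (NumPy and SciPy) that aligns a scanned cloud to a stored template; the laser-triangulation acquisition of the cloud is outside the sketch, and all names and parameter values are illustrative rather than taken from the paper.

```python
import numpy as np
from scipy.spatial import cKDTree

def icp(source, template, iterations=50, tolerance=1e-6):
    """Minimal point-to-point ICP sketch (illustrative only).

    source, template : (N, 3) and (M, 3) arrays of 3D points
    Returns (R, t): rotation and translation mapping source onto template.
    """
    src = source.copy()
    tree = cKDTree(template)
    R_total, t_total = np.eye(3), np.zeros(3)
    prev_error = np.inf

    for _ in range(iterations):
        # 1. closest-point correspondences
        dist, idx = tree.query(src)
        matched = template[idx]

        # 2. best rigid transform for these correspondences via SVD (Kabsch)
        src_c, mat_c = src.mean(axis=0), matched.mean(axis=0)
        H = (src - src_c).T @ (matched - mat_c)
        U, _, Vt = np.linalg.svd(H)
        R = Vt.T @ U.T
        if np.linalg.det(R) < 0:          # avoid reflections
            Vt[-1, :] *= -1
            R = Vt.T @ U.T
        t = mat_c - R @ src_c

        # 3. apply the increment and accumulate the total transform
        src = (R @ src.T).T + t
        R_total, t_total = R @ R_total, R @ t_total + t

        error = dist.mean()
        if abs(prev_error - error) < tolerance:
            break
        prev_error = error

    return R_total, t_total
```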
2014
Authors
Ferreira, M; Costa, P; Rocha, L; Paulo Moreira, AP; Pires, N;
Publication
2014 IEEE INTERNATIONAL CONFERENCE ON ROBOTICS AND AUTOMATION (ICRA)
Abstract
This paper presents a new marker for robot programming by demonstration through motion imitation. The device is based on high-intensity LEDs (light-emitting diodes) which are captured by a pair of industrial cameras. Using stereoscopy, the marker supplies 6-DoF (degrees of freedom) human wrist tracking with both position and orientation data. We propose a robust technique for camera and stereo calibration which maps camera coordinates directly into the desired robot frame, using a single LED. The calibration and tracking procedures are thoroughly described. The tests show that the marker provides a robust, accurate and intuitive method for industrial robot programming. The system performs in real time and requires only a single pair of industrial cameras, though more can be used for improved effectiveness and accuracy.
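The marker's tracking rests on stereo triangulation of the LED image points. The sketch below shows only that basic triangulation step with OpenCV, assuming calibrated 3x4 projection matrices are already available; the paper's calibration procedure and the orientation estimation from multiple LEDs are not reproduced here, and the function name and parameters are illustrative.

```python
import numpy as np
import cv2

def triangulate_led(P_left, P_right, pt_left, pt_right):
    """Triangulate a single LED centroid seen by two calibrated cameras.

    P_left, P_right  : 3x4 camera projection matrices (from calibration)
    pt_left, pt_right: (x, y) pixel coordinates of the LED in each image
    Returns the 3D point expressed in the frame used for calibration.
    """
    pts_l = np.array(pt_left, dtype=float).reshape(2, 1)
    pts_r = np.array(pt_right, dtype=float).reshape(2, 1)
    point_h = cv2.triangulatePoints(P_left, P_right, pts_l, pts_r)  # 4x1 homogeneous
    return (point_h[:3] / point_h[3]).ravel()
```

If the projection matrices map into the robot base frame, as the calibration described in the abstract does, the returned point is already a robot-frame wrist position; orientation would additionally require triangulating several LEDs rigidly mounted on the marker.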
2014
Authors
Rocha, LF; Ferreira, M; Santos, V; Moreira, AP;
Publication
ROBOTICS AND COMPUTER-INTEGRATED MANUFACTURING
Abstract
The research work presented in this paper focuses on the development of a 3D object localization and recognition system to be used in robotic conveyor coating lines. The requirements were specified together with enterprises with small production series that seek full robotic automation of production lines characterized by a wide range of products manufactured simultaneously. Their production process (for example heat or coating/painting treatments) limits the use of conventional identification systems attached to the objects being handled. Furthermore, the mechanical structure of the conveyor introduces geometric inaccuracy in the object positioning. With the correct classification and localization of the object, the robot is able to autonomously select the right program to execute and to perform coordinate system corrections. A cascade system combining a Support Vector Machine and the Perfect Match (point cloud geometric template matching) algorithm was developed for this purpose, achieving 99.5% accuracy. The entire recognition and pose estimation procedure is performed within a maximum of 3 s on standard off-the-shelf hardware. It is expected that this work contributes to the integration of industrial robots in highly dynamic and specialized production lines.
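The abstract describes a two-stage cascade (SVM classification followed by geometric template matching). The sketch below only illustrates that structure, using scikit-learn for the SVM stage and a caller-supplied scoring function standing in for the template-matching stage; the Perfect Match implementation itself is not reproduced, and all class names, parameters and thresholds are assumptions.

```python
import numpy as np
from sklearn.svm import SVC

class RecognitionCascade:
    """Illustrative two-stage cascade: an SVM proposes candidate classes,
    then a geometric template match against each candidate confirms the
    class and recovers the object pose."""

    def __init__(self, templates, match_fn, min_match_score=0.9):
        # templates : dict class_label -> template point cloud
        # match_fn  : callable(cloud, template) -> (score, pose); stands in
        #             for the geometric template-matching stage
        self.svm = SVC(kernel="rbf", probability=True)
        self.templates = templates
        self.match_fn = match_fn
        self.min_match_score = min_match_score

    def fit(self, features, labels):
        self.svm.fit(features, labels)
        return self

    def recognise(self, features, cloud, top_k=3):
        # Stage 1: rank candidate classes by SVM probability
        probs = self.svm.predict_proba(np.asarray(features).reshape(1, -1))[0]
        order = np.argsort(probs)[::-1][:top_k]
        # Stage 2: confirm with template matching and recover the pose
        for idx in order:
            label = self.svm.classes_[idx]
            score, pose = self.match_fn(cloud, self.templates[label])
            if score >= self.min_match_score:
                return label, pose
        return None, None
```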
2013
Authors
Pinto, AM; Rocha, LF; Paulo Moreira, AP;
Publication
ROBOTICS AND COMPUTER-INTEGRATED MANUFACTURING
Abstract
In recent years, computer vision has been widely used in industrial environments, allowing robots to perform important tasks like quality control, inspection and recognition. Vision systems are typically used to determine the position and orientation of objects in the workstation, enabling them to be transported and assembled by a robotic cell (e.g. an industrial manipulator). These systems commonly resort to CCD (Charge-Coupled Device) cameras either fixed in a particular work area or attached directly to the robotic arm (eye-in-hand vision system). Although this is a valid approach, the performance of these vision systems is directly influenced by the industrial environment lighting. Taking all this into consideration, a new approach is proposed for eye-in-hand systems, in which the cameras are replaced by a 2D Laser Range Finder (LRF). The LRF is attached to a robotic manipulator, which executes a pre-defined path to produce grayscale images of the workstation. With this technique the interference of the environment lighting is minimized, resulting in a more reliable and robust computer vision system. After the grayscale image is created, this work focuses on the recognition and classification of different objects using inherent features (based on the invariant moments of Hu) with the most well-known machine learning models: k-Nearest Neighbor (kNN), Neural Networks (NNs) and Support Vector Machines (SVMs). To achieve good performance with each classification model, a wrapper method is used to select a good subset of features, and a model assessment technique, k-fold cross-validation, is used to tune the parameters of the classifiers. The performance of the models is also compared, achieving accuracies of 83.5% for the kNN, 95.5% for the NN and 98.9% for the SVM (generalized accuracy). These high performances are related to the feature selection algorithm based on the simulated annealing heuristic and to the model assessment (k-fold cross-validation), which make it possible to identify the most important features in the recognition process and to adjust the best parameters for the machine learning models, increasing the classification rate for the work objects present in the robot's environment.
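The abstract outlines the feature and evaluation pipeline (Hu invariant moments, a wrapper feature selection, and k-fold cross-validation). The sketch below shows only the core of that pipeline: computing Hu moments with OpenCV and scoring an SVM by k-fold cross-validation with scikit-learn. The simulated-annealing wrapper is omitted, and the function names and parameter values are illustrative, not the paper's settings.

```python
import cv2
import numpy as np
from sklearn.model_selection import cross_val_score
from sklearn.svm import SVC

def hu_features(gray_image):
    """Compute the seven Hu invariant moments of an 8-bit grayscale image,
    log-scaled (a common practice) to compress their dynamic range."""
    moments = cv2.moments(gray_image)
    hu = cv2.HuMoments(moments).flatten()
    return -np.sign(hu) * np.log10(np.abs(hu) + 1e-30)

def evaluate_classifier(images, labels, C=10.0, gamma="scale", folds=5):
    """Score an RBF-kernel SVM on Hu-moment features with k-fold CV."""
    X = np.array([hu_features(img) for img in images])
    y = np.array(labels)
    clf = SVC(kernel="rbf", C=C, gamma=gamma)
    scores = cross_val_score(clf, X, y, cv=folds)
    return scores.mean(), scores.std()
```

A wrapper feature-selection loop such as the simulated-annealing search mentioned in the abstract would repeatedly call a function like `evaluate_classifier` on candidate feature subsets and keep the subset and parameters with the best cross-validated score.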