2020
Authors
Castro, AL; de Souza, JPC; Rocha, LF; Silva, MF;
Publication
INDUSTRIAL ROBOT-THE INTERNATIONAL JOURNAL OF ROBOTICS RESEARCH AND APPLICATION
Abstract
Purpose: This paper proposes an automated framework for the agile development and simulation of robotic palletizing cells. An automatic offline programming tool for a variety of robot brands is also introduced. Design/methodology/approach: The framework, named AdaptPack Studio, offers a custom-built library to assemble virtual models of palletizing cells, quickly connect these models by drag and drop, and perform offline programming of robots and factory equipment in a few short steps. Findings: Simulation and real tests showed an improvement in the design, development and operation of robotic palletizing systems. The AdaptPack Studio software was tested and evaluated in a pure simulation case and in a real-world scenario. Results proved to be concise and accurate, with minor model displacement inaccuracies due to differences between the virtual and real models. Research limitations/implications: The intuitive drag-and-drop layout modeling accelerates the design and setup of robotic palletizing cells and the automatic offline generation of robot programs. Furthermore, A*-based algorithms generate collision-free trajectories, discretized both in the robot joint space and in the Cartesian space. As a consequence, industrial solutions become available for production in record time, increasing the competitiveness of the companies using this tool. Originality/value: The AdaptPack Studio framework includes, in a single package, the possibility to program, simulate and generate robot code for four different brands of robots. The application is tailored for palletizing and specifically includes the components (Building Blocks) of a particular company, which allows very fast development of new solutions. In addition, the Trajectory Planner makes it possible to automatically generate collision-free robot trajectories.
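The abstract above credits an A*-based planner with generating collision-free trajectories discretized in joint or Cartesian space. As a point of reference only, the sketch below is a minimal A* search over a discretized 2-D Cartesian occupancy grid; the grid representation, unit step cost and Manhattan heuristic are illustrative assumptions, not the AdaptPack Studio Trajectory Planner itself.

```python
# Minimal A* search over a discretized 2-D Cartesian grid (illustrative sketch).
# AdaptPack Studio's planner also discretizes the robot joint space; the grid,
# cost model and heuristic here are assumptions for demonstration only.
import heapq

def astar(grid, start, goal):
    """grid[r][c] == 1 marks an obstacle; returns a collision-free path or None."""
    rows, cols = len(grid), len(grid[0])
    heuristic = lambda p: abs(p[0] - goal[0]) + abs(p[1] - goal[1])  # Manhattan distance
    open_set = [(heuristic(start), 0, start, [start])]
    visited = set()
    while open_set:
        _, cost, node, path = heapq.heappop(open_set)
        if node == goal:
            return path
        if node in visited:
            continue
        visited.add(node)
        for dr, dc in ((1, 0), (-1, 0), (0, 1), (0, -1)):
            r, c = node[0] + dr, node[1] + dc
            if 0 <= r < rows and 0 <= c < cols and grid[r][c] == 0 and (r, c) not in visited:
                heapq.heappush(open_set,
                               (cost + 1 + heuristic((r, c)), cost + 1, (r, c), path + [(r, c)]))
    return None

# Example: plan around a single obstacle cell.
grid = [[0, 0, 0],
        [0, 1, 0],
        [0, 0, 0]]
print(astar(grid, (0, 0), (2, 2)))
```

In a palletizing cell, the same search would run over a richer state space (joint configurations or Cartesian poses), with collision checks against the cell's virtual model replacing the binary occupancy grid.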
2021
Authors
de Souza, JPC; Rocha, LF; Oliveira, PM; Moreira, AP; Boaventura Cunha, J;
Publication
ROBOTICS AND COMPUTER-INTEGRATED MANUFACTURING
Abstract
The robotic grasping task persists as a modern industry problem that demands autonomous, fast-to-implement, and efficient techniques. Domestic robots are also a reality, requiring delicate and accurate human-machine interaction with precise robotic grasping and handling. From the analytical heuristics of decades ago to the recent deep learning policies, grasping in complex scenarios remains the aim of several works that propose distinctive approaches. In this context, this paper covers and discusses recently developed methodologies, showing the state-of-the-art challenges and the gap to deployment in industrial applications. Given the complexity of the problem and of the proposed methods, this paper formulates fair and transparent definitions for the assessment of results, providing researchers with a clear and standardised basis for comparing new proposals.
2021
Authors
Tinoco, V; Silva, MF; Santos, FN; Rocha, LF; Magalhaes, S; Santos, LC;
Publication
2021 IEEE INTERNATIONAL CONFERENCE ON AUTONOMOUS ROBOT SYSTEMS AND COMPETITIONS (ICARSC)
Abstract
The increase of the world population and the decrease in agricultural labour availability have motivated research on robotics for the agricultural field. This paper analyzes the state of the art of manipulators used in agricultural robotics. Two pruning and seven harvesting manipulators were reviewed and analyzed. The pruning manipulators were used in two different scenarios: (i) grapevines and (ii) apple trees. These manipulators showed that a light-controlled environment can reduce visual errors and that prismatic joints on the manipulator are advantageous for obtaining a higher reach. The harvesting manipulators were used for five different products: (i) strawberries, (ii) tomatoes, (iii) apples, (iv) sweet peppers and (v) iceberg lettuce. The harvesting manipulators showed that different kinematic configurations are required for different end-effectors, as some end-effectors only require horizontal movements while others require more degrees of freedom to reach and grasp the target. This work will support the development of novel solutions for agricultural robotic grasping and manipulation.
2021
Authors
de Souza, JPC; Rocha, LF; Filipe, VM; Boaventura Cunha, J; Moreira, AP;
Publication
2021 IEEE INTERNATIONAL CONFERENCE ON AUTONOMOUS ROBOT SYSTEMS AND COMPETITIONS (ICARSC)
Abstract
Nowadays, robotic welding joint estimation, or weld seam tracking, has improved with new developments in computer vision technologies. Typically, these advances focus on coping with the inaccuracies that arise from the manual positioning of the metal parts in welding workstations, especially in SMEs. Robotic arms endowed with the appropriate perception capabilities are a viable solution in this context, aiming to enhance the production system's agility without increasing the production set-up time and costs. In this regard, this paper proposes a local perception pipeline to estimate welding joint points using small-sized, low-cost 3D cameras, following an eye-in-hand approach. A metrological comparison between the Intel RealSense D435, the Intel RealSense D415 and the ZED Mini 3D cameras is also discussed, showing that the proposed pipeline, associated with standard commercial 3D cameras, is viable for welding operations in an industrial environment.
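The abstract does not detail the paper's local perception pipeline. As a hedged illustration of one common ingredient of such pipelines, the sketch below computes a seam line as the intersection of two planar surfaces, assuming the plane coefficients were already obtained (e.g. by RANSAC segmentation of the 3D camera's point cloud). The function name `seam_line` and the plane-pair formulation are assumptions for illustration, not the authors' method.

```python
# Sketch: a corner/fillet weld seam can be approximated as the intersection line
# of two planar surfaces fitted to the 3-D point cloud. The plane coefficients
# are assumed to come from a prior segmentation step (e.g. RANSAC).
import numpy as np

def seam_line(plane1, plane2):
    """Each plane is (a, b, c, d) for a*x + b*y + c*z + d = 0.
    Returns a point on the intersection line and its unit direction."""
    n1, d1 = np.array(plane1[:3], float), plane1[3]
    n2, d2 = np.array(plane2[:3], float), plane2[3]
    direction = np.cross(n1, n2)
    if np.linalg.norm(direction) < 1e-9:
        raise ValueError("planes are parallel - no seam line")
    # Solve for one point on the line by constraining its component along 'direction'.
    A = np.vstack([n1, n2, direction])
    b = np.array([-d1, -d2, 0.0])
    point = np.linalg.solve(A, b)
    return point, direction / np.linalg.norm(direction)

# Example: horizontal plate (z = 0) meeting a vertical plate (y = 0).
p, d = seam_line((0, 0, 1, 0), (0, 1, 0, 0))
print(p, d)  # seam runs along the x-axis
```

Welding waypoints would then be sampled along the returned line and restricted to the region where both planes actually have supporting points.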
2021
Authors
Moutinho, D; Rebelo, P; Costa, C; Rocha, L; Veiga, G;
Publication
2021 IEEE INTERNATIONAL CONFERENCE ON AUTONOMOUS ROBOT SYSTEMS AND COMPETITIONS (ICARSC)
Abstract
This paper presents a collaborative mobile manipulator assembly station which uses force control to overcome the positional uncertainties arising from unstructured work environments and from the positional errors of the mobile platform. For this purpose, the use case of an internal combustion engine for the automotive industry was considered. Several force control heuristics relying on blind searches using oscillations and/or environment exploration were developed and implemented. Particular attention was given to the orientation errors of the mobile platform, as it was shown that they have a significant impact on the assembly task. The proposed heuristics showed great potential for the use case at hand: when the orientation error of the platform is limited to +/- 2 degrees, the oscillation method complemented by environment exploration was able to overcome a maximum translation error of 32.3 mm, whereas the environment exploration complemented by orientation correction was able to overcome an error of 73.3 mm. Moreover, a generalization strategy was proposed, intending to expand the scope of the developed heuristics to other assembly applications.
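The abstract names blind-search heuristics based on oscillations and environment exploration, but not their internals. The sketch below shows one generic flavour of such a search: probing candidate offsets along an Archimedean spiral around the nominal insertion pose until an attempt succeeds. The functions `spiral_offsets` and `search_insertion` and the `try_insert` stand-in are hypothetical; a real implementation would drive the compliant robot and monitor contact forces instead of evaluating a geometric test.

```python
# Sketch of an oscillation/spiral blind-search heuristic: candidate offsets are
# probed around the nominal pose until an insertion attempt succeeds.
# 'try_insert' is a stand-in for the real force-controlled attempt, which would
# command the robot and monitor the measured contact forces.
import numpy as np

def spiral_offsets(pitch=1.0, arc_step=1.0, max_radius=35.0):
    """Yield (dx, dy) offsets in mm along an Archimedean spiral with roughly
    constant spacing between probes (arc_step) and 'pitch' mm between turns."""
    theta, radius = 0.0, 0.0
    while radius <= max_radius:
        yield radius * np.cos(theta), radius * np.sin(theta)
        theta += arc_step / max(radius, arc_step)   # keep probe spacing ~ arc_step
        radius = pitch * theta / (2 * np.pi)

def search_insertion(try_insert, pitch=1.0, arc_step=1.0, max_radius=35.0):
    """Return the first offset for which the insertion attempt succeeds."""
    for dx, dy in spiral_offsets(pitch, arc_step, max_radius):
        if try_insert(dx, dy):
            return dx, dy
    return None

# Toy example: the true hole is displaced by (12, -7) mm with a 1.5 mm clearance.
hole = np.array([12.0, -7.0])
found = search_insertion(lambda dx, dy: np.hypot(dx - hole[0], dy - hole[1]) < 1.5)
print(found)
```

In the paper's setting, orientation errors of the platform would additionally be corrected before or during the search, which is what the orientation-correction variant addresses.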
2021
Authors
Santos, J; Rebelo, PM; Rocha, LF; Costa, P; Veiga, G;
Publication
ROBOTICS
Abstract
A multi-AGV based logistic system is typically associated with two fundamental problems that are critical for its overall performance: the AGVs' route planning for collision and deadlock avoidance, and the task scheduling that determines which vehicle should transport which load. Several heuristic functions can be used, according to the application. This paper proposes a time-based algorithm to dynamically control a fleet of Autonomous Guided Vehicles (AGVs) in an automatic warehouse scenario. Our approach includes a routing algorithm based on the A* heuristic search (TEA* - Time Enhanced A*) to generate collision-free paths, and a scheduling module that improves the results of the routing algorithm. These modules work cooperatively, using the routing algorithm's information as a basis to achieve efficient task execution times. Simulation experiments are presented for a typical industrial layout with 10 and 20 AGVs, together with a comparison against an alternative approach from the state of the art.
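TEA* extends A* with a time dimension so that each vehicle avoids cells already reserved by previously planned AGVs. The sketch below is a generic space-time (cooperative) A* in that spirit, not the paper's implementation: the sequential planning order, the shared reservation set and the absence of edge-swap conflict handling are simplifying assumptions.

```python
# Sketch of a time-expanded A* in the spirit of TEA* (not the paper's code):
# each AGV plans over (cell, time) states, and a shared reservation table marks
# cells already occupied by previously planned vehicles at each time step.
import heapq

def plan(grid, start, goal, reservations, max_time=200):
    """grid[r][c] == 1 is a wall; reservations is a set of (r, c, t) already taken.
    Returns a list of cells, one per time step, or None."""
    rows, cols = len(grid), len(grid[0])
    h = lambda p: abs(p[0] - goal[0]) + abs(p[1] - goal[1])
    open_set = [(h(start), 0, start, [start])]
    seen = set()
    while open_set:
        _, t, cell, path = heapq.heappop(open_set)
        if cell == goal:
            return path
        if (cell, t) in seen or t >= max_time:
            continue
        seen.add((cell, t))
        # Waiting in place is a valid move in the time-expanded graph.
        for dr, dc in ((0, 0), (1, 0), (-1, 0), (0, 1), (0, -1)):
            r, c = cell[0] + dr, cell[1] + dc
            if (0 <= r < rows and 0 <= c < cols and grid[r][c] == 0
                    and (r, c, t + 1) not in reservations):
                heapq.heappush(open_set, (t + 1 + h((r, c)), t + 1, (r, c), path + [(r, c)]))
    return None

def plan_fleet(grid, tasks):
    """Plan AGVs sequentially, reserving each computed path in space-time."""
    reservations, paths = set(), []
    for start, goal in tasks:
        path = plan(grid, start, goal, reservations)
        paths.append(path)
        if path:
            reservations.update((r, c, t) for t, (r, c) in enumerate(path))
    return paths

# Two AGVs crossing a 3x3 open grid.
grid = [[0] * 3 for _ in range(3)]
print(plan_fleet(grid, [((0, 0), (2, 2)), ((2, 0), (0, 2))]))
```

A scheduling module such as the one described in the abstract would sit on top of this routing layer, choosing which vehicle gets which transport task based on the predicted path costs.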