2019
Authors
Braun, J; Brito, T; Lima, J; Costa, P; Costa, P; Nakano, A;
Publication
SIMULTECH 2019 - Proceedings of the 9th International Conference on Simulation and Modeling Methodologies, Technologies and Applications
Abstract
There is an increasing number of mobile robot applications. The demands of Industry 4.0 push robotics towards autonomous decision-making: autonomous robots should decide their path according to the dynamic environment. In some cases, time requirements must also be met, calling for fast path planning methods. This paper presents a comparison between well-known path planning methods using a realistic simulator that handles the dynamic properties of robot models, including sensors. The methodology is implemented in SimTwo, which allows the A* and RRT* algorithms to be compared in different scenarios with dynamic and real-time constraints.
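As a purely illustrative sketch in Python (not the paper's SimTwo implementation), a minimal grid-based A* planner of the kind compared above could look like the following; the occupancy grid, start and goal are assumed example inputs.

import heapq


def astar(grid, start, goal):
    """Minimal A* on a 2D occupancy grid (0 = free, 1 = obstacle).

    Illustrative sketch only. Returns a list of (row, col) cells from
    start to goal, or None if no path exists.
    """
    rows, cols = len(grid), len(grid[0])
    heuristic = lambda a, b: abs(a[0] - b[0]) + abs(a[1] - b[1])  # Manhattan
    open_set = [(heuristic(start, goal), 0, start, None)]
    came_from, g_cost = {}, {start: 0}

    while open_set:
        _, g, current, parent = heapq.heappop(open_set)
        if current in came_from:
            continue
        came_from[current] = parent
        if current == goal:
            # Reconstruct the path by walking back through the parents.
            path = []
            while current is not None:
                path.append(current)
                current = came_from[current]
            return path[::-1]
        r, c = current
        for nr, nc in ((r + 1, c), (r - 1, c), (r, c + 1), (r, c - 1)):
            if 0 <= nr < rows and 0 <= nc < cols and grid[nr][nc] == 0:
                new_g = g + 1
                if new_g < g_cost.get((nr, nc), float("inf")):
                    g_cost[(nr, nc)] = new_g
                    heapq.heappush(open_set, (new_g + heuristic((nr, nc), goal),
                                              new_g, (nr, nc), current))
    return None  # no path found


if __name__ == "__main__":
    grid = [[0, 0, 0, 1],
            [1, 1, 0, 1],
            [0, 0, 0, 0]]
    print(astar(grid, (0, 0), (2, 3)))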
2020
Authors
Braun, J; Fernandes, LA; Moya, T; Oliveira, V; Brito, T; Lima, J; Costa, P;
Publication
FOURTH IBERIAN ROBOTICS CONFERENCE: ADVANCES IN ROBOTICS, ROBOT 2019, VOL 1
Abstract
Teaching based on challenges and competitions is one of the most exciting and promising methods for engaging students. In this paper, a competition of the Portuguese Robotics Open is addressed and a solution is proposed. Robot@Factory Lite is a new challenge that accepts participants from secondary schools (Rookie) and universities. The concepts of simulation, hardware-in-the-loop and timed finite state machines are presented and validated on the real robot prototype. The aim of this paper is to disseminate the developed solution in order to attract more students to STEM educational programs.
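The solution relies on a timed finite state machine; the paper's firmware is not reproduced here, but a minimal Python sketch of the idea, with hypothetical state names and timeout values, could be:

import time


class TimedStateMachine:
    """Minimal timed finite state machine sketch (not the paper's firmware).

    Each state is a function returning the next state name; a per-state
    timeout forces a transition to a fallback state if it expires.
    """

    def __init__(self, states, initial, timeout_s, fallback):
        self.states = states          # dict: name -> handler()
        self.state = initial
        self.timeout_s = timeout_s    # dict: name -> max time in state
        self.fallback = fallback      # dict: name -> state on timeout
        self.entered_at = time.monotonic()

    def step(self):
        elapsed = time.monotonic() - self.entered_at
        if elapsed > self.timeout_s.get(self.state, float("inf")):
            nxt = self.fallback[self.state]       # timeout transition
        else:
            nxt = self.states[self.state]()       # normal transition
        if nxt != self.state:
            self.entered_at = time.monotonic()    # reset timer on change
            self.state = nxt


# Hypothetical Robot@Factory-style states: fetch a box, grab it, deliver it.
fsm = TimedStateMachine(
    states={"GO_TO_BOX": lambda: "GRAB", "GRAB": lambda: "DELIVER",
            "DELIVER": lambda: "GO_TO_BOX"},
    initial="GO_TO_BOX",
    timeout_s={"GRAB": 5.0},
    fallback={"GRAB": "GO_TO_BOX"},
)
for _ in range(3):
    fsm.step()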
2020
Authors
Brito, T; Lima, J; Costa, P; Matellan, V; Braun, J;
Publication
FOURTH IBERIAN ROBOTICS CONFERENCE: ADVANCES IN ROBOTICS, ROBOT 2019, VOL 1
Abstract
Collaboration between humans and machines, in which humans can share the same work environment without safety equipment thanks to collision avoidance, is one of the research topics of Industry 4.0. This work proposes a system that acquires the environment through an RGB-Depth sensor, checks the free space in the resulting point cloud and executes the trajectory of the collaborative manipulator while avoiding collisions. A simulated environment is demonstrated before deploying the system in real situations, in which the movements of pick-and-place tasks are defined while deviating from virtual obstacles detected with the RGB-Depth sensor. The results obtained in simulation indicate that the system can be applied in real situations with obstacles and humans. The basic structure of the system is supported by ROS, in particular MoveIt! and RViz; these tools serve both simulation and real applications. The obtained results validate the system using the PRM and RRT algorithms, chosen because they are commonly used in robot path planning.
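For illustration only (this is not the MoveIt!/ROS pipeline used in the paper), a minimal 2D RRT of the kind referenced above can be sketched in Python; the sampling bounds, step size and the is_free collision callback are assumed placeholders.

import math
import random


def rrt(start, goal, is_free, step=0.5, max_iters=2000, goal_tol=0.5):
    """Minimal 2D RRT sketch. `is_free(p)` is an assumed collision check
    over (x, y) points. Returns a list of waypoints or None."""
    nodes = [start]
    parents = {0: None}
    for _ in range(max_iters):
        sample = goal if random.random() < 0.1 else (random.uniform(0, 10),
                                                     random.uniform(0, 10))
        # Extend the nearest tree node one step towards the sample.
        i = min(range(len(nodes)), key=lambda k: math.dist(nodes[k], sample))
        near = nodes[i]
        d = math.dist(near, sample)
        if d == 0:
            continue
        new = (near[0] + step * (sample[0] - near[0]) / d,
               near[1] + step * (sample[1] - near[1]) / d)
        if not is_free(new):
            continue
        parents[len(nodes)] = i
        nodes.append(new)
        if math.dist(new, goal) < goal_tol:
            # Walk back up the tree to recover the path.
            path, k = [], len(nodes) - 1
            while k is not None:
                path.append(nodes[k])
                k = parents[k]
            return path[::-1]
    return None


if __name__ == "__main__":
    # Hypothetical circular obstacle at (5, 5) with radius 1.5.
    free = lambda p: math.dist(p, (5, 5)) > 1.5
    print(rrt((1, 1), (9, 9), free))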
2020
Authors
Braun, J; Piardi, L; Brito, T; Lima, J; Pereira, A; Costa, P; Nakano, A;
Publication
FOURTH IBERIAN ROBOTICS CONFERENCE: ADVANCES IN ROBOTICS, ROBOT 2019, VOL 2
Abstract
Inspection based on autonomous mobile robots can play an important role in many industries. Instead of having fixed sensors, mounting the sensors on a mobile robot that performs scanning and inspection along a defined path is cheaper, configurable and adaptable. This paper describes a mobile robot, equipped with several gas sensors and a LIDAR device, that scans an established area by following a waypoint-based trajectory, searching for gas leakage while avoiding obstacles in the map. In other words, the robot follows the trajectory, circumventing obstacles, while the gas concentration stays below a defined threshold; otherwise, the autonomous robot starts a leakage search based on a search algorithm that locates the leakage position. The proposed methodology is verified in simulation based on a model of the real robot, and the search tests performed in the simulation environment validate it.
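A minimal Python sketch of the supervisory logic described above (follow waypoints while the gas concentration stays below a threshold, otherwise hand over to a leakage search) is shown below; the threshold value, tolerance and return codes are assumptions, and the paper's actual search algorithm and obstacle avoidance are not reproduced.

def patrol_step(robot_pos, waypoints, wp_index, gas_ppm, threshold=200.0):
    """Supervisory logic sketch (assumed threshold and helper names).

    While the gas concentration stays below the threshold, the robot tracks
    the next waypoint; otherwise it switches to a local leakage search.
    """
    if gas_ppm >= threshold:
        return "SEARCH_LEAK", wp_index          # hand over to search routine
    target = waypoints[wp_index]
    if abs(robot_pos[0] - target[0]) + abs(robot_pos[1] - target[1]) < 0.1:
        wp_index = (wp_index + 1) % len(waypoints)  # advance along the path
    return "FOLLOW_WAYPOINT", wp_index


# Example: the second reading exceeds the (hypothetical) 200 ppm threshold.
mode, idx = patrol_step((0.0, 0.0), [(1.0, 0.0), (1.0, 1.0)], 0, gas_ppm=40.0)
mode, idx = patrol_step((1.0, 0.0), [(1.0, 0.0), (1.0, 1.0)], idx, gas_ppm=350.0)
print(mode)  # SEARCH_LEAK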
2021
Authors
Braun, J; Lima, J; Pereira, AI; Rocha, C; Costa, P;
Publication
Communications in Computer and Information Science
Abstract
The recent growth in the use of 3D printers by independent users has contributed to a rise in interest in 3D scanners. Current 3D scanning solutions are commonly expensive due to the inherent complexity of the process. A previously proposed low-cost scanner disregarded uncertainties intrinsic to the system and associated with the measurements, such as angles and offsets. This work presents an approach to estimate the optimal values of these parameters, minimizing the error during acquisition. The Particle Swarm Optimization algorithm was used to obtain the parameters that optimally fit the final point cloud to the scanned surfaces. Three tests were performed in which the Particle Swarm Optimization successfully converged to zero error, yielding the optimal parameters and validating the proposed methodology.
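As an illustrative sketch (not the paper's implementation), a compact Particle Swarm Optimization routine of the kind used to estimate such calibration parameters could look like the following in Python; the swarm size, coefficients and the toy error function are assumptions.

import random


def pso(cost, dim, bounds, n_particles=30, iters=200, w=0.7, c1=1.5, c2=1.5):
    """Minimal PSO sketch: minimizes `cost` over `dim` parameters
    (e.g. scanner angles and offsets), each constrained to bounds."""
    lo, hi = bounds
    pos = [[random.uniform(lo, hi) for _ in range(dim)] for _ in range(n_particles)]
    vel = [[0.0] * dim for _ in range(n_particles)]
    pbest = [p[:] for p in pos]
    pbest_cost = [cost(p) for p in pos]
    gbest = pbest[min(range(n_particles), key=lambda i: pbest_cost[i])][:]

    for _ in range(iters):
        for i in range(n_particles):
            for d in range(dim):
                # Velocity update: inertia + cognitive + social terms.
                vel[i][d] = (w * vel[i][d]
                             + c1 * random.random() * (pbest[i][d] - pos[i][d])
                             + c2 * random.random() * (gbest[d] - pos[i][d]))
                pos[i][d] = min(max(pos[i][d] + vel[i][d], lo), hi)
            c = cost(pos[i])
            if c < pbest_cost[i]:
                pbest[i], pbest_cost[i] = pos[i][:], c
                if c < cost(gbest):
                    gbest = pos[i][:]
    return gbest


# Toy calibration error: distance of two parameters from (1.0, -0.5).
err = lambda p: (p[0] - 1.0) ** 2 + (p[1] + 0.5) ** 2
print(pso(err, dim=2, bounds=(-5.0, 5.0)))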
2021
Authors
Braun, J; Lima, J; Pereira, AI; Rocha, C; Costa, P;
Publication
2021 26TH IEEE INTERNATIONAL CONFERENCE ON EMERGING TECHNOLOGIES AND FACTORY AUTOMATION (ETFA)
Abstract
Nowadays, with the availability of 3D printers, 3D scanners are becoming increasingly common, since they allow objects to be replicated by 3D printing, especially at small scales. However, most of these technologies are expensive due to the complexity of the task. Therefore, this work presents a prototype of a low-cost 3D scanning system for small objects using a point-cloud-to-stereolithography approach that was already validated in simulation in previous work. One restriction of this concept is that the objects must have a uniform shape, i.e., without discontinuities. The architecture is composed of two stepper motors (chosen for their precision), a rotating plate that allows 360-degree scans, and another rotating structure that allows the infrared distance sensor to scan the object from bottom to top (90 degrees). The prototype was validated in a real scenario with good results.
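A minimal Python sketch of how one reading (plate angle, sensor tilt, measured distance) can be turned into a 3D point is shown below; it uses an idealized geometry with assumed sensor radius and pivot height, not the paper's calibrated model.

import math


def reading_to_point(plate_deg, sensor_deg, distance_m,
                     sensor_radius_m=0.20, pivot_height_m=0.05):
    """Idealized reading-to-point conversion (assumed geometry).

    The plate rotates the object (azimuth); the sensor tilts from 0 to 90
    degrees (elevation) around a pivot placed `sensor_radius_m` from the
    rotation axis at height `pivot_height_m`, measuring distance along its
    viewing ray towards the axis.
    """
    az = math.radians(plate_deg)
    el = math.radians(sensor_deg)
    # Position along the sensor ray, expressed relative to the rotation axis.
    radial = sensor_radius_m - distance_m * math.cos(el)   # distance from axis
    height = pivot_height_m + distance_m * math.sin(el)    # height above plate
    # Undo the plate rotation to place the point in the fixed world frame.
    return (radial * math.cos(az), radial * math.sin(az), height)


# One full revolution of the plate at a single (hypothetical) sensor tilt.
cloud = [reading_to_point(a, 30.0, 0.15) for a in range(0, 360, 10)]
print(len(cloud), cloud[0])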