2022
Authors
Geros, AF; Cruz, R; de Chaumont, F; Cardoso, JS; Aguiar, P;
Publication
Abstract
2022
Authors
Gonçalves, T; Torto, IR; Teixeira, LF; Cardoso, JS;
Publication
CoRR
Abstract
2022
Authors
Rio-Torto, I; Campanico, AT; Pinho, P; Filipe, V; Teixeira, LF;
Publication
APPLIED SCIENCES-BASEL
Abstract
The still-prevalent use of paper conformity lists in the automotive industry has a serious negative impact on the performance of quality control inspectors. We propose instead a hybrid quality inspection system that combines automated detection with human feedback, to increase worker performance by reducing mental and physical fatigue, and to improve the adaptability and responsiveness of the assembly line to change. The system integrates hierarchical automatic detection of non-conforming vehicle parts with information visualization on a wearable device, presenting the results to the factory worker and obtaining human confirmation. Besides designing a novel 3D vehicle generator to create a digital representation of the non-conformity list and to collect automatically annotated training data, we apply and combine, in a novel way, state-of-the-art domain adaptation and pseudo-labeling methods in our real application scenario, in order to bridge the gap between the labeled data generated by the vehicle generator and the real unlabeled data collected on the factory floor. This methodology allows us to obtain, without any manual annotation of the real dataset, an example-based F1 score of 0.565 in an unconstrained scenario and 0.601 in a fixed-camera setup (improvements of 11 and 14.6 percentage points, respectively, over a baseline trained with purely simulated data). Feedback obtained from factory workers highlighted the usefulness of the proposed solution and showed that a truly hybrid assembly line, where machine and human work in symbiosis, increases both efficiency and accuracy in automotive quality control.
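The pseudo-labeling idea mentioned in the abstract can be sketched as follows. This is a generic, minimal illustration of confidence-thresholded pseudo-labeling, not the paper's implementation; the function name and threshold value are assumptions.

```python
# Minimal pseudo-labeling sketch: a model trained on labeled (simulated)
# data predicts labels for unlabeled (real) data; only predictions above
# a confidence threshold are kept and can be added to the training set.

def pseudo_label(predict_proba, unlabeled, threshold=0.9):
    """Return (sample, label) pairs whose top-class probability
    meets the confidence threshold."""
    selected = []
    for x in unlabeled:
        probs = predict_proba(x)                      # class probabilities
        label = max(range(len(probs)), key=probs.__getitem__)
        if probs[label] >= threshold:                 # keep confident only
            selected.append((x, label))
    return selected

# Toy usage with a hand-crafted stand-in "model":
model = lambda x: [0.95, 0.05] if x < 0 else [0.3, 0.7]
print(pseudo_label(model, [-2.0, -1.0, 3.0]))
# → [(-2.0, 0), (-1.0, 0)]  (the low-confidence sample 3.0 is dropped)
```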
2022
Authors
Marques, M; Lourenco, CD; Teixeira, LF;
Publication
PATTERN RECOGNITION AND IMAGE ANALYSIS (IBPRIA 2022)
Abstract
Automating the detection of interictal epileptiform discharges (IEDs) with deep learning models can reduce the time spent on epilepsy diagnosis, making the process faster and more reliable. It was demonstrated that deep sequence networks can effectively detect IEDs. Several deep networks were tested, of which the best three architectures reached average AUC values of 0.96, 0.95 and 0.94, with test specificity and sensitivity converging around 90%, indicating a good ability to detect IED samples in EEG records.
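For reference, the sensitivity and specificity that the abstract reports at around 90% are defined as follows. This is a generic illustration of the metrics, not the authors' evaluation code:

```python
def sensitivity_specificity(y_true, y_pred):
    """Sensitivity = TP / (TP + FN); specificity = TN / (TN + FP)."""
    tp = sum(1 for t, p in zip(y_true, y_pred) if t == 1 and p == 1)
    fn = sum(1 for t, p in zip(y_true, y_pred) if t == 1 and p == 0)
    tn = sum(1 for t, p in zip(y_true, y_pred) if t == 0 and p == 0)
    fp = sum(1 for t, p in zip(y_true, y_pred) if t == 0 and p == 1)
    return tp / (tp + fn), tn / (tn + fp)

# Toy example: 1 = IED-positive EEG window, 0 = background window
y_true = [1, 1, 1, 1, 0, 0, 0, 0, 0, 0]
y_pred = [1, 1, 1, 0, 0, 0, 0, 0, 0, 1]
sens, spec = sensitivity_specificity(y_true, y_pred)
print(sens, spec)  # → 0.75 0.8333...
```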
2022
Authors
Pinho, AJ; Georgieva, P; Teixeira, LF; Sánchez, JA;
Publication
IbPRIA
Abstract
2022
Authors
Rodrigues, ASF; Lopes, JC; Lopes, RP; Teixeira, LF;
Publication
OPTIMIZATION, LEARNING ALGORITHMS AND APPLICATIONS, OL2A 2022
Abstract
Facial expressions are one of the most common ways to externalize our emotions. However, the same emotion can manifest differently in the same person and varies across different people. Based on this, we developed a system capable of detecting a person's facial expressions in real time while the eyes are occluded (simulating the use of virtual reality glasses). To estimate the position of the eyes, in order to occlude them, Multi-task Cascaded Convolutional Networks (MTCNN) were used. A residual network, a VGG, and the combination of both models were used to classify 7 types of facial expressions (Angry, Disgust, Fear, Happy, Sad, Surprise, Neutral) on both the occluded and non-occluded datasets. The combination of both models achieved an accuracy of 64.9% on the occluded dataset and 62.8% without occlusion, using the FER-2013 dataset. The primary goal of this work was to evaluate the influence of occlusion, and the results show that most of the classification relies on the mouth and chin. Nevertheless, the results were far from the state of the art, which we expect to improve, mainly by adjusting the MTCNN.
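The eye-occlusion step described above can be sketched as a geometric operation on detected landmarks. A minimal sketch, assuming the left- and right-eye keypoints have already been obtained from a face detector such as MTCNN; the padding factor and example coordinates are illustrative, not the paper's values:

```python
def eye_occlusion_box(left_eye, right_eye, pad=0.4):
    """Return (x0, y0, x1, y1) for a rectangle covering both eyes,
    padded proportionally to the inter-eye distance (simulating
    VR-glasses occlusion over a face crop)."""
    (lx, ly), (rx, ry) = left_eye, right_eye
    d = ((rx - lx) ** 2 + (ry - ly) ** 2) ** 0.5   # inter-eye distance
    m = pad * d                                     # padding margin
    x0, x1 = min(lx, rx) - m, max(lx, rx) + m
    y0, y1 = min(ly, ry) - m, max(ly, ry) + m
    return x0, y0, x1, y1

# Example keypoints in pixel coordinates of a face crop:
box = eye_occlusion_box((60, 80), (110, 80))
print(box)  # → (40.0, 60.0, 130.0, 100.0)
```

The returned rectangle would then be filled (e.g. with black pixels) before the image is passed to the expression classifier.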