2025
Authors
Montenegro, H; Cardoso, MJ; Cardoso, JS;
Publication
CoRR
Abstract
2025
Authors
Ferreira, P; Zolfagharnasab, MH; Goncalves, T; Bonci, E; Mavioso, C; Cardoso, J; Cardoso, S;
Publication
IEEE Portuguese Meeting on Bioengineering, ENBENG
Abstract
This study presents an explainable content-based image retrieval system for predicting post-surgical aesthetic outcomes in breast cancer patients, comparing state-of-the-art vision transformers, convolutional neural networks, and B-cos architectures. Results show that vision transformers, particularly GC ViT and DaViT, outperform convolutional neural networks and B-cos architectures, achieving an adjusted discounted cumulative gain of up to 80.18%. This superior performance is attributed to their ability to model long-range dependencies while effectively capturing local information. B-cos networks underperform (64.28-70.19% adjusted discounted cumulative gain), likely due to oversimplified feature alignment unsuitable for clinical tasks. Explainability analysis using Integrated Gradients reveals that models primarily focus on breast regions but occasionally attend to irrelevant features (e.g., arm positioning), leading to retrieval errors and highlighting a semantic gap between learned visual similarities and clinical relevance. Future work aims to integrate anatomical segmentation and ensemble learning methods to enhance clinical alignment and address attention inaccuracies. Clinical Relevance: The content-based image retrieval system developed in this study aids clinicians by supporting surgical outcome prediction in breast cancer patients and streamlining the traditionally time-intensive task of manually identifying similar reference images for patient consultation. © 2025 IEEE.
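The retrieval quality metric named in this abstract, discounted cumulative gain, can be sketched in a few lines. This is a minimal illustration of the standard DCG/normalized-DCG computation, not the paper's exact "adjusted" variant, whose adjustment is not specified here:

```python
import math

def dcg(relevances):
    """Discounted cumulative gain: each relevance score is discounted
    by log2(rank + 1), so hits near the top of the ranking count more."""
    return sum(rel / math.log2(i + 2) for i, rel in enumerate(relevances))

def normalized_dcg(retrieved_rels):
    """DCG of the returned ranking divided by the ideal (best possible)
    DCG for the same set of relevance scores; 1.0 means a perfect order."""
    ideal = dcg(sorted(retrieved_rels, reverse=True))
    return dcg(retrieved_rels) / ideal if ideal > 0 else 0.0
```

For example, a retrieval that places a weakly relevant image first, `normalized_dcg([1, 3, 2])`, scores below 1.0, while the ideal order `[3, 2, 1]` scores exactly 1.0.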
2025
Authors
Proaño-Guevara D.; Lobo A.; Oliveira C.; Costa C.I.; Fontes-Carvalho R.; da Silva H.P.; Renna F.;
Publication
Computing in Cardiology
Abstract
We introduce a multimodal Signal Quality Indicator (SQI) for assessing the fidelity of synchronous electrocardiogram (ECG) and phonocardiogram (PCG) signals recorded in ambulatory, non-standardized settings. The method uses a bidirectional fiducial-matching algorithm to test the temporal alignment of QRS complexes and T waves (ECG) with S1 and S2 sounds (PCG), respectively. Validation employed 564 synchronous ECG–PCG pairs collected with the FDA-cleared Rijuven Cardiosleeve at the aortic, pulmonary, tricuspid, and mitral valve sites. Expert annotations served as ground truth. In a three-class task, the SQI reached an area under the ROC curve greater than 79%, showing strong discriminative power. This physiology-based metric supports batch-online monitoring and reliable quality control of opportunistic cardiac recordings.
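The core idea of bidirectional fiducial matching can be sketched as follows. This is an illustrative reconstruction, not the authors' algorithm: it checks QRS-to-S1 alignment in both directions within a tolerance (here 150 ms, an assumed value), so that both missed and spurious events lower the score:

```python
def match_fraction(ref, cand, tol):
    """Fraction of fiducial times in `ref` that have at least one
    candidate time in `cand` within `tol` seconds."""
    if not ref:
        return 0.0
    return sum(any(abs(r - c) <= tol for c in cand) for r in ref) / len(ref)

def bidirectional_sqi(qrs_times, s1_times, tol=0.15):
    """Score ECG->PCG and PCG->ECG matching and keep the worse direction,
    penalising both missed heart sounds and spurious detections."""
    return min(match_fraction(qrs_times, s1_times, tol),
               match_fraction(s1_times, qrs_times, tol))
```

With well-aligned recordings the score approaches 1.0; a spurious S1 detection with no nearby QRS complex pulls the PCG-to-ECG direction, and hence the minimum, down.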
2025
Authors
Purificato, E; Boratto, L; Vinagre, J;
Publication
UMAP (Adjunct Publication)
Abstract
2025
Authors
Silva, R; Ramos, G; Salimi, F;
Publication
SN Computer Science
Abstract
The main goal of this paper was to develop, implement, and test a practical framework for large-scale last-mile delivery problems that employs a combination of optimisation and machine learning while focussing on different routing methods. Delivery companies in big cities choose delivery orders based on the tacit knowledge of experienced drivers, since solving a large optimisation model with several variables is not a practical solution to meet their daily needs. This framework includes three phases of districting, sequencing, and routing, and in total 30 different variants were tested in different capacities. Using the power of machine learning, a model is trained and tuned to predict driving road distances, allowing the implementation of the whole framework and improving performance from analysing 2983 stops in several hours to 58,192 stops in less than 15 minutes. The results demonstrated that Inter 1 - Centroids is the best inter-district connection method, and one of the best variants in this framework is variant 26, which reduced total distances by up to 34.77% with 79 fewer drivers in a full-month analysis compared to the original routes of the delivery company. © The Author(s), under exclusive licence to Springer Nature Singapore Pte Ltd. 2025.
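The idea of predicting driving road distance instead of querying a routing engine for every stop pair can be sketched with a deliberately simple stand-in for the paper's tuned model: great-circle distance as the feature and a one-variable least-squares fit of a "detour factor". The closed-form regression here is an illustrative assumption, not the authors' machine-learning model:

```python
import math

def haversine_km(lat1, lon1, lat2, lon2):
    """Great-circle distance between two coordinates, in kilometres."""
    r = 6371.0  # mean Earth radius, km
    p1, p2 = math.radians(lat1), math.radians(lat2)
    dphi = math.radians(lat2 - lat1)
    dlmb = math.radians(lon2 - lon1)
    a = math.sin(dphi / 2) ** 2 + math.cos(p1) * math.cos(p2) * math.sin(dlmb / 2) ** 2
    return 2 * r * math.asin(math.sqrt(a))

def fit_detour_model(straight_km, road_km):
    """Least-squares fit of road = a * straight + b from observed pairs,
    so future road distances can be predicted without a routing engine."""
    n = len(straight_km)
    mx = sum(straight_km) / n
    my = sum(road_km) / n
    sxx = sum((x - mx) ** 2 for x in straight_km)
    sxy = sum((x - mx) * (y - my) for x, y in zip(straight_km, road_km))
    a = sxy / sxx
    b = my - a * mx
    return a, b
```

Once fitted on a sample of observed (straight-line, road) distance pairs, the model prices every candidate stop pair in constant time, which is what makes scaling from thousands to tens of thousands of stops feasible.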
2025
Authors
Rodrigues Ferraz Esteves, AR; Campos Magalhães, EM; Bernardes De Almeida, G;
Publication
SAE Technical Papers
Abstract
Silent motors are an excellent strategy to combat noise pollution. Still, they can pose risks for pedestrians who rely on auditory cues for safety and reduce driver awareness due to the absence of the familiar sounds of combustion engines. Sound design for silent motors not only tackles the above issues but goes beyond safety standards towards a user-centered approach by considering how users perceive and interpret sounds. This paper examines the evolving field of sound design for electric vehicles (EVs), focusing on Acoustic Vehicle Alerting Systems (AVAS). The study analyzes existing AVAS, classifying them into different groups according to their design characteristics, from technical concerns and approaches to aesthetic properties. Based on the proposed classification, an (adaptive) sound design methodology and concept for AVAS are proposed, based on state-of-the-art technologies and tools (APIs), like Wwise Automotive, and integration through a functional prototype within a virtual environment. We validate our solution by conducting user tests focusing on EV sound perception and preferences in rural and urban environments. Results showed that, for the AVAS in rural areas, participants preferred nature-like and melodic sounds with a wide range of frequencies, emphasizing 1000 Hz. For the interior experience, melodic, reliable, and relaxing sounds with a frequency range from 200 Hz to 500 Hz were preferred. In urban areas, melodic, futuristic, but not overpowering sounds (80 Hz to 700 Hz) with balanced frequencies at high speeds were chosen for the car's exterior. In the interior, melodic, futuristic, and combustion engine-like sounds with a low-frequency background and higher frequencies at high speeds were also preferred. © 2025 SAE International. All Rights Reserved.
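One building block of an adaptive AVAS like the one described above is mapping vehicle speed onto a sound parameter before handing it to the audio middleware. As a minimal sketch (not the paper's prototype or the Wwise API), this maps speed linearly onto the 80-700 Hz band the urban-exterior tests favoured; the linear shape and the 50 km/h ceiling are illustrative assumptions:

```python
def avas_fundamental_hz(speed_kmh, low_hz=80.0, high_hz=700.0, max_speed_kmh=50.0):
    """Interpolate the AVAS fundamental frequency within a preferred band
    as speed rises, clamping at the band edges outside [0, max_speed_kmh]."""
    t = max(0.0, min(speed_kmh / max_speed_kmh, 1.0))
    return low_hz + t * (high_hz - low_hz)
```

In a middleware-driven design, a value like this would be fed to the engine as a real-time control parameter each frame, with the band endpoints swapped per context (e.g. the 200-500 Hz interior band reported above).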