2023
Authors
Freitas, N; Silva, D; Mavioso, C; Cardoso, MJ; Cardoso, JS;
Publication
BIOENGINEERING-BASEL
Abstract
Breast cancer conservative treatment (BCCT) is a form of treatment commonly used for patients with early breast cancer. This procedure consists of removing the cancer and a small margin of surrounding tissue, while leaving the healthy tissue intact. In recent years, this procedure has become increasingly common due to identical survival rates and better cosmetic outcomes than other alternatives. Although significant research has been conducted on BCCT, there is no gold standard for evaluating the aesthetic results of the treatment. Recent works have proposed the automatic classification of cosmetic results based on breast features extracted from digital photographs. The computation of most of these features requires the representation of the breast contour, which becomes key to the aesthetic evaluation of BCCT. State-of-the-art methods use conventional image processing tools that automatically detect breast contours by computing the shortest path over the Sobel filter response in a 2D digital photograph of the patient. However, because the Sobel filter is a general edge detector, it treats edges indistinguishably, i.e., it detects many edges that are not relevant to breast contour detection and misses weak breast contours. In this paper, we propose an improvement to this method that replaces the Sobel filter with a novel neural network solution to improve shortest-path-based breast contour detection. The proposed solution learns effective representations for the edges between the breasts and the torso wall. We obtain state-of-the-art results on the dataset used to develop previous models. Furthermore, we tested these models on a new dataset containing more variable photographs and show that the new approach generalizes better, as the previously developed deep models do not perform as well when tested on a different dataset. The main contribution of this paper is to further improve models that automatically and objectively classify BCCT aesthetic results by improving upon the current standard technique for detecting breast contours in digital photographs. Moreover, the models introduced are simple to train and test on new datasets, which makes this approach easily reproducible.
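For context, the baseline pipeline described in the abstract treats contour detection as a shortest-path problem over an edge-strength map (classically the Sobel response, which the paper proposes to replace with a learned edge map). The sketch below is only a minimal illustration of that idea on a synthetic image, not the authors' implementation; the cost function, the grid Dijkstra routine and the start/end points are assumptions.

```python
# Illustrative sketch only: shortest-path contour extraction over an edge map.
# In the proposed method the edge map would come from a neural network instead
# of the Sobel filter; all names and parameters here are hypothetical.
import heapq
import numpy as np
from scipy import ndimage

def edge_cost_map(image):
    """Cost map from Sobel gradient magnitude: strong edges -> low traversal cost."""
    gx = ndimage.sobel(image, axis=1)
    gy = ndimage.sobel(image, axis=0)
    mag = np.hypot(gx, gy)
    mag = mag / (mag.max() + 1e-8)
    return 1.0 - mag  # walking along strong edges is cheap

def shortest_path_contour(cost, start, goal):
    """Dijkstra on the pixel grid between two (row, col) endpoints."""
    h, w = cost.shape
    dist = np.full((h, w), np.inf)
    prev = {}
    dist[start] = 0.0
    pq = [(0.0, start)]
    while pq:
        d, (r, c) = heapq.heappop(pq)
        if (r, c) == goal:
            break
        if d > dist[r, c]:
            continue
        for dr, dc in ((-1, 0), (1, 0), (0, -1), (0, 1),
                       (-1, -1), (-1, 1), (1, -1), (1, 1)):
            nr, nc = r + dr, c + dc
            if 0 <= nr < h and 0 <= nc < w:
                nd = d + cost[nr, nc]
                if nd < dist[nr, nc]:
                    dist[nr, nc] = nd
                    prev[(nr, nc)] = (r, c)
                    heapq.heappush(pq, (nd, (nr, nc)))
    # Backtrack from goal to start to recover the contour.
    path, node = [], goal
    while node != start:
        path.append(node)
        node = prev[node]
    path.append(start)
    return path[::-1]

if __name__ == "__main__":
    # Synthetic image with a bright curved band standing in for a breast contour.
    yy, xx = np.mgrid[0:64, 0:64]
    image = np.exp(-((yy - (32 + 10 * np.sin(xx / 10.0))) ** 2) / 8.0)
    contour = shortest_path_contour(edge_cost_map(image), start=(32, 0), goal=(32, 63))
    print(f"contour length: {len(contour)} pixels")
```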
2023
Authors
Silva, D; Agrotis, G; Tan, RB; Teixeira, LF; Silva, W;
Publication
International Conference on Machine Learning and Applications, ICMLA 2023, Jacksonville, FL, USA, December 15-17, 2023
Abstract
Deep Learning models are tremendously valuable in several prediction tasks, and their use in the medical field is spreading rapidly, especially in computer vision tasks that evaluate the content of X-rays, CTs or MRIs. These methods can save doctors a significant amount of time in patient diagnostics and help in treatment planning. However, these models are highly sensitive to confounders in the training data and generally suffer a performance drop when dealing with out-of-distribution data, affecting their reliability and scalability across different medical institutions. Deep Learning research on medical datasets may overlook essential details regarding the image acquisition procedure and the preprocessing steps. This work proposes a data-centric approach, exploring the potential of attention maps as a regularisation technique to improve robustness and generalisation. We use image metadata and explore self-attention maps and contrastive learning to promote feature-space invariance to image disturbance. Experiments were conducted using publicly available Chest X-ray datasets. Some datasets contained information about the windowing settings applied by the radiologist, acting as a source of variability. The proposed model was tested and outperformed the baseline on out-of-distribution data, serving as a proof of concept. © 2023 IEEE.
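As a rough illustration of the kind of invariance objective described in the abstract, the sketch below adds a cosine-similarity consistency term between the embeddings of an image and a crudely re-windowed copy of it. It is not the authors' model: the encoder, the windowing simulation and the loss weight lambda_inv are all hypothetical, and the actual method also uses self-attention maps and image metadata, which are omitted here.

```python
# Illustrative sketch only: a consistency loss that pushes the feature space to be
# invariant to simulated windowing of a chest X-ray. All components are hypothetical.
import torch
import torch.nn as nn
import torch.nn.functional as F

class TinyEncoder(nn.Module):
    def __init__(self, embed_dim=64, num_classes=2):
        super().__init__()
        self.features = nn.Sequential(
            nn.Conv2d(1, 16, 3, stride=2, padding=1), nn.ReLU(),
            nn.Conv2d(16, 32, 3, stride=2, padding=1), nn.ReLU(),
            nn.AdaptiveAvgPool2d(1), nn.Flatten(),
            nn.Linear(32, embed_dim),
        )
        self.classifier = nn.Linear(embed_dim, num_classes)

    def forward(self, x):
        z = self.features(x)          # embedding used for the invariance term
        return z, self.classifier(z)  # logits used for the classification term

def simulate_windowing(x, low=0.2, high=0.8):
    """Crudely mimic a different radiologist window by clipping and rescaling intensities."""
    return (x.clamp(low, high) - low) / (high - low)

def training_step(model, x, y, lambda_inv=0.5):
    z, logits = model(x)
    z_aug, _ = model(simulate_windowing(x))
    cls_loss = F.cross_entropy(logits, y)
    # Encourage the embedding of the re-windowed view to match the original view.
    inv_loss = 1.0 - F.cosine_similarity(z, z_aug, dim=1).mean()
    return cls_loss + lambda_inv * inv_loss

if __name__ == "__main__":
    model = TinyEncoder()
    x = torch.rand(4, 1, 64, 64)      # batch of fake X-ray crops with values in [0, 1]
    y = torch.randint(0, 2, (4,))
    print(f"combined loss: {training_step(model, x, y).item():.4f}")
```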