2019
Authors
Ferreira, CA; Penas, S; Silva, J; Mendonca, AM;
Publication
2019 6TH IEEE PORTUGUESE MEETING IN BIOENGINEERING (ENBENG)
Abstract
Central serous chorioretinopathy is a retinal disease in which fluid leaks into the subretinal space, resulting in mild to moderate loss of visual acuity. Sequences of images from a fluorescein angiography exam are most of the time used for analyzing these leaks. This work presents a diagnostic aid method to detect and characterize the progression of the fluid area along the exam, in order to provide a second opinion and increase the focus and speed of the ophthalmologists' analysis. The method is based on a comparative approach using image subtraction between the late and early frames. The obtained segmentation results are quite promising, with an average Dice coefficient of 0.801 +/- 0.106 for the training set and 0.774 +/- 0.106 for the test set.
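The core of the comparative approach described above can be sketched as a late-minus-early frame subtraction followed by thresholding, with the Dice coefficient used for evaluation. This is a minimal illustrative reduction, not the authors' implementation; the threshold value and normalization are assumptions.

```python
import numpy as np

def leak_mask(early, late, thresh=0.2):
    """Segment a hyperfluorescent leak by subtracting an early frame from
    a late frame and thresholding the difference (illustrative
    simplification; the threshold value is an assumption)."""
    diff = late.astype(float) - early.astype(float)
    diff = np.clip(diff, 0, None)                # keep only intensity increases
    diff /= diff.max() if diff.max() > 0 else 1  # normalize to [0, 1]
    return diff > thresh

def dice(pred, gt):
    """Dice coefficient between two binary masks, as used for evaluation."""
    inter = np.logical_and(pred, gt).sum()
    return 2 * inter / (pred.sum() + gt.sum())
```

A perfect segmentation yields a Dice coefficient of 1.0; the reported averages of about 0.8 indicate substantial but imperfect overlap with the reference.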
2019
Authors
Wanderley, DS; Araujo, T; Carvalho, CB; Maia, C; Penas, S; Carneiro, A; Mendonca, AM; Campilho, A;
Publication
2019 6TH IEEE PORTUGUESE MEETING IN BIOENGINEERING (ENBENG)
Abstract
This study describes a novel dataset with retinal image quality annotations, defined by three different retinal experts, and presents an inter-observer analysis for quality assessment that can be used as a gold standard for future studies. A state-of-the-art algorithm for retinal image quality assessment is also analysed and compared against the specialists' performance. Results show that, for 71% of the images in the dataset, the three experts agree on the given image quality label. The results obtained for accuracy, specificity and sensitivity when comparing one expert against another were in the ranges [83.0 - 85.2]%, [72.7 - 92.9]% and [80.0 - 94.7]%, respectively. The evaluated automatic quality assessment method, despite not being trained on the novel dataset, achieves a performance within the inter-observer variability.
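The inter-observer comparison above treats one expert's binary quality labels as the reference and scores another expert against them. A minimal sketch of that kind of pairwise comparison, assuming 1 = good quality and 0 = bad quality (the labeling convention is an assumption):

```python
import numpy as np

def pairwise_agreement(labels_a, labels_b):
    """Accuracy, specificity, and sensitivity of rater A's binary labels
    scored against rater B's labels taken as the reference."""
    a = np.asarray(labels_a, bool)
    b = np.asarray(labels_b, bool)
    tp = np.sum(a & b)      # both say good
    tn = np.sum(~a & ~b)    # both say bad
    fp = np.sum(a & ~b)     # A says good, reference says bad
    fn = np.sum(~a & b)     # A says bad, reference says good
    accuracy = (tp + tn) / a.size
    specificity = tn / (tn + fp)
    sensitivity = tp / (tp + fn)
    return accuracy, specificity, sensitivity
```

Note that the metrics are not symmetric: swapping which expert is the reference exchanges sensitivity and specificity, which is why the paper reports ranges rather than single values.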
2019
Authors
Costa, P; Araujo, T; Aresta, G; Galdran, A; Mendonca, AM; Smailagic, A; Campilho, A;
Publication
PROCEEDINGS OF MVA 2019 16TH INTERNATIONAL CONFERENCE ON MACHINE VISION APPLICATIONS (MVA)
Abstract
Diabetic Retinopathy (DR) is one of the leading causes of preventable blindness in the developed world. With the increasing number of diabetic patients, there is a growing need for an automated system for DR detection. We propose EyeWeS, a method that not only detects DR in eye fundus images but also pinpoints the regions of the image that contain lesions, while being trained with image labels only. We show that it is possible to convert any pre-trained convolutional neural network into a weakly-supervised model while increasing its performance and efficiency. EyeWeS improved the results of Inception V3 from 94.9% Area Under the Receiver Operating Curve (AUC) to 95.8% AUC while retaining only approximately 5% of Inception V3's number of parameters. The same model is able to achieve 97.1% AUC in a cross-dataset experiment.
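The weakly-supervised conversion described above can be illustrated by scoring each spatial location of a pre-trained backbone's feature map and pooling the score map into an image-level prediction, so the same model both classifies and localizes. The shapes and names below are illustrative assumptions, not the paper's exact architecture:

```python
import numpy as np

def weak_supervision_head(feature_map, w, b):
    """Hedged sketch of an EyeWeS-style head: a 1x1 convolution maps each
    spatial location of a pre-trained CNN's feature map to a lesion score,
    and global max pooling turns the score map into an image-level score.

    feature_map: (H, W, C) activations from a pre-trained backbone
    w: (C,) weights of the 1x1 convolution; b: scalar bias
    """
    score_map = feature_map @ w + b   # per-location lesion evidence (H, W)
    image_score = score_map.max()     # image-level DR score via max pooling
    return score_map, image_score
```

Because only the image-level score is supervised during training, the intermediate score map comes for free and can be inspected to pinpoint lesion regions.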
2020
Authors
Rocha, J; Cunha, A; Mendonca, AM;
Publication
JOURNAL OF MEDICAL SYSTEMS
Abstract
Lung cancer is considered one of the deadliest diseases in the world. Early and accurate diagnosis depends on the detection and characterization of pulmonary nodules, which is of vital importance for increasing patients' survival rates. This characterization relies on a segmentation process that faces several challenges due to the diversity in nodular shape, size, and texture, as well as the presence of adjacent structures. This paper tackles pulmonary nodule segmentation in computed tomography scans, proposing three distinct methodologies. The first is a conventional approach that applies the Sliding Band Filter (SBF) to estimate the filter's support points, which match the border coordinates. The remaining approaches are Deep Learning based, using the U-Net and a novel network called SegU-Net to achieve the same goal. Their performance is compared, as this work aims to identify the most promising tool to improve nodule characterization. All methodologies used 2653 nodules from the LIDC database, achieving a Dice score of 0.663, 0.830, and 0.823 for the SBF, U-Net, and SegU-Net, respectively. The U-Net based models thus yield results closer to the ground truth reference annotated by specialists, making them a more reliable approach for the proposed exercise. The novel network achieved scores similar to the U-Net while reducing computational cost and improving memory efficiency. Consequently, this study may contribute to the possible implementation of this model in a decision support system, assisting physicians in establishing a reliable diagnosis of lung pathologies based on this segmentation task.
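The border-estimation idea behind the SBF approach above can be reduced to a simple radial search: along each direction from an estimated nodule center, pick the radius where the intensity drops most sharply, and take those points as border support points. This is a deliberately simplified sketch of that idea, not the authors' SBF implementation; all parameter values are assumptions.

```python
import numpy as np

def radial_border(image, cx, cy, n_rays=32, r_min=2, r_max=20):
    """For each of n_rays directions from (cx, cy), return the radius with
    the strongest outward intensity drop, approximating border support
    points (illustrative reduction of the Sliding Band Filter idea)."""
    h, w = image.shape
    border = []
    for theta in np.linspace(0, 2 * np.pi, n_rays, endpoint=False):
        best_r, best_drop = r_min, -np.inf
        for r in range(r_min, r_max):
            x0 = int(round(cx + r * np.cos(theta)))
            y0 = int(round(cy + r * np.sin(theta)))
            x1 = int(round(cx + (r + 1) * np.cos(theta)))
            y1 = int(round(cy + (r + 1) * np.sin(theta)))
            if not (0 <= x0 < w and 0 <= y0 < h and 0 <= x1 < w and 0 <= y1 < h):
                break  # ray left the image; keep best transition found so far
            drop = image[y0, x0] - image[y1, x1]  # intensity fall across the band
            if drop > best_drop:
                best_drop, best_r = drop, r
        border.append((best_r, theta))
    return border
```

In the actual SBF the band position is optimized jointly over all directions with regularity constraints between neighboring rays, which is what makes the estimated border coherent; the independent per-ray search here only conveys the intuition.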
2020
Authors
Rocha, J; Cunha, A; Mendonca, AM;
Publication
XV MEDITERRANEAN CONFERENCE ON MEDICAL AND BIOLOGICAL ENGINEERING AND COMPUTING - MEDICON 2019
Abstract
This paper proposes a conventional approach for pulmonary nodule segmentation that uses the Sliding Band Filter to estimate the center of the nodule and, consequently, the filter's support points, which match the initial border coordinates. This preliminary segmentation is then refined to include mainly the nodular area and exclude other regions (e.g. vessels and the pleural wall). The algorithm was tested on 2653 nodules from the LIDC database and achieved a Dice score of 0.663, yielding results similar to the ground truth reference, and is thus a promising tool to promote early lung cancer screening and improve nodule characterization.
2020
Authors
Porwal, P; Pachade, S; Kokare, M; Deshmukh, G; Son, J; Bae, W; Liu, LH; Wang, J; Liu, XH; Gao, LX; Wu, TB; Xiao, J; Wang, FY; Yin, BC; Wang, YZ; Danala, G; He, LS; Choi, YH; Lee, YC; Jung, SH; Li, ZY; Sui, XD; Wu, JY; Li, XL; Zhou, T; Toth, J; Bara, A; Kori, A; Chennamsetty, SS; Safwan, M; Alex, V; Lyu, XZ; Cheng, L; Chu, QH; Li, PC; Ji, X; Zhang, SY; Shen, YX; Dai, L; Saha, O; Sathish, R; Melo, T; Araujo, T; Harangi, B; Sheng, B; Fang, RG; Sheet, D; Hajdu, A; Zheng, YJ; Mendonca, AM; Zhang, ST; Campilho, A; Zheng, B; Shen, D; Giancardo, L; Quellec, G; Meriaudeau, F;
Publication
MEDICAL IMAGE ANALYSIS
Abstract
Diabetic Retinopathy (DR) is the most common cause of avoidable vision loss, predominantly affecting the working-age population across the globe. Screening for DR, coupled with timely consultation and treatment, is a globally trusted policy to avoid vision loss. However, implementation of DR screening programs is challenging due to the scarcity of medical professionals able to screen a growing global diabetic population at risk for DR. Computer-aided disease diagnosis in retinal image analysis could provide a sustainable approach for such a large-scale screening effort. The recent scientific advances in computing capacity and machine learning approaches provide an avenue for biomedical scientists to reach this goal. Aiming to advance the state-of-the-art in automatic DR diagnosis, a grand challenge on "Diabetic Retinopathy - Segmentation and Grading" was organized in conjunction with the IEEE International Symposium on Biomedical Imaging (ISBI-2018). In this paper, we report the set-up and results of this challenge, which is primarily based on the Indian Diabetic Retinopathy Image Dataset (IDRiD). There were three principal sub-challenges: lesion segmentation, disease severity grading, and localization and segmentation of retinal landmarks. The multiplicity of tasks allows testing the generalizability of algorithms, which distinguishes this challenge from existing ones. It received a positive response from the scientific community, with 148 submissions from 495 registrations effectively entering the challenge. This paper outlines the challenge, its organization, the dataset used, the evaluation methods, and the results of the top-performing participating solutions. The top-performing approaches utilized a blend of clinical information, data augmentation, and an ensemble of models. These findings have the potential to enable new developments in retinal image analysis and image-based DR screening in particular.