
Publications by BIO

2022

Leveraging CMR for 3D echocardiography: an annotated multimodality dataset for AI

Authors
Zhao, D; Ferdian, E; Maso Talou, GD; Gilbert, K; Quill, GM; Wang, VY; Pedrosa, J; D'hooge, J; Sutton, T; Lowe, BS; Legget, ME; Ruygrok, PN; Doughty, RN; Young, AA; Nash, MP;

Publication
European Heart Journal - Cardiovascular Imaging

Abstract
Funding Acknowledgements: Type of funding sources: Public grant(s) – National budget only. Main funding source(s): Health Research Council of New Zealand (HRC); National Heart Foundation of New Zealand (NHF).

Segmentation of the left ventricular myocardium and cavity in 3D echocardiography (3DE) is a critical task for the quantification of systolic function in heart disease. Continuing advances in 3DE have considerably improved image quality, prompting increased clinical uptake in recent years, particularly for volumetric measurements. Nevertheless, analysis of 3DE remains a difficult problem due to inherently complex noise characteristics, anisotropic image resolution, and regions of acoustic dropout. One of the primary challenges associated with the development of automated methods for 3DE analysis is the requirement for a sufficiently large training dataset. Historically, ground truth annotations have been difficult to obtain due to the high degree of inter- and intra-observer variability associated with manual 3DE segmentation, thus limiting the scope of AI-based solutions. To address the lack of expert consensus, we instead used labels derived from cardiac magnetic resonance (CMR) images of the same subjects. By spatiotemporally registering CMR labels to the corresponding 3DE image data on a per-subject basis (Figure 1), we collated 520 annotated 3DE images from a mixed cohort of 130 human subjects (2 independent single-beat acquisitions per subject at end-diastole and end-systole) consisting of healthy controls and patients with acquired cardiac disease. Comprising images acquired across a range of patient demographics, this curated dataset exhibits variation in image quality, 3DE acquisition parameters, and left ventricular shape and pose within the 3D image volume. To demonstrate the utility of such a dataset, nnU-Net, a self-configuring deep learning method for semantic segmentation, was employed. An 80/20 split of the dataset was used for training and testing, respectively, and data augmentations were applied in the form of scaling, rotation, and reflection. The trained network was capable of reproducing measurements derived from CMR for end-diastolic volume, end-systolic volume, ejection fraction, and mass, while outperforming an expert human observer in terms of accuracy as well as scan-rescan reproducibility (Table I). As part of ongoing efforts to improve the accuracy and efficiency of 3DE analysis, we have leveraged the high resolution and signal-to-noise ratio of CMR (relative to 3DE) to create a novel, publicly available benchmark dataset for developing and evaluating 3DE labelling methods. This approach not only significantly reduces the effects of observer-specific bias and variability in training data arising from conventional manual 3DE analysis methods, but also improves the agreement between cardiac indices derived from 3DE and CMR.

Figure 1. Data annotation workflow
Table I. Results
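As an illustration only, the Python sketch below shows what the subject-level 80/20 split and the scaling, rotation, and reflection augmentations named in the abstract could look like; the subject identifiers, augmentation ranges, and helper names are assumptions, not part of the released dataset or of the nnU-Net pipeline actually used.

# Illustrative sketch only: subject-level 80/20 split plus the scaling,
# rotation, and reflection augmentations named in the abstract. The ranges
# and identifiers below are assumptions, not the authors' configuration.
import numpy as np
from scipy import ndimage
from sklearn.model_selection import train_test_split

subject_ids = [f"subj_{i:03d}" for i in range(130)]  # 130 subjects, 4 annotated 3DE images each

# Split at the subject level so that all acquisitions of a subject
# (end-diastole and end-systole, 2 single-beat scans each) stay together.
train_ids, test_ids = train_test_split(subject_ids, test_size=0.2, random_state=0)

def augment(volume, label, rng):
    """Random scaling, rotation, and reflection of a 3D image/label pair."""
    scale = rng.uniform(0.9, 1.1)                                   # assumed range
    volume = ndimage.zoom(volume, scale, order=1)
    label = ndimage.zoom(label, scale, order=0)                     # nearest-neighbour for labels
    angle = rng.uniform(-15.0, 15.0)                                # assumed range, degrees
    volume = ndimage.rotate(volume, angle, axes=(1, 2), reshape=False, order=1)
    label = ndimage.rotate(label, angle, axes=(1, 2), reshape=False, order=0)
    if rng.random() < 0.5:                                          # reflection
        volume, label = np.flip(volume, axis=2), np.flip(label, axis=2)
    return volume, label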

2022

SYN-MAD 2022: Competition on Face Morphing Attack Detection Based on Privacy-aware Synthetic Training Data

Authors
Huber, M; Boutros, F; Luu, AT; Raja, K; Ramachandra, R; Damer, N; Neto, PC; Goncalves, T; Sequeira, AF; Cardoso, JS; Tremoco, J; Lourenco, M; Serra, S; Cermeno, E; Ivanovska, M; Batagelj, B; Kronovsek, A; Peer, P; Struc, V;

Publication
2022 IEEE INTERNATIONAL JOINT CONFERENCE ON BIOMETRICS (IJCB)

Abstract
This paper presents a summary of the Competition on Face Morphing Attack Detection Based on Privacy-aware Synthetic Training Data (SYN-MAD) held at the 2022 International Joint Conference on Biometrics (IJCB 2022). The competition attracted a total of 12 participating teams from both academia and industry, based in 11 different countries. In the end, seven valid submissions were received and evaluated by the organizers. The competition was held to present and attract solutions that detect face morphing attacks while protecting people's privacy for ethical and legal reasons. To ensure this, the training data was limited to synthetic data provided by the organizers. The submitted solutions presented innovations that led to outperforming the considered baseline in many experimental settings. The evaluation benchmark is now available at: https://github.com/marcohuber/SYN-MAD-2022.

2022

Automatic Segmentation of Monofilament Testing Sites in Plantar Images for Diabetic Foot Management

Authors
Costa, T; Coelho, L; Silva, MF;

Publication
BIOENGINEERING-BASEL

Abstract
Diabetic peripheral neuropathy is a major complication of diabetes mellitus, and it is the leading cause of foot ulceration and amputations. The Semmes-Weinstein monofilament examination (SWME) is a widely used, low-cost, evidence-based tool for predicting the prognosis of diabetic foot patients. The examination can be quick, but due to the high prevalence of the disease, many healthcare professionals can be assigned to this task for several days per month. In an ongoing project, our objective is to minimize human intervention in the SWME by using an automated testing system relying on computer vision. In this paper we present the project's first part, a system for automatically identifying the SWME testing sites from digital images. For this, we have created a database of plantar images and developed a segmentation system based on image processing and deep learning, both of which are novelties. Of the 9 testing sites, 8 were correctly identified by the system in more than 80% of the images, and 3 of the testing sites were correctly identified in more than 97.8% of the images.
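As a hedged illustration of how per-site results of this kind might be computed, the short Python snippet below counts a predicted testing site as correct when it falls within a tolerance radius of the annotated location; the tolerance value, coordinate layout, and function name are assumptions, not the authors' evaluation code.

# Illustrative per-site evaluation: a predicted SWME site counts as correct
# when it lies within a tolerance radius of the annotation. Tolerance and
# data layout are assumptions.
import numpy as np

N_SITES = 9  # monofilament testing sites per plantar image

def per_site_accuracy(preds, gts, tol_px=20.0):
    """preds, gts: arrays of shape (n_images, N_SITES, 2) holding (x, y) pixel coordinates."""
    dists = np.linalg.norm(np.asarray(preds, float) - np.asarray(gts, float), axis=-1)
    correct = dists <= tol_px          # boolean matrix, one entry per image and site
    return correct.mean(axis=0)        # fraction of images in which each site was found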

2022

Novel 3D video action recognition deep learning approach for near real time epileptic seizure classification

Authors
Karacsony, T; Loesch-Biffar, AM; Vollmar, C; Remi, J; Noachtar, S; Cunha, JPS;

Publication
SCIENTIFIC REPORTS

Abstract
Seizure semiology is a well-established method to classify epileptic seizure types, but it requires significant resources, as long-term video-EEG monitoring needs to be visually analyzed. Therefore, computer-vision-based diagnosis support tools are a promising approach. In this article, we utilize infrared (IR) and depth (3D) videos to show the feasibility of a novel 24/7 object- and action-recognition-based deep learning (DL) monitoring system to differentiate between epileptic seizures in frontal lobe epilepsy (FLE), temporal lobe epilepsy (TLE) and non-epileptic events. Based on the largest 3D video-EEG database in the world (115 seizures, over 680,000 video frames, 427 GB), we achieved a promising cross-subject validation F1-score of 0.833 +/- 0.061 for the 2-class (FLE vs. TLE) and 0.763 +/- 0.083 for the 3-class (FLE vs. TLE vs. non-epileptic) case, from 2 s samples, with an automated semi-specialized depth-based (Acc. 95.65%) and Mask R-CNN-based (Acc. 96.52%) cropping pipeline to pre-process the videos, enabling a near-real-time seizure type detection and classification tool. Our results demonstrate the feasibility of our novel DL approach to support 24/7 epilepsy monitoring, outperforming all previously published methods.
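A minimal sketch of this kind of pipeline is given below, assuming torchvision's off-the-shelf Mask R-CNN for person-based cropping and its generic r3d_18 video network as a stand-in for the authors' 3D action-recognition model; the clip shape, frame rate, and fallback behaviour are assumptions, not the published implementation.

# Minimal sketch (not the authors' implementation): Mask R-CNN locates the
# patient so frames can be cropped, and a generic 3D CNN classifies a
# 2-second clip into FLE / TLE / non-epileptic.
import torch
import torchvision

detector = torchvision.models.detection.maskrcnn_resnet50_fpn(weights="DEFAULT").eval()
classifier = torchvision.models.video.r3d_18(num_classes=3)   # FLE, TLE, non-epileptic

def crop_person(frame):
    """Crop a (3, H, W) float frame in [0, 1] to the highest-scoring person box."""
    with torch.no_grad():
        det = detector([frame])[0]
    person = (det["labels"] == 1).nonzero()        # COCO label 1 = person
    if len(person) == 0:
        return frame                               # fall back to the full frame
    x1, y1, x2, y2 = det["boxes"][person[0, 0]].int().tolist()
    return frame[:, y1:y2, x1:x2]

# A 2 s clip at an assumed 15 fps, cropped and resized to 112 x 112:
clip = torch.rand(1, 3, 30, 112, 112)              # (batch, channels, frames, H, W)
logits = classifier(clip)                          # (1, 3) class scores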

2022

OCFR 2022: Competition on Occluded Face Recognition From Synthetically Generated Structure-Aware Occlusions

Authors
Neto, PC; Boutros, F; Pinto, JR; Damer, N; Sequeira, AF; Cardoso, JS; Bengherabi, M; Bousnat, A; Boucheta, S; Hebbadj, N; Erakin, ME; Demir, U; Ekenel, HK; Vidal, PBD; Menotti, D;

Publication
2022 IEEE INTERNATIONAL JOINT CONFERENCE ON BIOMETRICS (IJCB)

Abstract
This work summarizes the IJCB Occluded Face Recognition Competition 2022 (IJCB-OCFR-2022) embraced by the 2022 International Joint Conference on Biometrics (IJCB 2022). OCFR-2022 attracted a total of 3 participating teams, all from academia. In the end, six valid submissions were received and evaluated by the organizers. The competition was held to address the challenge of face recognition in the presence of severe face occlusions. The participants were free to use any training data, and the testing data was built by the organizers by synthetically occluding parts of the face images using a well-known dataset. The submitted solutions presented innovations and performed very competitively against the considered baseline. A major output of this competition is a challenging, realistic, diverse, and publicly available occluded face recognition benchmark with well-defined evaluation protocols.
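For illustration only, the toy snippet below conveys the general idea of synthetically occluding a face crop by masking a region of the image; the fixed rectangle is a placeholder and does not reproduce the landmark-driven, structure-aware occlusions actually used to build the benchmark.

# Toy illustration of synthetic occlusion: cover a region of an aligned face
# crop with an opaque rectangle. The fixed coordinates are placeholders; the
# competition used structure-aware occlusions aligned with facial landmarks.
from PIL import Image, ImageDraw

def occlude(face, box=(30, 60, 82, 90)):
    """Return a copy of `face` (an RGB PIL image) with the given region blacked out."""
    occluded = face.copy()
    ImageDraw.Draw(occluded).rectangle(box, fill=(0, 0, 0))
    return occluded

# occlude(Image.open("face.png").convert("RGB"))   # e.g. mask the lower face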

2022

Data Matrix Based Low Cost Autonomous Detection of Medicine Packages

Authors
Lima, J; Rocha, C; Rocha, L; Costa, P;

Publication
APPLIED SCIENCES-BASEL

Abstract
Counterfeit medicine is still a crucial problem for healthcare systems, having a huge impact on worldwide health and the economy. Medicine packages can be traced from the moment of their production until they are delivered to customers through the use of Data Matrix codes, unique identifiers that can validate their authenticity. Currently, many practitioners at hospital pharmacies have to manually scan such codes one by one, a very repetitive and burdensome task. In this paper, a system that can simultaneously scan multiple Data Matrix codes and autonomously introduce them into an authentication database is proposed for the Hospital Pharmacy of the Centro Hospitalar de Vila Nova de Gaia/Espinho, E.P.E. Relevant features are its low cost and its seamless integration into the existing infrastructure. The results of the experiments were encouraging, and with upgrades such as real-time feedback on code validation and increased robustness of the hardware, it is expected that the system can be used as a real support tool for pharmacists.
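The snippet below is a rough sketch of the decoding step using the open-source pylibdmtx library, which can read several Data Matrix codes from one photograph; the paper does not state which decoding backend was used, and the database insert is only a placeholder for the hospital's authentication system.

# Rough sketch of multi-code decoding with pylibdmtx (an assumption; the paper
# does not name its decoding backend). The database insert is a placeholder.
from PIL import Image
from pylibdmtx.pylibdmtx import decode

def scan_packages(image_path):
    """Decode every Data Matrix code visible in one photo of medicine packages."""
    results = decode(Image.open(image_path))
    return [r.data.decode("utf-8") for r in results]

for code in scan_packages("tray_of_packages.png"):
    print("would insert into the authentication database:", code)   # placeholder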
