2018
Authors
Rodrigues, S; Paiva, JS; Dias, D; Aleixo, M; Filipe, R; Cunha, JPS;
Publication
Open Bioinformatics Journal
Abstract
Background: Air Traffic Control (ATC) is a complex and demanding process, exposing Air Traffic Controllers (ATCs) to high stress. Recently, efforts have been made in ATC to maintain safety and efficiency in the face of increasing air traffic demands. Computer simulations have been a useful tool for ATC training, improving ATCs' skills and, consequently, traffic safety. Objectives: This study aims to: a) evaluate psychophysiological indices of stress in an ATC simulation environment using a wearable biomonitoring platform. To obtain a measure of ATCs' stress levels, results from an experimental study with the same participants, which included a stress-induction task, were used as a stress ground truth; b) understand whether there are differences in the stress levels of ATCs with different job functions ("advisors" vs. "operationals") when performing an ATC Refresher Training in a simulator environment. Methods: Two studies were conducted with ATCs: Study 1, which included a stress-induction task, the Trier Social Stress Test (TSST), and Study 2, which included an ATC simulation task. Linear Heart Rate Variability (HRV) features were acquired from the ATCs using a medical-grade wearable Electrocardiogram (ECG) device. Self-reports were used to measure perceived stress. Results: The TSST was self-reported as being much more stressful than the simulation task, and the physiological data support this result. Study 2 showed more stress in the "advisors" group than in the "operationals" group. Conclusion: The results point to the importance of developing quantified Occupational Health (qOHealth) devices that allow monitoring and differentiation of ATCs' stress responses.
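As background for the linear HRV features mentioned above, common time-domain measures can be computed directly from RR intervals. This is a generic sketch of typical features (SDNN, RMSSD, pNN50), not the paper's exact feature set; the function name is illustrative:

```python
import numpy as np

def linear_hrv_features(rr_ms):
    """Typical time-domain HRV features from a series of RR intervals (ms).

    Illustrative only: SDNN (overall variability), RMSSD (short-term
    variability) and pNN50 (fraction of successive differences > 50 ms).
    """
    rr = np.asarray(rr_ms, dtype=float)
    diffs = np.diff(rr)                         # successive RR differences
    sdnn = rr.std(ddof=1)                       # sample std of RR intervals
    rmssd = np.sqrt(np.mean(diffs ** 2))        # root mean square of diffs
    pnn50 = 100.0 * np.mean(np.abs(diffs) > 50.0)  # percent of diffs > 50 ms
    return {"SDNN": sdnn, "RMSSD": rmssd, "pNN50": pnn50}
```

Features like these would then be compared between the stress-induction (TSST) and simulation conditions.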
2018
Authors
Rodrigues, SM; Paiva, JS; Ribeiro, RSR; Soppera, O; Cunha, JPS; Jorge, PAS;
Publication
SENSORS
Abstract
Optical fiber tweezers have been gaining prominence in several applications in Biology and Medicine. Due to their outstanding focusing abilities, they are able to trap and manipulate microparticles, including cells, without any physical contact and with a low degree of invasiveness to the trapped cell. Recently, we proposed a fiber tweezer configuration based on a polymeric micro-lens on the top of a single mode fiber, obtained by a self-guided photopolymerization process. This configuration is able to both trap and identify the target through the analysis of short-term portions of the back-scattered signal. In this paper, we propose a variant of this fabrication method, capable of producing more robust fiber tips, which produce trapping effects on targets that are stronger by a factor of two to ten. These novel lenses maintain the capability of distinguishing the different classes of trapped particles based on the back-scattered signal. The novel fabrication method consists of introducing a multimode fiber section on the tip of a single mode (SM) fiber. A detailed description of how relevant fabrication parameters, such as the length of the multimode section and the photopolymerization laser power, can be tuned for different purposes (e.g., microparticle trapping only, or simultaneous trapping and sensing) is also provided, based on both experimental and theoretical evidence.
2018
Authors
Hartl, E; Knoche, T; Choupina, HMP; Remi, J; Vollmar, C; Cunha, JPS; Noachtar, S;
Publication
SEIZURE-EUROPEAN JOURNAL OF EPILEPSY
Abstract
Purpose: To investigate the frequency, localizing significance, and intensity characteristics of ictal vocalization in different focal epilepsy syndromes. Methods: Up to four consecutive focal seizures were evaluated in 277 patients with lesional focal epilepsy, excluding isolated auras and subclinical EEG seizure patterns. Vocalization was considered present if it was observed in at least one of the analyzed seizures and was not of speech quality. Intensity features of ictal vocalization were analyzed in a subsample of 17 patients with a temporal and 19 with an extratemporal epilepsy syndrome. Results: Ictal vocalization was observed in 37% of the patients (102/277), with similar frequency among the different focal epilepsy syndromes. Localizing significance was found for its co-occurrence with ictal automatisms, which identified patients with temporal seizure onset with a sensitivity of 92% and a specificity of 70%. Quantitative analysis of vocalization intensity made it possible to distinguish seizures of frontal from temporal lobe origin based on the intensity range (p = 0.0003), intensity variation (p < 0.0001), and the intensity increase rate at the beginning of the vocalization (p = 0.003), all of which were significantly higher in frontal lobe seizures. No significant difference was found for mean intensity or mean vocalization duration. Conclusions: Although ictal vocalization is similarly common across focal epilepsies, it has localizing significance when the co-occurring seizure semiology is taken into account. In particular, it increases the localizing value of automatisms, predicting a temporal seizure onset with a sensitivity of 92% and a specificity of 70%. Quantitative parameters of the intensity dynamics objectively distinguished frontal lobe seizures, establishing an observer-independent tool for semiological seizure evaluation.
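The three intensity descriptors reported above (range, variation, and initial increase rate) can be sketched from a vocalization's intensity envelope as follows. This is an illustrative reconstruction, not the authors' exact implementation; the function name and the onset-window length are assumptions:

```python
import numpy as np

def vocalization_intensity_features(intensity_db, fps, onset_window_s=0.5):
    """Illustrative intensity descriptors of an ictal vocalization.

    intensity_db: 1-D intensity envelope in dB, one value per frame.
    Returns the intensity range, variation (std), and the initial
    increase rate (dB/s) over an assumed onset window.
    """
    intensity_db = np.asarray(intensity_db, dtype=float)
    i_range = intensity_db.max() - intensity_db.min()
    i_variation = intensity_db.std()
    # increase rate: least-squares slope over the first onset_window_s seconds
    n = max(2, int(onset_window_s * fps))
    t = np.arange(n) / fps
    increase_rate = np.polyfit(t, intensity_db[:n], 1)[0]
    return i_range, i_variation, increase_rate
```

Under the paper's findings, frontal lobe seizures would yield higher values of all three descriptors than temporal lobe seizures.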
2018
Authors
Rocha, AP; Pereira Choupina, HMP; Vilas Boas, MD; Fernandes, JM; Silva Cunha, JPS;
Publication
PLOS ONE
Abstract
Human gait analysis provides valuable information regarding the way of walking of a given subject. Low-cost RGB-D cameras, such as the Microsoft Kinect, are able to estimate the 3-D position of several body joints without requiring the use of markers. This 3-D information can be used to perform objective gait analysis in an affordable, portable, and non-intrusive way. In this contribution, we present a system for fully automatic gait analysis using a single RGB-D camera, namely the second version of the Kinect. Our system does not require any manual intervention (except for starting/stopping the data acquisition), since it first recognizes whether the subject is walking or not, and identifies the different gait cycles only when walking is detected. For each gait cycle, it then computes several gait parameters, which can provide useful information in various contexts, such as sports, healthcare, and biometric identification. The activity recognition is performed by a predictive model that distinguishes between three activities (walking, standing, and marching) and between two postures of the subject (facing the sensor, and facing away from it). The model was built using a multilayer perceptron algorithm and several measures extracted from 3-D joint data, achieving an overall accuracy and F1 score of 98%. For gait cycle detection, we implemented an algorithm that estimates the instants corresponding to left and right heel strikes, relying on the distance between the ankles and the velocities of the left and right ankles. The algorithm achieved errors for heel strike instant and stride duration estimation of 15 +/- 25 ms and 1 +/- 29 ms (walking towards the sensor), and 12 +/- 23 ms and 2 +/- 24 ms (walking away from the sensor). Our gait cycle detection solution can be used with any other RGB-D camera that provides the 3-D position of the main body joints.
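The heel-strike idea described above (inter-ankle distance plus ankle velocities) can be sketched as a simple heuristic. This is an illustrative reconstruction, not the authors' exact algorithm: the function name, the peak-picking rule, and the stance-side assignment (the slower ankle at a distance peak is taken as the striking/stance side) are all assumptions:

```python
import numpy as np

def detect_heel_strikes(left_ankle, right_ankle, fps=30.0):
    """Heuristic heel-strike detection from 3-D ankle trajectories.

    left_ankle, right_ankle: (N, 3) arrays of ankle positions per frame.
    A heel strike is approximated as a local maximum of the inter-ankle
    distance; the side is assigned to the ankle moving more slowly at
    that frame. Returns a list of (time_in_seconds, side) events.
    """
    left_ankle = np.asarray(left_ankle, dtype=float)
    right_ankle = np.asarray(right_ankle, dtype=float)
    dist = np.linalg.norm(left_ankle - right_ankle, axis=1)
    # local maxima of the inter-ankle distance
    peaks = [i for i in range(1, len(dist) - 1)
             if dist[i] >= dist[i - 1] and dist[i] > dist[i + 1]]
    events = []
    for i in peaks:
        v_left = np.linalg.norm(left_ankle[i] - left_ankle[i - 1]) * fps
        v_right = np.linalg.norm(right_ankle[i] - right_ankle[i - 1]) * fps
        side = "left" if v_left < v_right else "right"
        events.append((i / fps, side))
    return events
```

Consecutive same-side events then delimit gait cycles, from which stride duration and other parameters can be derived.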
2018
Authors
Dias, D; Cunha, JPS;
Publication
SENSORS
Abstract
Wearable Health Devices (WHDs) are increasingly helping people to better monitor their health status, both at an activity/fitness level for self-health tracking and at a medical level, providing clinicians with more data and a potential for earlier diagnosis and guidance of treatment. The technological revolution in the miniaturization of electronic devices is enabling the design of more reliable and adaptable wearables, contributing to a worldwide change in the health monitoring approach. In this paper we review important aspects of the WHDs area, listing the state of the art of wearable vital-sign sensing technologies together with their system architectures and specifications. A focus is placed on the vital signs acquired by WHDs: first, a discussion of the most important vital signs for health assessment using WHDs is presented, and then, for each vital sign, a description is given of its origin and effect on health, monitoring needs, acquisition methods, and recent scientific and device developments in the area (electrocardiogram, heart rate, blood pressure, respiration rate, blood oxygen saturation, blood glucose, skin perspiration, capnography, body temperature, motion evaluation, cardiac implantable devices, and ambient parameters). A general WHD system architecture is presented based on the state of the art. After this global review of WHDs, we zoom in on cardiovascular WHDs, analysing commercial devices and their applicability versus quality, and extending this subject to smart t-shirts for medical purposes. Furthermore, we present a summarized evolution of these devices based on the prototypes developed over the years. Finally, we discuss likely market trends and future challenges for the emerging WHDs area.
2018
Authors
Ribeiro, RT; Silva Cunha, JPS;
Publication
BIOMEDICAL SIGNAL PROCESSING AND CONTROL
Abstract
In this work we propose a regression approach based on separability maximization (RASMa) for modeling a continuous-valued estimate of the stress level (which we call the stress index) using features extracted from electrocardiogram (ECG) data. Since no objective measure of the actual stress level (the output) is available, finding the stress index cannot be addressed as a classical regression problem. Instead, the proposed approach finds the linear combination of features that maximizes the separability of the stress index values for non-stress and stress events. In short, RASMa combines linear discriminant analysis with the Bhattacharyya distance, embedded in a leave-one-subject-out cross-validation scheme. A 26-case pilot study using 17 heart rate variability (HRV) features was conducted as a proof of concept. A near real-time application tool for monitoring stress level over time was also implemented, based on the model obtained from the pilot study.
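The core separability idea (an LDA projection scored with the Bhattacharyya distance) can be sketched as follows. This is a minimal illustration under Gaussian assumptions, not the authors' full method; the function names are assumptions and the leave-one-subject-out cross-validation scheme is omitted:

```python
import numpy as np

def fisher_direction(X_stress, X_rest):
    """Fisher LDA direction separating stress vs non-stress feature vectors."""
    mu_s, mu_r = X_stress.mean(axis=0), X_rest.mean(axis=0)
    # within-class scatter (sum of class covariances), with a small ridge
    Sw = np.cov(X_stress, rowvar=False) + np.cov(X_rest, rowvar=False)
    w = np.linalg.solve(Sw + 1e-6 * np.eye(Sw.shape[0]), mu_s - mu_r)
    return w / np.linalg.norm(w)

def bhattacharyya_1d(a, b):
    """Bhattacharyya distance between two 1-D samples modelled as Gaussians."""
    ma, mb = a.mean(), b.mean()
    va, vb = a.var(), b.var()
    return (0.25 * np.log(0.25 * (va / vb + vb / va + 2.0))
            + 0.25 * (ma - mb) ** 2 / (va + vb))

def stress_separability(X_stress, X_rest):
    """Project features on the LDA axis and score class separability."""
    w = fisher_direction(X_stress, X_rest)
    return w, bhattacharyya_1d(X_stress @ w, X_rest @ w)
```

Projecting new HRV feature vectors onto the learned axis `w` then yields a continuous score in the spirit of the stress index described above.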