2016
Authors
Melo, M; Rocha, T; Barbosa, L; Bessa, M;
Publication
2016 23RD PORTUGUESE MEETING ON COMPUTER GRAPHICS AND INTERACTION (EPCGI)
Abstract
The sense of presence and cybersickness are key factors to take into account when dealing with Virtual Environments (VE). To achieve high levels of presence and minimize cybersickness, it is important to ensure that the user's stimulation is coherent with the contents being delivered. In this paper, we present a pilot study on the use of both objective and subjective metrics to measure the sense of presence and cybersickness in VE, in order to study possible correlations between these two evaluation approaches. In addition, the pilot study includes two body positions, allowing us to evaluate whether the stimulation of the vestibular system has an impact on the sense of presence and cybersickness. To evaluate presence and cybersickness, a VE was developed consisting of a hill where participants ride a bicycle. To broaden the scope of the study, two body positions were studied: standing and sitting on the bicycle. The EMOTIV epoc+ equipment was used to register the objective metrics. The subjective metrics were registered using the Igroup Presence Questionnaire (IPQ) and the Simulator Sickness Questionnaire (SSQ). To complement the collected data, the levels of fatigue and stress before and after the experience were also registered through self-evaluation. Results show that the objective metrics Interest and Stress and the subjective metrics Realism, Fatigue and Stress have a positive interaction with the sense of presence. Results further suggest that there is a positive interaction between the objective metric Focus and the subjective metric Involvement.
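A minimal sketch of how the two evaluation approaches could be correlated is given below, assuming per-participant averages for the objective (headset-derived) metrics and the IPQ/SSQ scores; the column names, example values and the choice of Spearman's rank correlation are illustrative assumptions, not the procedure reported in the paper.

# Hypothetical sketch: correlating objective (headset-derived) metrics with
# subjective questionnaire scores (IPQ / SSQ). Column names, example values and
# the use of Spearman's rank correlation are illustrative assumptions.
import pandas as pd
from scipy.stats import spearmanr

# One row per participant; values are assumed session averages.
data = pd.DataFrame({
    "interest":    [0.62, 0.55, 0.71, 0.48],   # objective (headset)
    "stress_obj":  [0.31, 0.44, 0.29, 0.52],   # objective (headset)
    "focus":       [0.58, 0.49, 0.66, 0.53],   # objective (headset)
    "ipq_realism": [4.2, 3.8, 4.6, 3.5],       # subjective (IPQ subscale)
    "involvement": [4.0, 3.6, 4.4, 3.9],       # subjective (IPQ subscale)
    "ssq_total":   [12.0, 18.7, 9.4, 22.4],    # subjective (SSQ total score)
})

pairs = [("interest", "ipq_realism"),
         ("stress_obj", "ssq_total"),
         ("focus", "involvement")]

for obj, subj in pairs:
    rho, p = spearmanr(data[obj], data[subj])
    print(f"{obj} vs {subj}: rho={rho:.2f}, p={p:.3f}")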
2015
Authors
Melo, M; Bessa, M; Barbosa, L; Debattista, K; Chalmers, A;
Publication
EURASIP JOURNAL ON IMAGE AND VIDEO PROCESSING
Abstract
This paper presents an evaluation of high-dynamic-range (HDR) video tone mapping on a small screen device (SSD) under reflections. Reflections are common on mobile devices, as these devices are predominantly used on the go. With this evaluation, we study the impact of reflections on the screen and how different HDR video tone mapping operators (TMOs) perform under reflective conditions, as well as whether there is a need to develop a new or hybrid TMO that can deal with reflections better. Two well-known HDR video TMOs were evaluated in order to test their performance with and without on-screen reflections. Ninety participants were asked to rank the TMOs for a number of tone-mapped HDR video sequences on an SSD against a reference HDR display. The results show that the greater the area exposed to reflections, the larger the negative impact on a TMO's perceptual accuracy. The results also show that, under the observed conditions, when reflections are present the hybrid TMOs do not perform better than the standard TMOs.
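As an illustration of how such rankings might be aggregated per reflection condition, a small sketch follows; the TMO names, condition labels, example data and the use of mean ranks are assumptions rather than the paper's actual analysis.

# Hypothetical sketch of aggregating participant rankings of tone-mapping
# operators (TMOs) per reflection condition. All names and values are
# illustrative assumptions.
import pandas as pd

# One row per (participant, condition, TMO); rank 1 = closest to the HDR reference.
rankings = pd.DataFrame({
    "participant": [1, 1, 2, 2, 1, 1, 2, 2],
    "condition":   ["no_reflection"] * 4 + ["reflection"] * 4,
    "tmo":         ["tmo_a", "tmo_b"] * 4,
    "rank":        [1, 2, 1, 2, 2, 1, 2, 1],
})

# Lower mean rank = perceived as closer to the reference under that condition.
mean_ranks = rankings.groupby(["condition", "tmo"])["rank"].mean().unstack()
print(mean_ranks)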
2015
Authors
Morgado, L; Cardoso, B; de Carvalho, F; Fernandes, L; Paredes, H; Barbosa, L; Fonseca, B; Martins, P; Nunes, RR;
Publication
CIT/IUCC/DASC/PICOM 2015 IEEE INTERNATIONAL CONFERENCE ON COMPUTER AND INFORMATION TECHNOLOGY - UBIQUITOUS COMPUTING AND COMMUNICATIONS - DEPENDABLE, AUTONOMIC AND SECURE COMPUTING - PERVASIVE INTELLIGENCE AND COMPUTING
Abstract
Gesture-controlled applications are typically tied to specific gestures, as well as to specific recognition methods and gesture-detection devices. We propose a concern-separation architecture that mediates the following concerns: gesture acquisition, gesture recognition, and gestural control. It enables application developers to respond to gesture-independent commands, recognized using plug-in gesture-recognition modules that process gesture data via both device-dependent and device-independent data formats and callbacks. Its feasibility is demonstrated with a sample implementation.
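A minimal sketch of the described separation of concerns, in Python, might look as follows; all class, method and command names are hypothetical, as the paper's actual API is not reproduced here.

# Hypothetical sketch of the three-concern separation: acquisition (devices),
# recognition (plug-in modules), and gestural control (application callbacks).
# All names are illustrative; this is not the paper's API.
from abc import ABC, abstractmethod
from typing import Callable, Dict, List


class GestureAcquisitionDevice(ABC):
    """Concern 1: produces raw gesture data in a device-independent format."""

    @abstractmethod
    def read_frame(self) -> Dict:
        """Return one frame of gesture data (e.g. joint positions)."""


class GestureRecognizer(ABC):
    """Concern 2: plug-in module that maps gesture data to abstract commands."""

    @abstractmethod
    def recognize(self, frame: Dict) -> List[str]:
        """Return the command names recognized in this frame, if any."""


class GestureController:
    """Concern 3: routes recognized commands to application callbacks."""

    def __init__(self, device: GestureAcquisitionDevice,
                 recognizers: List[GestureRecognizer]):
        self.device = device
        self.recognizers = recognizers
        self.handlers: Dict[str, Callable[[], None]] = {}

    def on(self, command: str, handler: Callable[[], None]) -> None:
        # Application registers a callback for an abstract, gesture-independent command.
        self.handlers[command] = handler

    def poll(self) -> None:
        # Acquire one frame, let every plug-in recognizer inspect it, and
        # dispatch any recognized commands to the registered callbacks.
        frame = self.device.read_frame()
        for recognizer in self.recognizers:
            for command in recognizer.recognize(frame):
                handler = self.handlers.get(command)
                if handler:
                    handler()

Under this sketch, an application would register handlers for abstract commands (for example, controller.on("swipe_left", handler)) and remain unaware of which device or recognition module produced them.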
2013
Authors
Meira, C; Freitas, J; Barbosa, L; Melo, M; Bessa, M; Magalhaes, L;
Publication
PROCEEDINGS OF THE 2013 8TH IBERIAN CONFERENCE ON INFORMATION SYSTEMS AND TECHNOLOGIES (CISTI 2013)
Abstract
Virtual Environment (VE) systems may provide a new way to deliver information and services in many areas, for example in tourism, urban planning and education. In urban VE there is a close link between the virtual environment and the urban environment it is intended to represent. These VE can be an intuitive way to access a set of services with a direct association to the real object or entity to which they are related. In this article, we describe a case study aimed at exploring the possibility of using new interfaces to explore and use services in urban VE with a greater sense of immersion. The results indicate that the VE interfaces provide natural and intuitive access to digital services. While users felt greater difficulty in performing some of the tasks in the immersive scenario, the majority considered that this scenario provided a greater sense of immersion and realism.
2017
Authors
Melo, M; Barbosa, L; Bessa, M; Debattista, K; Chalmers, A;
Publication
MULTIMEDIA TOOLS AND APPLICATIONS
Abstract
HDR video on mobile devices is in its infancy, and there are no solutions yet that can achieve full HDR video reproduction due to computational power limitations. In this paper we present a novel and versatile solution that allows the delivery of HDR video on mobile devices by taking into account contextual information and providing backward compatibility for devices that do not have the computational power to decode HDR video. The proposed solution also enables the remote transmission of HDR video to mobile devices in real time. This context-aware HDR video distribution solution for mobile devices is evaluated and discussed by considering the impact of HDR video over conventional low-dynamic-range video on mobile devices, as well as the challenge of playing HDR video either locally or remotely.
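A minimal sketch of the kind of context-aware, backward-compatible stream selection such a solution implies is shown below; the capability fields, bandwidth thresholds and stream labels are assumptions for illustration and do not reproduce the paper's implementation.

# Hypothetical sketch of context-aware stream selection for HDR video delivery.
# Fields, thresholds and stream labels are illustrative assumptions.
from dataclasses import dataclass


@dataclass
class DeviceContext:
    supports_hdr_decoding: bool   # can the device decode/tone-map HDR locally?
    max_luminance_nits: int       # display peak luminance
    bandwidth_kbps: int           # current network throughput


def select_stream(ctx: DeviceContext) -> str:
    """Pick a stream variant: locally decoded HDR, remotely tone-mapped HDR, or LDR fallback."""
    if ctx.supports_hdr_decoding and ctx.bandwidth_kbps >= 8000:
        return "hdr-local"        # device decodes and tone-maps the HDR stream itself
    if ctx.bandwidth_kbps >= 3000:
        return "hdr-remote"       # server tone-maps in real time and streams the result
    return "ldr-fallback"         # backward-compatible low-dynamic-range stream


if __name__ == "__main__":
    phone = DeviceContext(supports_hdr_decoding=False,
                          max_luminance_nits=500,
                          bandwidth_kbps=5000)
    print(select_stream(phone))   # -> "hdr-remote"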
2018
Authors
Barreira, J; Bessa, M; Barbosa, L; Magalhaes, L;
Publication
IEEE TRANSACTIONS ON VISUALIZATION AND COMPUTER GRAPHICS
Abstract
Visual coherence between virtual and real objects is a major issue in creating convincing augmented reality (AR) applications. To achieve this seamless integration, actual light conditions must be determined in real time to ensure that virtual objects are correctly illuminated and cast consistent shadows. In this paper, we propose a novel method to estimate daylight illumination and use this information in outdoor AR applications to render virtual objects with coherent shadows. The illumination parameters are acquired in real time from context-aware live sensor data. The method works under unprepared natural conditions. We also present a novel and rapid implementation of a state-of-the-art skylight model, from which the illumination parameters are derived. The Sun's position is calculated based on the user location and time of day, with the relative rotational differences estimated from a gyroscope, compass and accelerometer. The results illustrated that our method can generate visually credible AR scenes with consistent shadows rendered from recovered illumination.
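The Sun-position step can be approximated with standard solar-geometry formulas; the sketch below uses Cooper's declination approximation and a simple hour-angle model, ignores the equation of time and atmospheric refraction, and is not the paper's exact computation.

# Simplified solar-position sketch (declination + hour-angle approximation).
# A textbook approximation for illustration only, not the paper's computation.
import math


def sun_elevation_azimuth(latitude_deg: float, day_of_year: int,
                          solar_hour: float) -> tuple[float, float]:
    """Return approximate solar elevation and azimuth (clockwise from north), in degrees.

    solar_hour is local solar time (12.0 = solar noon).
    """
    lat = math.radians(latitude_deg)
    # Approximate solar declination (Cooper's formula).
    decl = math.radians(23.45 * math.sin(math.radians(360.0 / 365.0 *
                                                      (284 + day_of_year))))
    # Hour angle: 15 degrees per hour from solar noon.
    hour_angle = math.radians(15.0 * (solar_hour - 12.0))

    elevation = math.asin(math.sin(lat) * math.sin(decl) +
                          math.cos(lat) * math.cos(decl) * math.cos(hour_angle))
    azimuth = math.atan2(-math.sin(hour_angle),
                         math.tan(decl) * math.cos(lat) -
                         math.sin(lat) * math.cos(hour_angle))
    return math.degrees(elevation), math.degrees(azimuth) % 360.0


if __name__ == "__main__":
    # Example: mid-June afternoon at roughly 41 degrees north latitude.
    elev, azim = sun_elevation_azimuth(41.3, day_of_year=170, solar_hour=15.0)
    print(f"elevation={elev:.1f} deg, azimuth={azim:.1f} deg")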