2016
Authors
Rocha, T; Fernandes, H; Paredes, H; Barroso, J;
Publication
Universal Access in Human-Computer Interaction: Users and Context Diversity, Pt III
Abstract
In this paper, a 3D map solution combined with a mobile phone application is presented. The solution enables blind users to perceive their spatial location through tactile stimulation and to obtain contextual information delivered as audio by a mobile application. In the proposed model, 3D map sections embedding NFC technology support the application scenario described in this work.
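A minimal illustrative sketch of the interaction described above follows (this is not the paper's implementation; the tag IDs, place descriptions, and the speak() stand-in for the phone's text-to-speech engine are assumptions):

```python
# Hypothetical mapping from NFC tag IDs embedded in 3D map sections
# to spoken contextual descriptions.
TAG_TO_CONTEXT = {
    "04:A2:5F:1B": "Main entrance. The reception desk is two metres ahead.",
    "04:7C:90:3D": "Cafeteria area. Stairs to the first floor on your right.",
}

def speak(text: str) -> None:
    """Stand-in for the phone's text-to-speech output."""
    print(f"[audio] {text}")

def on_tag_scanned(tag_id: str) -> None:
    """Called when the mobile application reads an NFC tag on the 3D map."""
    context = TAG_TO_CONTEXT.get(tag_id)
    if context is None:
        speak("Unknown map section. Please try another area of the map.")
    else:
        speak(context)

if __name__ == "__main__":
    on_tag_scanned("04:A2:5F:1B")
```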
2016
Authors
Correia, A; Fonseca, B; Paredes, H; Martins, P; Morgado, L;
Publication
Progress in IS - Handbook on 3D3C Platforms
Abstract
2015
Authors
Sousa, A; Barroso, J; Paredes, H; Fernandes, H; Filipe, V;
Publication
PROCEEDINGS OF THE 6TH INTERNATIONAL CONFERENCE ON SOFTWARE DEVELOPMENT AND TECHNOLOGIES FOR ENHANCING ACCESSIBILITY AND FIGHTING INFO-EXCLUSION
Abstract
Technology has entered our lives and changed not only the way we communicate and interact with each other, but also our habits and our experiences in the real and digital worlds. However, because of this rapid progress we use technology at every moment of the day, and sometimes this causes frustration because the way we interact with applications is not the most effective for the context we are in. This problem is even more significant in business environments, where the time taken to finish a task can mean profit or loss for the business. The key to these problems can lie in adapting the interface to user needs and constraints, as happens in solutions for situationally induced impairments and disabilities (SIID). This can be achieved by inferring the context the user is in, using the different sensors available on mobile platforms and other sources of information such as the user profile, agenda and usage history. In this paper we present a review of the main challenges of the dynamic adaptation of interfaces, with an application case in a business environment. (c) 2015 The Authors. Published by Elsevier B.V.
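To make the idea of context inference concrete, here is a small sketch, assuming hypothetical sensor fields, thresholds and adaptation rules (the paper does not prescribe these):

```python
from dataclasses import dataclass

@dataclass
class SensorSnapshot:
    ambient_light_lux: float
    noise_level_db: float
    is_moving: bool

@dataclass
class UserInfo:
    in_meeting: bool          # e.g. derived from the agenda
    prefers_large_text: bool  # e.g. from the user profile

def infer_context(sensors: SensorSnapshot, user: UserInfo) -> str:
    """Derive a coarse usage context from sensors and user information."""
    if user.in_meeting or sensors.noise_level_db > 70:
        return "audio-unsuitable"
    if sensors.is_moving:
        return "walking"
    if sensors.ambient_light_lux > 10000:
        return "bright-sunlight"
    return "default"

def adapt_interface(context: str, user: UserInfo) -> dict:
    """Select interface adaptations for the inferred context."""
    adaptation = {"font_scale": 1.2 if user.prefers_large_text else 1.0,
                  "use_audio": True,
                  "high_contrast": False}
    if context == "walking":
        adaptation["font_scale"] *= 1.5      # larger targets while moving
    elif context == "bright-sunlight":
        adaptation["high_contrast"] = True
    elif context == "audio-unsuitable":
        adaptation["use_audio"] = False      # fall back to visual cues
    return adaptation

if __name__ == "__main__":
    user = UserInfo(in_meeting=False, prefers_large_text=True)
    snapshot = SensorSnapshot(ambient_light_lux=20000, noise_level_db=45, is_moving=False)
    context = infer_context(snapshot, user)
    print(context, adapt_interface(context, user))
```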
2016
Authors
Reis, A; Lains, J; Paredes, H; Filipe, V; Abrantes, C; Ferreira, F; Mendes, R; Amorim, P; Barroso, J;
Publication
Universal Access in Human-Computer Interaction: Users and Context Diversity, Pt III
Abstract
Stroke episodes are a major health issue worldwide, for which most patients require an initial period of specialised rehabilitation and functional treatment, involving medical doctors and specialised therapists, followed by ambulatory physiotherapy exercise. In this second period most patients do not fulfil the prescribed recovery plan, resulting in setbacks in their recovery. This paper reports on the design of a methodology to develop a system that supports ambulatory rehabilitation therapy, providing constant feedback to clinicians by means of an information system platform and maintaining patient motivation by using an exergames approach to design and deliver the therapy exercises.
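One possible shape of the feedback loop described above is sketched below; the field names, the adherence measure and the report format are illustrative assumptions, not the authors' design:

```python
from dataclasses import dataclass, asdict
from datetime import datetime
import json

@dataclass
class ExerciseResult:
    exercise_id: str
    repetitions_done: int
    repetitions_prescribed: int
    mean_range_of_motion_deg: float

@dataclass
class SessionReport:
    patient_id: str
    timestamp: str
    results: list

    def adherence(self) -> float:
        done = sum(r.repetitions_done for r in self.results)
        prescribed = sum(r.repetitions_prescribed for r in self.results)
        return done / prescribed if prescribed else 0.0

def submit_to_platform(report: SessionReport) -> str:
    """Stand-in for uploading the session to the clinicians' platform."""
    payload = {**asdict(report), "adherence": round(report.adherence(), 2)}
    return json.dumps(payload, indent=2)

if __name__ == "__main__":
    report = SessionReport(
        patient_id="P-0042",
        timestamp=datetime.now().isoformat(timespec="seconds"),
        results=[ExerciseResult("shoulder-flexion", 8, 10, 95.0)],
    )
    print(submit_to_platform(report))
```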
2016
Authors
Alves Fernandes, LMA; Matos, GC; Azevedo, D; Nunes, RR; Paredes, H; Morgado, L; Barbosa, LF; Martins, P; Fonseca, B; Cristovao, P; de Carvalho, F; Cardoso, B;
Publication
BEHAVIOUR & INFORMATION TECHNOLOGY
Abstract
Gestural interaction devices have emerged and originated various studies on multimodal human-computer interaction aimed at improving user experience (UX). However, there is a knowledge gap regarding the use of these devices to enhance learning. We present an exploratory study which analysed the UX of a multimodal immersive videogame prototype based on a Portuguese historical/cultural episode. Evaluation tests took place in high school environments and at public videogaming events. Two users were present simultaneously in the same virtual reality (VR) environment: one as the helmsman aboard Vasco da Gama's fifteenth-century Portuguese ship and the other as the mythical Adamastor stone giant at the Cape of Good Hope. The helmsman player wore a VR headset to explore the environment, whereas the giant player used body motion to control the giant and observed the results on a screen, without a headset. This allowed a preliminary characterisation of the UX, identifying challenges and the potential use of these devices in multi-user virtual learning contexts. We also discuss the combined use of such devices, towards the future development of similar systems, and its implications for learning improvement through multimodal human-computer interaction.
2015
Authors
Paredes, H; Fernandes, H; Sousa, A; Fernandes, L; Koch, F; Fortes, R; Filipe, V; Barroso, J;
Publication
Communications in Computer and Information Science
Abstract
In this paper the orchestration of wearable sensors with human computation is explored to provide map metadata for blind navigation. Technological navigation aids for the blind must provide accurate information about the environment and select the best path to reach a chosen destination. Urban barriers represent dangers for blind users, and the dynamism of smart cities means these dangers change constantly, creating a potentially "dangerous territory" for these users. Previous work demonstrated that redundant solutions in smart environments, complemented by human computation, can provide a reliable and trustworthy data source for a new generation of blind navigation systems. We propose and discuss a modular architecture which interacts with environmental sensors to gather information and processes the acquired data with advanced algorithms empowered by human computation. The gathered metadata should enable the creation of "happy maps" that are delivered to blind users through a previously developed navigation system. © Springer International Publishing Switzerland 2015.
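A hedged sketch of how sensor detections might be combined with human-computation confirmations before a barrier enters the published map metadata; the data fields, confidence threshold and confirmation count are assumptions, not the architecture proposed in the paper:

```python
from dataclasses import dataclass, field

@dataclass
class BarrierObservation:
    location: tuple           # (latitude, longitude)
    kind: str                 # e.g. "roadworks", "parked obstacle"
    sensor_confidence: float  # 0.0 - 1.0, from automatic detection
    human_confirmations: int = 0

@dataclass
class MapMetadata:
    barriers: list = field(default_factory=list)

    def publish(self, obs: BarrierObservation) -> None:
        self.barriers.append(obs)

def should_publish(obs: BarrierObservation,
                   min_confidence: float = 0.8,
                   min_confirmations: int = 2) -> bool:
    """Trust sensor data alone only at high confidence; otherwise wait for
    redundant human confirmations (human computation)."""
    return (obs.sensor_confidence >= min_confidence
            or obs.human_confirmations >= min_confirmations)

if __name__ == "__main__":
    metadata = MapMetadata()
    obs = BarrierObservation((41.29, -7.74), "roadworks", sensor_confidence=0.6)
    obs.human_confirmations += 2   # two volunteers confirmed the barrier
    if should_publish(obs):
        metadata.publish(obs)
    print(len(metadata.barriers), "barrier(s) in the published map metadata")
```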