2018
Authors
Queirós, R;
Publication
7th Symposium on Languages, Applications and Technologies, SLATE 2018, June 21-22, 2018, Guimaraes, Portugal
Abstract
The JavaScript ecosystem is evolving dramatically. Nowadays, the language is no longer confined to the boundaries of the browser and runs on both sides of the Web stack. At the same time, JavaScript is also starting to play an important role in desktop and mobile application development. These facts are leading companies to massively adopt JavaScript in their Web/mobile projects and schools to broaden the range of languages in their course curricula. Several platforms have appeared in recent years aiming to foster the learning of the JavaScript language. Those platforms are mainly characterized by sophisticated UIs that allow users to learn JavaScript in a playful and interactive way. Despite their apparent success, these environments are not suitable for integration into existing educational platforms. Beyond these interoperability issues, most of these platforms are rigid, not allowing teachers to contribute new exercises, organize the existing exercises into more suitable and modular activities for deployment in their courses, or keep track of students' progress. This paper presents LearnJS as a simple and flexible platform to teach and learn JavaScript. In this platform, instructors can contribute new exercises and combine them with expository resources (e.g., videos) to define specific course activities. These activities can be gamified with the injection of dynamic attributes to reward the most successful attempts. Finally, instructors can deploy activities in their educational platforms. On the other hand, learners can solve exercises and receive immediate feedback on their solutions through static and dynamic analyzers. Since we are in the early stages of implementation, the paper focuses on the presentation of the LearnJS architecture, its main components, and its data and integration models. Nevertheless, a prototype of the platform is available in a GitHub repository. © Ricardo Queirós
2018
Authors
Cunha, A; Macedo, N;
Publication
Abstract State Machines, Alloy, B, TLA, VDM, and Z - 6th International Conference, ABZ 2018, Southampton, UK, June 5-8, 2018, Proceedings
Abstract
This paper reports on the development of a formal model for the Hybrid ERTMS/ETCS Level 3 concept in Electrum, a lightweight formal specification language that extends Alloy with mutable relations and temporal logic operators. We show how Electrum and its Analyzer can be used to perform scenario exploration to validate this model, namely to check that all the example operational scenarios described in the reference document are admissible, and to reason about expected safety properties, which can be easily specified and model checked for arbitrary track configurations. The Analyzer depicts scenarios (and counter-examples) in a graphical notation that is logic-agnostic, making them understandable for stakeholders without expertise in formal specification. © Springer International Publishing AG, part of Springer Nature 2018.
2018
Authors
Matos, T; Nobrega, R; Rodrigues, R; Pinheiro, M;
Publication
WEB3D 2018: THE 23RD INTERNATIONAL ACM CONFERENCE ON 3D WEB TECHNOLOGY
Abstract
The use of 360-degree videos has been increasing steadily in the 2010s, as content creators and users search for more immersive experiences. The freedom to choose where to look during the video may hinder the overall experience instead of enhancing it, as there is no guarantee that the user will focus on relevant sections of the scene. Visual annotations superimposed on the video, such as text boxes or arrow icons, can help guide the user through the narrative of the video while maintaining freedom of movement. This paper presents a web-based immersive visualizer for 360° videos that contain dynamic media annotations, rendered in real-time. A set of annotations was created with the purpose of providing information or guiding the user to points of interest. The visualizer can be used on a computer, with a keyboard and mouse or an HTC Vive, and on mobile devices with Cardboard VR headsets, to experience the video in virtual reality, which is made possible by the WebVR API. The visualizer was evaluated through usability tests to analyze the impact of different annotation techniques on the users' experience. The obtained results demonstrate that annotations can assist in guiding the user during the video, and that careful design is imperative so that they are not intrusive or distracting for viewers.
2018
Authors
Garbajosa, J; Wang, X; Aguiar, A;
Publication
Lecture Notes in Business Information Processing
Abstract
2018
Authors
Fares, A; Gama, J; Campos, P;
Publication
Studies in Big Data - Learning from Data Streams in Evolving Environments
Abstract
2018
Authors
Aguiar, A; Wagner, S; Hoda, R;
Publication
ACM International Conference Proceeding Series
Abstract