2024
Authors
Silva, P; Cunha, A; Macedo, N; Oliveira, JN;
Publication
RIGOROUS STATE-BASED METHODS, ABZ 2024
Abstract
Humans are good at understanding subjective or vague statements which, however, are hard to express in classical logic. Fuzzy logic is an evolution of classical logic that can cope with vague terms by handling degrees of truth and not just the crisp values true and false. Logic is the formal basis of computing, enabling the formal design of systems supported by tools such as model checkers and theorem provers. This paper shows how a model checker such as Alloy can evolve to handle both classical and fuzzy logic, enabling the specification of high-level quantitative relational models in the fuzzy domain. In particular, the paper showcases how QAlloy-F (a conservative, general-purpose quantitative extension to standard Alloy) can be used to tackle fuzzy problems, namely in the context of validating the design of fuzzy controllers. The evaluation of QAlloy-F against examples taken from various classes of fuzzy case studies shows the approach to be feasible.
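To make the notion of degrees of truth concrete, the following is a minimal sketch of Zadeh's standard fuzzy connectives (minimum for conjunction, maximum for disjunction, complement for negation); it illustrates the general idea only and is not QAlloy-F syntax.

```python
# Minimal sketch of fuzzy truth degrees using Zadeh's standard operators.
# Illustrates "degrees of truth" in general; this is not QAlloy-F syntax.

def fuzzy_and(a: float, b: float) -> float:
    return min(a, b)          # Goedel (minimum) t-norm

def fuzzy_or(a: float, b: float) -> float:
    return max(a, b)          # maximum t-conorm

def fuzzy_not(a: float) -> float:
    return 1.0 - a            # standard negation

# "The room is warm" holds to degree 0.7, "the fan is fast" to degree 0.4:
warm, fast = 0.7, 0.4
print(fuzzy_and(warm, fast))                 # 0.4
print(fuzzy_or(warm, fuzzy_not(fast)))       # 0.7
```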
2024
Authors
Oliveira, N;
Publication
Lecture Notes in Computer Science (including subseries Lecture Notes in Artificial Intelligence and Lecture Notes in Bioinformatics)
Abstract
The R/G approach to the development of interfering programs was initiated by the pioneering work of Cliff Jones (1981) on a relational basis. R/G has been the subject of much research since then, most of it deviating from the original relational set-up. This paper looks at such early work from a historical perspective and shows how it can be approached and extended using state-of-the-art relational algebra.
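As a rough illustration of the relational-algebra setting referred to here, the sketch below models binary relations on states as sets of pairs with composition and inclusion; the rely and guarantee relations are toy examples invented for illustration, not taken from the paper.

```python
# Hypothetical sketch: binary relations on states as sets of pairs, with
# composition and inclusion -- the basic relational-algebra machinery used
# in relational renderings of rely/guarantee reasoning.

def compose(r, s):
    """Relational composition r ; s = {(a, c) | exists b: (a, b) in r and (b, c) in s}."""
    return {(a, c) for (a, b1) in r for (b2, c) in s if b1 == b2}

def included(r, s):
    """r is a subset of s: every step allowed by r is also allowed by s."""
    return r <= s

# Toy state space {0, 1, 2}: the environment may only increase the state (rely);
# the component guarantees never to decrease it (guarantee).
rely      = {(a, b) for a in range(3) for b in range(3) if b >= a}
guarantee = {(a, b) for a in range(3) for b in range(3) if b >= a}

# A rely step followed by a guarantee step stays within the guarantee.
print(included(compose(rely, guarantee), guarantee))  # True
```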
2024
Authors
Ferreira, LMM; Coelho, F; Pereira, J;
Publication
ACM COMPUTING SURVEYS
Abstract
While a significant number of databases are deployed in cloud environments, pushing part or all data storage and querying planes closer to their sources (i.e., to the edge) can provide advantages in latency, connectivity, privacy, energy, and scalability. This article dissects the advantages provided by databases in edge and fog environments by surveying application domains and discussing the key drivers for pushing database systems to the edge. At the same time, it also identifies the main challenges faced by developers in this new environment and analyzes the mechanisms employed to deal with them. By providing an overview of the current state of edge and fog databases, this survey provides valuable insights into future research directions.
2024
Authors
Ramos, M; Azevedo, J; Kingsbury, K; Pereira, J; Esteves, T; Macedo, R; Paulo, J;
Publication
PROCEEDINGS OF THE VLDB ENDOWMENT
Abstract
We present LAZYFS, a new fault injection tool that simplifies the debugging and reproduction of complex data durability bugs experienced by databases, key-value stores, and other data-centric systems when crashes occur. Our tool simulates persistence properties of POSIX file systems (e.g., operations ordering and atomicity) and enables users to inject lost and torn write faults in a precise and controlled manner. Further, it provides profiling information about the system's operations flow and persisted data, enabling users to better understand the root cause of errors. We use LAZYFS to study seven important systems: PostgreSQL, etcd, Zookeeper, Redis, LevelDB, PebblesDB, and Lightning Network. Our fault injection campaign shows that LAZYFS automates and facilitates the reproduction of five known bug reports containing manual and complex reproducibility steps. Further, it aids in understanding and reproducing seven ambiguous bugs reported by users. Finally, LAZYFS is used to find eight new bugs, which lead to data loss, corruption, and unavailability.
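For context, the sketch below shows the standard POSIX durability steps (write, fsync the file, rename, fsync the parent directory) whose omission produces the kind of lost and torn writes that such fault-injection campaigns expose; it uses plain POSIX calls and is not LAZYFS's own interface.

```python
# Minimal sketch of the POSIX durability protocol whose violations lead to the
# lost/torn-write bugs that fault-injection tools like LAZYFS expose.
# Generic POSIX usage only; this is not LAZYFS's interface.
import os

def durable_write(path: str, data: bytes) -> None:
    # Write to a temporary file, flush it to stable storage,
    # then atomically rename and persist the directory entry.
    tmp = path + ".tmp"
    fd = os.open(tmp, os.O_WRONLY | os.O_CREAT | os.O_TRUNC, 0o644)
    try:
        os.write(fd, data)
        os.fsync(fd)              # without this, a crash may lose or tear the write
    finally:
        os.close(fd)
    os.rename(tmp, path)          # atomic replacement of the destination file
    dfd = os.open(os.path.dirname(path) or ".", os.O_RDONLY)
    try:
        os.fsync(dfd)             # persist the rename itself
    finally:
        os.close(dfd)

durable_write("state.db", b"checkpoint-42")
```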
2024
Authors
da Conceição, EL; Alonso, AN; Oliveira, RC; Pereira, J;
Publication
SCIENCE OF COMPUTER PROGRAMMING
Abstract
TADA is a unique toolkit designed to foster the use and implementation of approximate distributed agreement primitives. Developed in Java, TADA provides ready-to-use implementations of several approximate agreement algorithms, as well as the tools to enable programmers and researchers to easily implement further protocols: a template that allows new protocol implementations to be created by simply changing specific functions, and high-level abstractions for communication and concurrency control. As an example, the toolkit includes a ready-to-use implementation of clock synchronisation between distributed processes. Further use cases include sensor input stabilisation and distributed machine learning, or other instances of distributed agreement where network synchrony cannot be assumed, Byzantine fault tolerance may be required, and a bounded divergence in decision values can be tolerated.
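As a rough illustration of the kind of primitive TADA packages, the sketch below shows one round of a generic trimmed-mean approximate agreement step; it is not TADA's Java API, and the clock readings are invented.

```python
# Generic sketch of one approximate-agreement round (not TADA's Java API):
# each process collects values, discards the f smallest and f largest
# (to mask up to f faulty inputs), and averages the remainder.
# Repeated rounds shrink the spread of correct values toward convergence.

def approx_agreement_round(values: list[float], f: int) -> float:
    trimmed = sorted(values)[f:len(values) - f]
    return sum(trimmed) / len(trimmed)

# Clock-synchronisation flavour: five local clock readings, one of them faulty.
readings = [10.02, 10.01, 9.99, 10.00, 42.0]
print(approx_agreement_round(readings, f=1))  # averages the three middle readings
```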
2024
Authors
Sequeira, A; Santos, LP; Barbosa, LS;
Publication
IEEE TRANSACTIONS ON QUANTUM ENGINEERING
Abstract
This article delves into the role of the quantum Fisher information matrix (FIM) in enhancing the performance of parameterized quantum circuit (PQC)-based reinforcement learning agents. While previous studies have highlighted the effectiveness of PQC-based policies preconditioned with the quantum FIM in contextual bandits, its impact in broader reinforcement learning contexts, such as Markov decision processes, is less clear. Through a detailed analysis of Löwner inequalities between quantum and classical FIMs, this study uncovers the nuanced distinctions and implications of using each type of FIM. Our results indicate that a PQC-based agent using the quantum FIM without additional insights typically incurs a larger approximation error and does not guarantee improved performance compared to the classical FIM. Empirical evaluations on classic control benchmarks suggest that, although quantum FIM preconditioning outperforms standard gradient ascent, it is not in general superior to classical FIM preconditioning.
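For intuition, the sketch below shows generic FIM-preconditioned (natural) gradient ascent, the update scheme being compared in the abstract; the gradient and Fisher matrix here are placeholders rather than quantities computed from an actual parameterized quantum circuit.

```python
# Generic sketch of Fisher-preconditioned (natural) gradient ascent:
# theta <- theta + lr * pinv(F) @ grad. The gradient and FIM below are
# placeholders, not derived from an actual PQC policy.
import numpy as np

def natural_gradient_step(theta, grad, fim, lr=0.1, damping=1e-3):
    # A damped pseudo-inverse keeps the update stable when the FIM is singular.
    precond = np.linalg.pinv(fim + damping * np.eye(len(theta)))
    return theta + lr * precond @ grad

theta = np.zeros(3)
grad = np.array([0.5, -0.2, 0.1])           # placeholder policy gradient
fim = np.diag([1.0, 0.5, 0.25])             # placeholder Fisher information matrix
print(natural_gradient_step(theta, grad, fim))
```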