Details
Name
Vítor Santos Costa
Role
Senior Researcher
Since
1st January 2009
Nationality
Portugal
Centre
Advanced Computing Systems
Contacts
+351220402963
vitor.s.costa@inesctec.pt
2024
Authors
Moreno, P; Areias, M; Rocha, R; Costa, VS;
Publication
INTERNATIONAL JOURNAL OF PARALLEL PROGRAMMING
Abstract
Prolog systems rely on an atom table for symbol management, which is usually implemented as a dynamically resizable hash table. This is ideal for single-threaded execution, but can become a bottleneck in a multi-threaded scenario. In this work, we replace the original atom table implementation in the YAP Prolog system with a lock-free hash-based data structure, named Lock-free Hash Tries (LFHT), in order to provide efficient and scalable symbol management. Being lock-free, the new implementation also provides better guarantees, namely, immunity to priority inversion, deadlocks and livelocks. Performance results show that the new lock-free LFHT implementation performs better in single-threaded execution and scales much better than the original lock-based dynamically resizing hash table.
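For illustration only, here is a minimal C11 sketch of the kind of lock-free bucket insertion such a structure relies on. It is not the YAP/LFHT code: the names (atom_node, bucket, bucket_insert) are hypothetical, and it omits LFHT's hash-level expansion as well as the post-CAS re-check a real implementation needs to rule out two threads registering the same atom concurrently.

/* Hypothetical sketch, not the YAP/LFHT source: lock-free insertion of a
 * new atom node at the head of a hash bucket using compare-and-swap. */
#include <stdatomic.h>
#include <stdlib.h>
#include <string.h>

typedef struct atom_node {
    const char *name;            /* atom symbol */
    struct atom_node *next;      /* next node in the bucket chain */
} atom_node;

/* One bucket: an atomic pointer to the head of a chain of nodes. */
typedef _Atomic(atom_node *) bucket;

/* Insert name into bucket b unless an equal atom is already present;
 * return the node that ends up representing the atom. */
atom_node *bucket_insert(bucket *b, const char *name) {
    atom_node *fresh = malloc(sizeof *fresh);
    fresh->name = name;

    for (;;) {
        atom_node *head = atomic_load(b);

        /* Scan the current chain for a duplicate first. */
        for (atom_node *n = head; n != NULL; n = n->next)
            if (strcmp(n->name, name) == 0) {
                free(fresh);     /* another thread already registered it */
                return n;
            }

        /* Try to publish the new node as the new head; if another thread
         * changed the head meanwhile, the CAS fails and we retry. */
        fresh->next = head;
        if (atomic_compare_exchange_weak(b, &head, fresh))
            return fresh;
    }
}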
2024
Authors
Rocha, FM; Dutra, I; Costa, VS;
Publication
CoRR
Abstract
2024
Authors
Barbosa, J; Florido, M; Costa, VS;
Publication
CoRR
Abstract
2023
Authors
Machado, D; Costa, VS; Brandão, P;
Publication
Proceedings of the 16th International Joint Conference on Biomedical Engineering Systems and Technologies, BIOSTEC 2023, Volume 5: HEALTHINF, Lisbon, Portugal, February 16-18, 2023.
Abstract
2022
Authors
Guimaraes, V; Costa, VS;
Publication
INDUCTIVE LOGIC PROGRAMMING (ILP 2021)
Abstract
In this paper, we present two online structure learning algorithms for NeuralLog: NeuralLog+OSLR and NeuralLog+OMIL. NeuralLog is a system that compiles first-order logic programs into neural networks. Both learning algorithms are based on the Online Structure Learner by Revision (OSLR). NeuralLog+OSLR is a port of OSLR that uses NeuralLog as its inference engine, while NeuralLog+OMIL uses the underlying mechanism of OSLR but with a revision operator based on Meta-Interpretive Learning. We compared both systems with OSLR and RDN-Boost on link prediction in three different datasets: Cora, UMLS and UWCSE. Our experiments showed that NeuralLog+OMIL outperforms both compared systems on three of the four target relations from the Cora dataset and on the UMLS dataset, while both NeuralLog+OSLR and NeuralLog+OMIL outperform OSLR and RDN-Boost on UWCSE, assuming a good initial theory is provided.
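As a rough, hypothetical illustration of the general idea of compiling logic rules into network operations (this is not NeuralLog's actual mechanism or code), the C sketch below scores the rule grandparent(X,Z) :- parent(X,Y), parent(Y,Z) by composing the adjacency matrix of an assumed parent/2 predicate with itself; all names and data are made up.

/* Hypothetical sketch: represent a binary predicate as an adjacency matrix
 * over entities and compose the body literals by matrix multiplication. */
#include <stdio.h>

#define N 4  /* number of entities (illustrative) */

/* out[x][z] accumulates evidence that some y links x to y and y to z. */
static void compose(const double a[N][N], const double b[N][N], double out[N][N]) {
    for (int x = 0; x < N; x++)
        for (int z = 0; z < N; z++) {
            out[x][z] = 0.0;
            for (int y = 0; y < N; y++)
                out[x][z] += a[x][y] * b[y][z];
        }
}

int main(void) {
    /* parent/2 facts over entities 0..3: parent(0,1), parent(1,2), parent(2,3). */
    double parent[N][N] = {
        {0, 1, 0, 0},
        {0, 0, 1, 0},
        {0, 0, 0, 1},
        {0, 0, 0, 0},
    };
    double grandparent[N][N];

    compose(parent, parent, grandparent);

    /* grandparent(0,2) and grandparent(1,3) now score 1. */
    printf("grandparent(0,2) = %.1f\n", grandparent[0][2]);
    printf("grandparent(1,3) = %.1f\n", grandparent[1][3]);
    return 0;
}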
Supervised Theses
2023
Author
Christopher David Harrison
Institution
UP-FCUP
2023
Author
Diogo Roberto de Melo e Diogo Machado
Institution
UP-FCUP
2023
Author
Filipe Emanuel dos Santos Marinho da Rocha
Institution
UP-FCUP
2023
Author
João Luis Alves Barbosa
Institution
UP-FCUP