Publications

Publications by Pavel Brazdil

2005

Lecture Notes in Computer Science (including subseries Lecture Notes in Artificial Intelligence and Lecture Notes in Bioinformatics): Preface

Authors
Jorge, A; Torgo, L; Brazdil, P; Camacho, R; Gama, J;

Publication
Lecture Notes in Computer Science (including subseries Lecture Notes in Artificial Intelligence and Lecture Notes in Bioinformatics)


2023

AfriSenti: A Twitter Sentiment Analysis Benchmark for African Languages

Authors
Muhammad, SH; Abdulmumin, I; Ayele, AA; Ousidhoum, N; Adelani, DI; Yimam, SM; Ahmad, IS; Beloucif, M; Mohammad, S; Ruder, S; Hourrane, O; Brazdil, P; António Ali, FDM; David, D; Osei, S; Bello, BS; Ibrahim, F; Gwadabe, T; Rutunda, S; Belay, TD; Messelle, WB; Balcha, HB; Chala, SA; Gebremichael, HT; Opoku, B; Arthur, S;

Publication
CoRR


2018

Incremental Sparse TFIDF & Incremental Similarity with Bipartite Graphs

Authors
Sarmento, RP; Brazdil, P;

Publication
CoRR


2022

Contextualization for the Organization of Text Documents Streams

Authors
Sarmento, RP; Cardoso, DdO; Gama, J; Brazdil, P;

Publication
CoRR


2018

Dynamic Laplace: Efficient Centrality Measure for Weighted or Unweighted Evolving Networks

Authors
Cordeiro, M; Sarmento, RP; Brazdil, P; Gama, J;

Publication
CoRR


2023

Exploring the Reduction of Configuration Spaces of Workflows

Authors
Freitas, F; Brazdil, P; Soares, C;

Publication
Discovery Science - 26th International Conference, DS 2023, Porto, Portugal, October 9-11, 2023, Proceedings

Abstract
Many current AutoML platforms include a very large space of alternatives (the configuration space) that make it difficult to identify the best alternative for a given dataset. In this paper we explore a method that can reduce a large configuration space to a significantly smaller one and so help to reduce the search time for the potentially best workflow. We empirically validate the method on a set of workflows that include four ML algorithms (SVM, RF, LogR and LD) with different sets of hyperparameters. Our results show that it is possible to reduce the given space by more than one order of magnitude, from a few thousands to tens of workflows, while the risk that the best workflow is eliminated is nearly zero. The system after reduction is about one order of magnitude faster than the original one, but still maintains the same predictive accuracy and loss. © 2023, The Author(s), under exclusive license to Springer Nature Switzerland AG.
