
Publications by Pavel Brazdil

2005

Lecture Notes in Computer Science (including subseries Lecture Notes in Artificial Intelligence and Lecture Notes in Bioinformatics): Preface

Authors
Jorge, A; Torgo, L; Brazdil, P; Camacho, R; Gama, J;

Publication
Lecture Notes in Computer Science (including subseries Lecture Notes in Artificial Intelligence and Lecture Notes in Bioinformatics)


2023

AfriSenti: A Twitter Sentiment Analysis Benchmark for African Languages

Authors
Muhammad, SH; Abdulmumin, I; Ayele, AA; Ousidhoum, N; Adelani, DI; Yimam, SM; Ahmad, IS; Beloucif, M; Mohammad, S; Ruder, S; Hourrane, O; Brazdil, P; António Ali, FDM; David, D; Osei, S; Bello, BS; Ibrahim, F; Gwadabe, T; Rutunda, S; Belay, TD; Messelle, WB; Balcha, HB; Chala, SA; Gebremichael, HT; Opoku, B; Arthur, S;

Publication
CoRR


2018

Incremental Sparse TFIDF & Incremental Similarity with Bipartite Graphs

Authors
Sarmento, RP; Brazdil, P;

Publication
CoRR


2022

Contextualization for the Organization of Text Documents Streams

Authors
Sarmento, RP; Cardoso, DdO; Gama, J; Brazdil, P;

Publication
CoRR


2018

Dynamic Laplace: Efficient Centrality Measure for Weighted or Unweighted Evolving Networks

Authors
Cordeiro, M; Sarmento, RP; Brazdil, P; Gama, J;

Publication
CoRR


2023

Exploring the Reduction of Configuration Spaces of Workflows

Authors
Freitas, F; Brazdil, P; Soares, C;

Publication
Discovery Science - 26th International Conference, DS 2023, Porto, Portugal, October 9-11, 2023, Proceedings

Abstract
Many current AutoML platforms include a very large space of alternatives (the configuration space) that makes it difficult to identify the best alternative for a given dataset. In this paper we explore a method that can reduce a large configuration space to a significantly smaller one and so help to reduce the search time for the potentially best workflow. We empirically validate the method on a set of workflows that include four ML algorithms (SVM, RF, LogR and LD) with different sets of hyperparameters. Our results show that it is possible to reduce the given space by more than one order of magnitude, from a few thousand to tens of workflows, while the risk that the best workflow is eliminated is nearly zero. The system after reduction is about one order of magnitude faster than the original one, but still maintains the same predictive accuracy and loss. © 2023, The Author(s), under exclusive license to Springer Nature Switzerland AG.
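The reduction idea described in the abstract can be illustrated with a minimal sketch. This is not the paper's algorithm: it assumes (hypothetically) that each workflow has an estimated accuracy on a set of benchmark datasets, and reduces the space by keeping the union of the per-dataset top-k workflows, so that any workflow that performs well somewhere survives the pruning. All names and numbers below are illustrative.

```python
def reduce_configuration_space(performance, k=2):
    """Reduce a workflow configuration space (illustrative sketch).

    performance: dict mapping dataset name -> {workflow name: accuracy}.
    Returns the union of the top-k workflows per dataset, so workflows
    that are never competitive on any dataset are dropped.
    """
    kept = set()
    for dataset, scores in performance.items():
        top_k = sorted(scores, key=scores.get, reverse=True)[:k]
        kept.update(top_k)
    return kept

# Hypothetical accuracy estimates for four workflows on two datasets.
perf = {
    "dataset_1": {"svm_a": 0.91, "rf_a": 0.88, "logr_a": 0.75, "ld_a": 0.70},
    "dataset_2": {"svm_a": 0.60, "rf_a": 0.85, "logr_a": 0.83, "ld_a": 0.64},
}

reduced = reduce_configuration_space(perf, k=2)
# "ld_a" never ranks in the top 2 on any dataset, so it is pruned;
# the other three each rank highly on at least one dataset and are kept.
```

On real platforms the full space can contain thousands of workflows, so even a coarse filter of this kind shrinks the subsequent per-dataset search substantially; the risk of pruning the true best workflow depends on how well the accuracy estimates transfer to the new dataset.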
