Details
Name
Carlos Guedes
Role
External Research Collaborator
Since
5th November 2007
Nationality
Portugal
Centre
Telecommunications and Multimedia
Contacts
+351222094299
carlos.guedes@inesctec.pt
2018
Authors
Sioros, G; Davies, MEP; Guedes, C;
Publication
JOURNAL OF NEW MUSIC RESEARCH
Abstract
We present a novel model for the characterization of musical rhythms that is based on the pervasive rhythmic phenomenon of syncopation. Syncopation is felt when the sensation of the regular beat or pulse in the music is momentarily disrupted; the feeling arises from breaking more expected patterns such as pickups (anacrusis) and faster events that introduce and bridge the notes articulated on the beats. Our model begins with a simple pattern that articulates a beat consistent with the metrical expectations of a listener. Any rhythm is then generated from a unique combination of transformations applied on that simple pattern. Each transformation introduces notes in off-beat positions as one of three basic characteristic elements: (1) syncopations, (2) pickup rhythmic figures and (3) faster notes that articulate a subdivision of the beat. The characterization of a pattern is based on an algorithm that discovers and reverses the transformations in a stepwise manner. We formalize the above transformations and present the characterization algorithm, and then demonstrate and discuss the model through the analysis of the main rhythmic pattern of the song "Don't Stop 'Til You Get Enough" by Michael Jackson.
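A minimal sketch of one transformation-reversal ("de-syncopation") step on a binary onset pattern, illustrating the kind of stepwise reversal the abstract describes: an onset in a weak metrical position whose following stronger position is silent is shifted forward onto that position. The 16-step metrical-weight template, the pattern encoding and the detection rule are illustrative assumptions, not the published algorithm.

```python
# Assumed 16-step, 4/4 metrical-weight template (higher = metrically stronger).
METRICAL_WEIGHTS = [4, 1, 2, 1, 3, 1, 2, 1, 4, 1, 2, 1, 3, 1, 2, 1]

def next_stronger(i, weights):
    """Index of the first position after i (cyclically) with a higher metrical weight."""
    n = len(weights)
    for step in range(1, n + 1):
        j = (i + step) % n
        if weights[j] > weights[i]:
            return j
    return i  # no stronger position exists (i is already maximal)

def desyncopate_once(pattern, weights=METRICAL_WEIGHTS):
    """Reverse the first detected syncopation: move an off-beat onset forward
    onto the silent, stronger position it anticipates."""
    out = list(pattern)
    for i, onset in enumerate(out):
        if onset:
            j = next_stronger(i, weights)
            if j != i and not out[j]:
                out[i], out[j] = 0, 1  # shift the onset onto the beat
                return out
    return out  # nothing to reverse: the pattern already articulates the beat

# Example: the onset at step 3 anticipates the silent beat at step 4.
print(desyncopate_once([1, 0, 0, 1, 0, 0, 0, 0, 1, 0, 0, 0, 1, 0, 0, 0]))
# -> [1, 0, 0, 0, 1, 0, 0, 0, 1, 0, 0, 0, 1, 0, 0, 0]
```

Repeatedly applying such a step until no syncopation remains mirrors, in spirit, the stepwise discovery and reversal of transformations described in the abstract.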
2017
Authors
Bernardes, G; Davies, MEP; Guedes, C;
Publication
2017 IEEE INTERNATIONAL CONFERENCE ON ACOUSTICS, SPEECH AND SIGNAL PROCESSING (ICASSP)
Abstract
In this paper we present the INESC Key Detection (IKD) system which incorporates a novel method for dynamically biasing key mode estimation using the spatial displacement of beat-synchronous Tonal Interval Vectors (TIVs). We evaluate the performance of the IKD system at finding the global key on three annotated audio datasets and using three key-defining profiles. Results demonstrate the effectiveness of the mode bias in favoring either the major or minor mode, thus allowing users to fine tune this variable to improve correct key estimates on style-specific music datasets or to balance predictions across key modes on unknown input sources.
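A simplified sketch of global key estimation with a tunable mode bias, in the spirit of the description above. The actual IKD system operates on beat-synchronous Tonal Interval Vectors; here, correlation of an aggregated chroma vector with the Krumhansl-Kessler key profiles stands in for the key-defining profiles, and the additive `mode_bias` parameter is an assumption used only to illustrate biasing the major/minor decision.

```python
import numpy as np

# Krumhansl-Kessler key profiles, defined with the tonic at index 0.
MAJOR = np.array([6.35, 2.23, 3.48, 2.33, 4.38, 4.09, 2.52, 5.19, 2.39, 3.66, 2.29, 2.88])
MINOR = np.array([6.33, 2.68, 3.52, 5.38, 2.60, 3.53, 2.54, 4.75, 3.98, 2.69, 3.34, 3.17])

def estimate_key(chroma, mode_bias=0.0):
    """Return (tonic_pitch_class, mode) maximizing profile correlation.
    mode_bias > 0 favors minor readings, mode_bias < 0 favors major readings."""
    chroma = np.asarray(chroma, dtype=float)
    best = None
    for tonic in range(12):
        for mode, profile in (("major", MAJOR), ("minor", MINOR)):
            score = np.corrcoef(chroma, np.roll(profile, tonic))[0, 1]
            if mode == "minor":
                score += mode_bias  # bias the mode decision before comparison
            if best is None or score > best[0]:
                best = (score, tonic, mode)
    return best[1], best[2]
```

Sweeping `mode_bias` over a validation set is one simple way to realize the style-specific tuning mentioned in the abstract.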
2017
Authors
Bernardes, G; Davies, MEP; Guedes, C;
Publication
Music Technology with Swing - 13th International Symposium, CMMR 2017, Matosinhos, Portugal, September 25-28, 2017, Revised Selected Papers
Abstract
We present a hierarchical harmonic mixing method for assisting users in the process of music mashup creation. Our main contributions are metrics for computing the harmonic compatibility between musical audio tracks at small- and large-scale structural levels, which combine and reassess existing perceptual relatedness (i.e., chroma vector similarity and key affinity) and dissonance-based approaches. Underpinning our harmonic compatibility metrics are harmonic indicators from the perceptually-motivated Tonal Interval Space, which we adapt to describe musical audio. An interactive visualization shows hierarchical harmonic compatibility viewpoints across all tracks in a large musical audio collection. An evaluation of our harmonic mixing method shows that our adaptation of the Tonal Interval Space robustly describes harmonic attributes of musical instrument sounds irrespective of timbral differences and demonstrates that the harmonic compatibility metrics comply with the principles embodied in Western tonal harmony to a greater extent than previous approaches.
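An illustrative sketch of a small-scale harmonic compatibility score between two segments represented as Tonal Interval Vectors (see the TIS sketch under the 2016 Journal of New Music Research entry below). Blending a cosine-style relatedness term with the consonance of the summed vectors, and the `alpha` weighting, are assumptions for demonstration, not the published metrics.

```python
import numpy as np

def harmonic_compatibility(tiv_a, tiv_b, alpha=0.5):
    """Blend perceptual relatedness (cosine similarity between the two TIVs)
    with the consonance of the mixture (norm of the summed TIVs relative to
    the individual norms, larger = more consonant)."""
    relatedness = np.abs(np.vdot(tiv_a, tiv_b)) / (np.linalg.norm(tiv_a) * np.linalg.norm(tiv_b))
    mix_consonance = np.linalg.norm(tiv_a + tiv_b) / (np.linalg.norm(tiv_a) + np.linalg.norm(tiv_b))
    return alpha * relatedness + (1 - alpha) * mix_consonance
```

Both terms lie in [0, 1], so `alpha` simply trades off relatedness against the consonance of mixing the two segments together.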
2016
Authors
Bernardes, G; Cocharro, D; Caetano, M; Guedes, C; Davies, MEP;
Publication
JOURNAL OF NEW MUSIC RESEARCH
Abstract
In this paper we present a 12-dimensional tonal space in the context of the Tonnetz, Chew's Spiral Array, and Harte's 6-dimensional Tonal Centroid Space. The proposed Tonal Interval Space is calculated as the weighted Discrete Fourier Transform of normalized 12-element chroma vectors, which we represent as six circles covering the set of all possible pitch intervals in the chroma space. By weighting the contribution of each circle (and hence pitch interval) independently, we can create a space in which angular and Euclidean distances among pitches, chords, and regions concur with music theory principles. Furthermore, the Euclidean distance of pitch configurations from the centre of the space acts as an indicator of consonance.
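Since the construction is stated directly in the abstract (a weighted DFT of a normalized 12-element chroma vector, keeping the six coefficients that correspond to the interval circles), a small sketch may help make it concrete. The circle weights below are placeholders, not the values calibrated in the paper.

```python
import numpy as np

# Illustrative placeholder weights, one per interval circle (k = 1..6).
ILLUSTRATIVE_WEIGHTS = np.ones(6)

def tonal_interval_vector(chroma, weights=ILLUSTRATIVE_WEIGHTS):
    """Map a 12-element chroma vector to the 12-D Tonal Interval Space
    (6 complex DFT coefficients = 12 real dimensions)."""
    chroma = np.asarray(chroma, dtype=float)
    chroma = chroma / chroma.sum()        # normalize the chroma vector
    spectrum = np.fft.fft(chroma)         # DFT of the normalized chroma
    return weights * spectrum[1:7]        # keep coefficients k = 1..6, one per circle

# Example: a C major triad as a binary chroma vector (C, E, G).
c_major = np.zeros(12)
c_major[[0, 4, 7]] = 1
tiv = tonal_interval_vector(c_major)
consonance_indicator = np.linalg.norm(tiv)  # distance from the centre of the space
```

Angular and Euclidean distances between such vectors are then used as the pitch-relatedness measures the abstract refers to.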
2016
Authors
Bernardes, G; Cocharro, D; Guedes, C; Davies, MEP;
Publication
Music, Mind, and Embodiment
Abstract
We present Conchord, a system for real-time automatic generation of musical harmony through navigation in a novel 12-dimensional Tonal Interval Space. In this tonal space, angular and Euclidean distances among vectors representing multi-level pitch configurations equate with music theory principles, and vector norms act as an indicator of consonance. Building upon these attributes, users can intuitively and dynamically define a collection of chords based on their relation to a tonal center (or key) and their consonance level. Furthermore, two algorithmic strategies grounded in principles from function and root-motion harmonic theories allow the generation of chord progressions characteristic of Western tonal music.
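A minimal sketch of ranking candidate chords by proximity to a tonal centre and by consonance, in the spirit of the description above. The `default_tiv` stand-in, the binary chord encoding and the `beta` blend are illustrative assumptions, not Conchord's actual selection or voice-leading logic.

```python
import numpy as np

def default_tiv(chroma):
    """Unweighted stand-in for a chroma-to-TIV mapping (see the TIS sketch in the previous entry)."""
    c = np.asarray(chroma, dtype=float)
    return np.fft.fft(c / c.sum())[1:7]

def rank_chords(candidate_chromas, key_chroma, tiv_fn=default_tiv, beta=0.5):
    """Return candidate indices sorted from most to least suitable."""
    key_tiv = tiv_fn(key_chroma)
    scores = []
    for chroma in candidate_chromas:
        tiv = tiv_fn(chroma)
        closeness = -np.linalg.norm(tiv - key_tiv)  # nearer the tonal centre = better
        consonance = np.linalg.norm(tiv)            # farther from the centre of the space = more consonant
        scores.append(beta * closeness + (1 - beta) * consonance)
    return sorted(range(len(scores)), key=lambda i: scores[i], reverse=True)

def pcset(*pcs):
    """Binary chroma vector for a set of pitch classes."""
    v = np.zeros(12)
    v[list(pcs)] = 1.0
    return v

# Example: rank C major, A minor and B diminished triads against a C major scale centre.
candidates = [pcset(0, 4, 7), pcset(9, 0, 4), pcset(11, 2, 5)]
key = pcset(0, 2, 4, 5, 7, 9, 11)
print(rank_chords(candidates, key))
```

Exposing `beta` (or separate thresholds on closeness and consonance) is one way a user-facing control over chord collections, as described in the abstract, could be realized.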
Supervised Thesis
2020
Author
Diogo Miguel Filipe Cocharro
Institution
IPP-ESMAE
2018
Author
Eduardo Miguel Campos Magalhães
Institution
IPP-ESMAE
2017
Author
Gustavo Miguel Beça Rodrigues da Costa
Institution
IPP-ESMAE
2017
Author
Rodrigo Guerreiro Vaz Guedes de Carvalho
Institution
IPP-ESMAE
2017
Author
Rui Miguel Silva Sampaio Dias
Institution
IPP-ESMAE