2017
Authors
Jin, CT; Davies, MEP; Campisi, P;
Publication
IEEE SIGNAL PROCESSING MAGAZINE
Abstract
Foot-tapping and moving to music is such a natural human activity that one may assume that feeling the beat in music is a simple task. Feeling the beat and then producing it, e.g., by foot tapping, is an intrinsically real-time process. As listeners, we do not wait for the beat to occur before tapping our foot; instead, we make predictions about when the next beat in the music will occur and continually revise our sense of the beat based on the accuracy of our predictions. Likewise, performing musicians have a shared sense of the beat, which is what allows them to play in time together. © 2017 IEEE.
2018
Authors
Sioros, G; Davies, MEP; Guedes, C;
Publication
JOURNAL OF NEW MUSIC RESEARCH
Abstract
We present a novel model for the characterization of musical rhythms that is based on the pervasive rhythmic phenomenon of syncopation. Syncopation is felt when the sensation of the regular beat or pulse in the music is momentarily disrupted; the feeling arises from breaking otherwise expected patterns such as pickups (anacrusis) and faster events that introduce and bridge the notes articulated on the beats. Our model begins with a simple pattern that articulates a beat consistent with the metrical expectations of a listener. Any rhythm is then generated from a unique combination of transformations applied to that simple pattern. Each transformation introduces notes in off-beat positions as one of three basic characteristic elements: (1) syncopations, (2) pickup rhythmic figures and (3) faster notes that articulate a subdivision of the beat. The characterization of a pattern is based on an algorithm that discovers and reverses the transformations in a stepwise manner. We formalize the above transformations and present the characterization algorithm, and then demonstrate and discuss the model through the analysis of the main rhythmic pattern of the song 'Don't Stop 'Til You Get Enough' by Michael Jackson.
2015
Authors
Hockman, JA; Davies, MEP;
Publication
DAFx 2015 - Proceedings of the 18th International Conference on Digital Audio Effects
Abstract
The dance music genres of hardcore, jungle and drum and bass (HJDB) emerged in the United Kingdom during the early 1990s as a result of affordable consumer sampling technology and the popularity of rave music and culture. A key attribute of these genres is their usage of fast-paced drums known as breakbeats. Automated analysis of breakbeat usage in HJDB would allow for novel digital audio effects and musicological investigation of the genres. An obstacle in this regard is the automated identification of breakbeats used in HJDB music. This paper compares three strategies for breakbeat detection: (1) a generalised frame-based music classification scheme; (2) a specialised system that segments drums from the audio signal and labels them with an SVM classifier; (3) an alternative specialised approach using a deep network classifier. The results of our evaluations demonstrate the superiority of the specialised approaches, and highlight the need for style-specific workflows in the determination of particular musical attributes in idiosyncratic genres. We then leverage the output of the breakbeat classification system to produce an automated breakbeat sequence reconstruction, ultimately recreating the HJDB percussion arrangement.
2017
Authors
Bernardes, G; Davies, MEP; Guedes, C;
Publication
Music Technology with Swing - 13th International Symposium, CMMR 2017, Matosinhos, Portugal, September 25-28, 2017, Revised Selected Papers
Abstract
We present a hierarchical harmonic mixing method for assisting users in the process of music mashup creation. Our main contributions are metrics for computing the harmonic compatibility between musical audio tracks at small- and large-scale structural levels, which combine and reassess existing perceptual relatedness (i.e., chroma vector similarity and key affinity) and dissonance-based approaches. Underpinning our harmonic compatibility metrics are harmonic indicators from the perceptually-motivated Tonal Interval Space, which we adapt to describe musical audio. An interactive visualization shows hierarchical harmonic compatibility viewpoints across all tracks in a large musical audio collection. An evaluation of our harmonic mixing method shows our adaption of the Tonal Interval Space robustly describes harmonic attributes of musical instrument sounds irrespective of timbral differences and demonstrates that the harmonic compatibility metrics comply with the principles embodied in Western tonal harmony to a greater extent than previous approaches. © 2018, Springer Nature Switzerland AG.
2018
Authors
Aramaki, M; Davies, MEP; Kronland-Martinet, R; Ystad, S;
Publication
Lecture Notes in Computer Science (including subseries Lecture Notes in Artificial Intelligence and Lecture Notes in Bioinformatics)
Abstract
2018
Authors
Aramaki, M; Davies, MEP; Kronland-Martinet, R; Ystad, S;
Publication
CMMR
Abstract