2009
Authors
Davies, MEP; Plumbley, MD; Eck, D;
Publication
IEEE Workshop on Applications of Signal Processing to Audio and Acoustics
Abstract
We present a new method for generating input features for musical audio beat tracking systems. To emphasise periodic structure we derive a weighted linear combination of sub-band onset detection functions driven by a measure of sub-band beat strength. Results demonstrate improved performance over existing state-of-the-art models, in particular for musical excerpts with a steady tempo. ©2009 IEEE.
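The core idea of the abstract can be sketched in a few lines: weight each sub-band onset detection function (ODF) by how strongly periodic it is, then sum. This is a minimal illustration, assuming pre-computed sub-band ODFs and a simple autocorrelation-peak beat-strength measure; the paper's exact formulation may differ.

```python
import numpy as np

def beat_strength(odf, min_lag=20, max_lag=120):
    """Crude beat-strength measure: peak of the normalised
    autocorrelation of an onset detection function within a
    plausible beat-period lag range (illustrative choice)."""
    odf = odf - odf.mean()
    ac = np.correlate(odf, odf, mode="full")[len(odf) - 1:]
    if ac[0] <= 0:
        return 0.0
    ac = ac / ac[0]
    return float(ac[min_lag:max_lag].max())

def combine_subband_odfs(subband_odfs):
    """Weighted linear combination of sub-band ODFs, with weights
    driven by each band's beat strength (a sketch, not the
    authors' exact method)."""
    weights = np.array([beat_strength(o) for o in subband_odfs])
    if weights.sum() == 0:
        weights = np.ones(len(subband_odfs))
    weights = weights / weights.sum()
    return np.tensordot(weights, np.array(subband_odfs), axes=1)
```

A strongly periodic band thus dominates the combined feature, emphasising the periodic structure that the beat tracker relies on.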
2007
Authors
McKinney, MF; Moelants, D; Davies, MEP; Klapuri, A;
Publication
JOURNAL OF NEW MUSIC RESEARCH
Abstract
This is an extended analysis of eight different algorithms for musical tempo extraction and beat tracking. The algorithms participated in the 2006 Music Information Retrieval Evaluation eXchange (MIREX), where they were evaluated using a set of 140 musical excerpts, each with beats annotated by 40 different listeners. Performance metrics were constructed to measure the algorithms' abilities to predict the most perceptually salient musical beats and tempi of the excerpts. Detailed results of the evaluation are presented here and algorithm performance is evaluated as a function of musical genre, the presence of percussion, musical meter and the most salient perceptual tempo of each excerpt.
2012
Authors
Streeter, E; Davies, MEP; Reiss, JD; Hunt, A; Caley, R; Roberts, C;
Publication
ARTS IN PSYCHOTHERAPY
Abstract
Research indicates that music therapists are likely to make use of computer software designed to measure changes in the way a patient and therapist make use of music in music therapy sessions. A proof of concept study investigated whether music analysis algorithms (designed to retrieve information from commercial music recordings) can be adapted to meet the needs of music therapists. Computational music analysis techniques were applied to multi-track audio recordings of simulated sessions, then to recordings of individual music therapy sessions: these were recorded by a music therapist as part of her ongoing practice with patients with acquired brain injury. The music therapist wanted to evaluate two hypotheses: one, whether changes in her tempo were affecting the tempo of a patient's play on acoustic percussion instruments, and two, whether her musical interventions were helping the patient reduce habituated, rhythmic patterning. Automatic diagrams were generated that gave a quick overview of the instrumental activity contained within each session: when, and for how long, each instrument was played. From these, computational analysis was applied to musical areas of specific interest. The results of the interdisciplinary teamwork, audio recording tests, computer analysis tests, and music therapy field tests are presented and discussed.
2005
Authors
Davies, MEP; Brossier, PM; Plumbley, MD;
Publication
Audio Engineering Society - 118th Convention Spring Preprints 2005
Abstract
In this paper we address the issue of causal rhythmic analysis, primarily towards predicting the locations of musical beats such that they are consistent with a musical audio input. This will be a key component required for a system capable of automatic accompaniment with a live musician. We are implementing our approach as part of the aubio real-time audio library. While performance for this causal system is reduced in comparison to our previous non-causal system, it is still suitable for our intended purpose.
2007
Authors
Stark, AM; Plumbley, MD; Davies, MEP;
Publication
Audio Engineering Society - 122nd Audio Engineering Society Convention 2007
Abstract
We present a new class of digital audio effects which can automatically relate parameter values to the tempo of a musical input in real-time. Using a beat tracking system as the front end, we demonstrate a tempo-dependent delay effect and a set of beat-synchronous low frequency oscillator (LFO) effects including auto-wah, tremolo and vibrato. The effects show better performance than might be expected as they are blind to certain beat tracker errors. All effects are implemented as VST plug-ins which operate in real-time, enabling their use both in live musical performance and the off-line modification of studio recordings.
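The principle behind these effects is to derive the modulation rate from the tracked tempo rather than from a free-running control. A minimal tremolo sketch along those lines follows; the parameter names are illustrative and not those of the authors' VST plug-ins, and the front-end beat tracker is assumed to have already supplied the tempo.

```python
import numpy as np

def beat_synchronous_tremolo(x, sr, tempo_bpm, depth=0.5, cycles_per_beat=1.0):
    """Tremolo whose LFO rate is locked to the tracked tempo:
    one modulation cycle per beat by default (hypothetical
    parameterisation for illustration)."""
    beat_period_s = 60.0 / tempo_bpm          # seconds per beat
    lfo_hz = cycles_per_beat / beat_period_s  # LFO frequency in Hz
    t = np.arange(len(x)) / sr
    lfo = 0.5 * (1.0 + np.sin(2.0 * np.pi * lfo_hz * t))  # range 0..1
    gain = 1.0 - depth * lfo                  # amplitude modulation
    return x * gain
```

Because the LFO rate depends only on the tempo estimate, an octave error in the beat tracker (double or half tempo) still yields a musically related modulation rate, which is one reason such effects can be blind to certain beat tracker errors.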
2007
Authors
Jacobson, K; Davies, M; Sandler, M;
Publication
Audio Engineering Society - 123rd Audio Engineering Society Convention 2007
Abstract
Music information retrieval encompasses a complex and diverse set of problems. Some recent work has focused on automatic textual annotation of audio data, paralleling work in image retrieval. Here we take a narrower approach to the automatic textual annotation of music signals and focus on rhythmic style. Training data for rhythmic styles are derived from simple, precisely labeled drum loops intended for content creation. These loops are already textually annotated with the rhythmic style they represent. The training loops are then compared against a database of music content to apply textual annotations of rhythmic style to unheard music signals. Three distinct methods of rhythmic analysis are explored. These methods are tested on a small collection of electronic dance music resulting in a labeling accuracy of 73%.
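The labelling step described above amounts to comparing each unlabelled track against the annotated drum loops and propagating the style label of the closest match. Below is a generic nearest-neighbour sketch of that idea, assuming some fixed-length rhythmic feature vector per item; the paper explores three distinct rhythmic-analysis methods, none of which is reproduced here.

```python
import numpy as np

def label_rhythmic_style(track_feature, loop_features, loop_labels):
    """Assign a rhythmic-style label to an unheard track by 1-NN
    comparison against textually annotated drum loops (a sketch,
    not the paper's specific distance measures)."""
    dists = np.linalg.norm(loop_features - track_feature, axis=1)
    return loop_labels[int(np.argmin(dists))]
```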