Details
Name
Juan Gabriel Colonna
Role
External Research Collaborator
Since
1st November 2015
Nationality
Argentina
Centre
Artificial Intelligence and Decision Support
Contacts
+351220402963
juan.g.colonna@inesctec.pt
2016
Authors
Colonna, J; Peet, T; Ferreira, CA; Jorge, AM; Gomes, EF; Gama, J;
Publication
Proceedings of the Ninth International C* Conference on Computer Science & Software Engineering, C3S2E '16, Porto, Portugal, July 20-22, 2016
Abstract
Anurans (frogs and toads) are closely tied to their ecosystems and are commonly used by biologists as early indicators of ecological stress. Automatic classification of anurans, by processing their calls, helps biologists analyze anuran activity on a larger scale. Wireless Sensor Networks (WSNs) can be used to gather data automatically over a large area. WSNs usually impose restrictions on computing and transmission power to extend the network's lifetime. Deep Learning algorithms have gained considerable popularity in recent years, especially in the field of image recognition. Being an eager learner, a trained Deep Learning model does not need much computing power and can be used on hardware with limited resources. This paper investigates the possibility of using Convolutional Neural Networks with Mel-Frequency Cepstral Coefficients (MFCCs) as input for the task of classifying anuran sounds. © 2016 ACM.
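A minimal sketch of the pipeline described in the abstract, assuming librosa for MFCC extraction and Keras (TensorFlow) for the convolutional network; the number of coefficients, frame length, class count and file handling are illustrative assumptions, not values taken from the paper.

# Sketch: CNN over MFCC "images" for anuran call classification.
# Assumptions (not from the paper): librosa/TensorFlow available,
# 20 MFCCs, fixed-length syllables of 44 frames, 10 species.
import numpy as np
import librosa
import tensorflow as tf

N_MFCC, N_FRAMES, N_SPECIES = 20, 44, 10

def syllable_to_mfcc(wav_path):
    """Load one segmented syllable and return an (N_MFCC, N_FRAMES, 1) matrix."""
    y, sr = librosa.load(wav_path, sr=None)
    mfcc = librosa.feature.mfcc(y=y, sr=sr, n_mfcc=N_MFCC)
    mfcc = librosa.util.fix_length(mfcc, size=N_FRAMES, axis=1)  # pad/trim to a fixed width
    return mfcc[..., np.newaxis]

model = tf.keras.Sequential([
    tf.keras.layers.Input(shape=(N_MFCC, N_FRAMES, 1)),
    tf.keras.layers.Conv2D(16, (3, 3), activation="relu"),
    tf.keras.layers.MaxPooling2D((2, 2)),
    tf.keras.layers.Conv2D(32, (3, 3), activation="relu"),
    tf.keras.layers.MaxPooling2D((2, 2)),
    tf.keras.layers.Flatten(),
    tf.keras.layers.Dense(64, activation="relu"),
    tf.keras.layers.Dense(N_SPECIES, activation="softmax"),  # one output per species
])
model.compile(optimizer="adam", loss="sparse_categorical_crossentropy", metrics=["accuracy"])

Training would then proceed on a stack of such MFCC matrices with model.fit; the small network size reflects the abstract's point that an eager, already-trained model can run on resource-limited hardware.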
2016
Authors
Colonna, JG; Gama, J; Nakamura, EF;
Publication
DISCOVERY SCIENCE (DS 2016)
Abstract
In bioacoustic recognition approaches, a "flat" classifier is usually trained to recognize several species of anurans, where the number of classes equals the number of species. Consequently, the complexity of the classification function increases in proportion to the number of species. To avoid this issue, we propose a "hierarchical" approach that decomposes the problem into three taxonomic levels: the family, the genus, and the species level. To accomplish this, we transform the original single-label problem into a multi-dimensional problem (multi-label and multi-class) based on the Linnaean taxonomy. Then, we develop a top-down method using a set of classifiers organized as a hierarchical tree. Thus, it is possible to predict the same set of species as a flat classifier and, additionally, to obtain new information about the samples and their taxonomic relationships. This helps us understand the problem better and draw additional conclusions by inspecting the confusion matrices at the three levels of classification. In addition, we carry out our experiments using Cross-Validation performed by individuals. This form of CV avoids mixing syllables that belong to the same specimen between the training and testing sets, preventing an overestimation of the accuracy and assessing the generalization capabilities of the system. We tested our system on a dataset with sixty individual frogs, from ten different species, eight genera, and four families, achieving a final Micro- and Average-accuracy of 86% and 62%, respectively.
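A compact sketch of the top-down idea under stated assumptions: one classifier at the family level, plus one per family (over its genera) and one per genus (over its species). The class name, the choice of decision trees, and the feature/label arrays are hypothetical illustrations, not the paper's implementation.

# Sketch: top-down hierarchical prediction (family -> genus -> species),
# assuming scikit-learn; taxonomy, features and labels are illustrative.
from sklearn.tree import DecisionTreeClassifier

class TopDownClassifier:
    """Family-level classifier, plus one classifier per family (genus) and per genus (species)."""
    def __init__(self):
        self.family_clf = DecisionTreeClassifier()
        self.genus_clf = {}    # family label -> classifier over its genera
        self.species_clf = {}  # genus label  -> classifier over its species

    def fit(self, X, families, genera, species):
        self.family_clf.fit(X, families)
        for fam in set(families):
            idx = [i for i, f in enumerate(families) if f == fam]
            self.genus_clf[fam] = DecisionTreeClassifier().fit(X[idx], [genera[i] for i in idx])
        for gen in set(genera):
            idx = [i for i, g in enumerate(genera) if g == gen]
            self.species_clf[gen] = DecisionTreeClassifier().fit(X[idx], [species[i] for i in idx])
        return self

    def predict(self, x):
        fam = self.family_clf.predict(x.reshape(1, -1))[0]
        gen = self.genus_clf[fam].predict(x.reshape(1, -1))[0]
        spe = self.species_clf[gen].predict(x.reshape(1, -1))[0]
        return fam, gen, spe  # predictions at all three taxonomic levels

Because each lower-level classifier only sees samples of its parent node, the returned triple exposes where errors occur (family, genus or species), which is the information the abstract reads off the three confusion matrices.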
2016
Authors
Colonna, JG; Gama, J; Nakamura, EF;
Publication
ADVANCES IN ARTIFICIAL INTELLIGENCE, CAEPIA 2016
Abstract
In this work, we introduce a more appropriate (or alternative) approach to evaluate the performance and the generalization capabilities of a framework for automatic anuran call recognition. We show that, when the common k-fold Cross-Validation (k-CV) procedure is used to evaluate the expected error of a syllable-based recognition system, the recognition accuracy is overestimated. To overcome this problem, and to provide a fair evaluation, we propose a new CV procedure in which the specimen information is considered during the split step of the k-CV. We therefore perform a k-CV by specimens (or individuals), showing that the accuracy of the system decreases considerably. By introducing the specimen information, we are able to answer a more fundamental question: given a set of syllables that belongs to a specific group of individuals, can we recognize new specimens of the same species? In this article, we go deeper into the review and the experimental evaluation to answer this question.
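The specimen-aware split corresponds closely to grouped cross-validation; below is a minimal sketch assuming scikit-learn's GroupKFold, where the group array holds the specimen (individual) ID of each syllable. The feature matrix, labels and classifier choice are hypothetical placeholders.

# Sketch: k-CV by specimens, assuming scikit-learn's GroupKFold.
# 'groups' carries the individual (specimen) ID of each syllable, so
# syllables from the same frog never appear in both train and test folds.
import numpy as np
from sklearn.model_selection import GroupKFold
from sklearn.svm import SVC

rng = np.random.default_rng(0)
X = rng.normal(size=(300, 20))          # placeholder MFCC feature vectors
y = rng.integers(0, 10, size=300)       # species label of each syllable
groups = rng.integers(0, 60, size=300)  # specimen ID of each syllable

scores = []
for train_idx, test_idx in GroupKFold(n_splits=5).split(X, y, groups):
    clf = SVC().fit(X[train_idx], y[train_idx])
    scores.append(clf.score(X[test_idx], y[test_idx]))
print("specimen-wise CV accuracy:", np.mean(scores))

With a plain k-CV the same individual can contribute syllables to both sides of a split, which is exactly the leakage the abstract identifies as the source of overestimated accuracy.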