Please use this identifier to cite or link to this item:
https://idr.l4.nitk.ac.in/jspui/handle/123456789/8955
Full metadata record
DC Field | Value | Language |
---|---|---|
dc.contributor.author | Heshi, R. | |
dc.contributor.author | Suma, S.M. | |
dc.contributor.author | Koolagudi, S.G. | |
dc.contributor.author | Bhandari, S. | |
dc.contributor.author | Rao, K.S. | |
dc.date.accessioned | 2020-03-30T10:23:07Z | - |
dc.date.available | 2020-03-30T10:23:07Z | - |
dc.date.issued | 2016 | |
dc.identifier.citation | Smart Innovation, Systems and Technologies, 2016, Vol. 43, pp. 603-609 | en_US |
dc.identifier.uri | http://idr.nitk.ac.in/jspui/handle/123456789/8955 | - |
dc.description.abstract | In this work, an effort has been made to analyze rhythm- and timbre-related features to identify the raga and tala of a piece of Carnatic music. Raga and tala classification is performed using both rhythm and timbre features. Rhythm patterns and a rhythm histogram are used as rhythm features; zero-crossing rate (ZCR), spectral centroid, spectral roll-off, flux, and entropy are used as timbre features. The music clips contain both instrumental and vocal content. A t-test is used as the similarity measure between feature vectors, and classification is performed using Gaussian Mixture Models (GMM). The results show that the rhythm patterns are able to distinguish different ragas and talas with average accuracies of 89.98 % and 86.67 %, respectively. © Springer India 2016. | en_US |
dc.title | Rhythm and timbre analysis for carnatic music processing | en_US |
dc.type | Book chapter | en_US |
Appears in Collections: | 2. Conference Papers |
Files in This Item:
There are no files associated with this item.
Items in DSpace are protected by copyright, with all rights reserved, unless otherwise indicated.
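The abstract above names several frame-level timbre features (ZCR, spectral centroid, spectral roll-off). As an illustration only, the sketch below shows one common way such features are computed from an audio frame with NumPy; the exact frame sizes, windowing, and roll-off percentage used by the paper are not given in this record, so the values here (e.g. an 85 % roll-off threshold) are assumptions.

```python
import numpy as np

def zero_crossing_rate(frame):
    # Fraction of consecutive sample pairs whose signs differ.
    signs = np.sign(frame)
    return np.mean(signs[:-1] != signs[1:])

def spectral_centroid(frame, sr):
    # Magnitude-weighted mean frequency of the frame's spectrum.
    mag = np.abs(np.fft.rfft(frame))
    freqs = np.fft.rfftfreq(len(frame), d=1.0 / sr)
    return np.sum(freqs * mag) / (np.sum(mag) + 1e-12)

def spectral_rolloff(frame, sr, pct=0.85):
    # Frequency below which `pct` of the cumulative spectral
    # magnitude lies (pct=0.85 is a common default, assumed here).
    mag = np.abs(np.fft.rfft(frame))
    freqs = np.fft.rfftfreq(len(frame), d=1.0 / sr)
    cumulative = np.cumsum(mag)
    idx = np.searchsorted(cumulative, pct * cumulative[-1])
    return freqs[min(idx, len(freqs) - 1)]

# Sanity check on a synthetic signal: a pure 440 Hz tone sampled
# for one second should have a spectral centroid near 440 Hz and
# a ZCR near 2 * 440 / sr.
sr = 8000
t = np.arange(sr) / sr
tone = np.sin(2 * np.pi * 440 * t)
print(round(spectral_centroid(tone, sr)), round(zero_crossing_rate(tone), 3))
```

These per-frame values would typically be averaged over a clip to form the timbre feature vector that, per the abstract, is compared with a t-test and classified with a GMM.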