Please use this identifier to cite or link to this item:
https://idr.l4.nitk.ac.in/jspui/handle/123456789/10214
Full metadata record
DC Field | Value | Language |
---|---|---|
dc.contributor.author | Nayak, J. | |
dc.contributor.author | Bhat, P.S. | |
dc.contributor.author | Acharya, R. | |
dc.contributor.author | Aithal, U.V. | |
dc.date.accessioned | 2020-03-31T08:18:44Z | - |
dc.date.available | 2020-03-31T08:18:44Z | - |
dc.date.issued | 2005 | |
dc.identifier.citation | ITBM-RBM, 2005, Vol.26, pp.319-327 | en_US |
dc.identifier.uri | http://idr.nitk.ac.in/jspui/handle/123456789/10214 | - |
dc.description.abstract | Analysis of speech has become a popular non-invasive tool for assessing speech abnormalities. The acoustic nature of abnormal speech gives relevant information about the type of disorder in the speech production system. These signals are essentially non-stationary and may contain indicators of current disease, or even warnings about impending diseases. The indicators may be present at all times or may occur at random, during certain intervals of the day. However, studying and pinpointing abnormalities in voluminous data collected over several hours is strenuous and time consuming. Therefore, computer-based analytical tools for in-depth study and classification of data over daylong intervals can be very useful in diagnostics. This paper deals with the classification of certain speech disorders using an artificial neural network, followed by analysis of the classified signals. This analysis is carried out using continuous wavelet transform patterns. The results for various types of subjects are discussed in detail, and the classifier presented in this paper achieves a remarkable accuracy in the range of 80-85%. © 2005 Elsevier SAS. All rights reserved. | en_US |
dc.title | Classification and analysis of speech abnormalities | en_US |
dc.type | Article | en_US |
Appears in Collections: | 1. Journal Articles |
Files in This Item:
There are no files associated with this item.
Items in DSpace are protected by copyright, with all rights reserved, unless otherwise indicated.
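The abstract describes a pipeline in which continuous wavelet transform (CWT) patterns of speech signals are used as features for a neural-network classifier. The paper itself is not attached to this record, so the following is only a minimal, self-contained sketch of that general idea, not the authors' method: synthetic two-class "signals" (differing in dominant frequency, standing in for normal vs. abnormal speech), per-scale CWT energies computed with a hand-rolled Morlet-like wavelet, and a single logistic neuron trained by gradient descent. All function names, scales, and parameters here are illustrative assumptions.

```python
import numpy as np

rng = np.random.default_rng(0)

def morlet(scale, width=6.0):
    # Morlet-like wavelet sampled at integer points, dilated by `scale`.
    n = int(10 * scale)
    t = (np.arange(n) - n / 2) / scale
    return np.exp(1j * width * t) * np.exp(-t**2 / 2)

def cwt_energy(signal, scales):
    # Mean squared CWT magnitude at each scale -> one feature per scale.
    feats = []
    for s in scales:
        coef = np.convolve(signal, morlet(s), mode="same") / np.sqrt(s)
        feats.append(np.mean(np.abs(coef) ** 2))
    return np.array(feats)

def make_dataset(n_per_class=40, length=256):
    # Two synthetic classes: low- vs. high-frequency tone plus noise
    # (a toy stand-in for "normal" vs. "abnormal" recordings).
    X, y = [], []
    t = np.arange(length)
    for label, freq in [(0, 0.03), (1, 0.12)]:
        for _ in range(n_per_class):
            sig = np.sin(2 * np.pi * freq * t) + 0.5 * rng.standard_normal(length)
            X.append(cwt_energy(sig, scales=[2, 4, 8, 16, 32]))
            y.append(label)
    return np.array(X), np.array(y)

def train_logistic(X, y, epochs=500, lr=0.1):
    # Single logistic neuron (simplest possible "neural network").
    X = (X - X.mean(0)) / (X.std(0) + 1e-9)  # standardize features
    w, b = np.zeros(X.shape[1]), 0.0
    for _ in range(epochs):
        p = 1.0 / (1.0 + np.exp(-(X @ w + b)))
        g = p - y                      # gradient of cross-entropy loss
        w -= lr * X.T @ g / len(y)
        b -= lr * g.mean()
    return w, b, X

X, y = make_dataset()
w, b, Xn = train_logistic(X, y)
pred = (1.0 / (1.0 + np.exp(-(Xn @ w + b)))) > 0.5
accuracy = np.mean(pred == y)
```

On this easy synthetic task the energies at different scales separate the classes cleanly; real pathological-speech data would need carefully chosen scales, more features, and a proper train/test split before any accuracy figure is meaningful.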