This paper investigates the viability of a screening application that identifies individuals with dysarthria within a larger population using sentence-level speech data. This task presents a number of challenges, particularly if we aim to identify the disorder in its earlier stages, before the more significant symptoms have begun to manifest. A principal challenge is achieving robustness to the large number of confounding variables, such as gender, age, accent, speaking style, and recording conditions. All of these variables affect an individual's speech in a manner unrelated to the disorder, and isolating the information relevant to the disorder from these confounds, given the limited amount of data available in this regime, presents a major engineering challenge. In this paper we focus on achieving robustness to different types and levels of noise by employing a feature selection algorithm that attempts to minimize a non-parametric upper bound on the error in the noisy condition. Solving this problem is crucial, as the clean recording conditions used in data collection are typically a poor reflection of the type of data that will be encountered upon deployment.
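The abstract does not specify the paper's exact algorithm, but the idea of selecting features by minimizing a non-parametric error bound on noisy data can be sketched as follows. This is a hypothetical illustration, not the authors' method: it uses the leave-one-out 1-nearest-neighbour error rate (twice which asymptotically upper-bounds the Bayes error, per Cover and Hart) as the non-parametric criterion, and a greedy forward search; the function names and the synthetic noise model are assumptions.

```python
import numpy as np

def loo_nn_error(X, y):
    """Leave-one-out 1-NN error rate: a simple non-parametric
    error estimate (twice this value asymptotically upper-bounds
    the Bayes error)."""
    # Pairwise squared Euclidean distances between all samples.
    d = ((X[:, None, :] - X[None, :, :]) ** 2).sum(-1)
    np.fill_diagonal(d, np.inf)  # a point may not be its own neighbour
    nn = d.argmin(axis=1)        # index of each sample's nearest neighbour
    return float((y[nn] != y).mean())

def greedy_select(X_noisy, y, k):
    """Greedily add the feature that most reduces the non-parametric
    error estimate computed on noise-corrupted data."""
    chosen, remaining = [], list(range(X_noisy.shape[1]))
    for _ in range(k):
        scores = [(loo_nn_error(X_noisy[:, chosen + [f]], y), f)
                  for f in remaining]
        _, best_f = min(scores)  # ties broken by feature index
        chosen.append(best_f)
        remaining.remove(best_f)
    return chosen
```

Scoring candidate subsets on noise-corrupted data, rather than on the clean recordings, is what targets robustness: features whose discriminative power survives the corruption are preferred.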

Original language: English (US)
Title of host publication: 2016 Digital Media Industry and Academic Forum, DMIAF 2016 - Proceedings
Publisher: Institute of Electrical and Electronics Engineers Inc.
Number of pages: 4
ISBN (Electronic): 9781509010004
State: Published - Sep 22 2016
Event: 2016 Digital Media Industry and Academic Forum, DMIAF 2016 - Santorini, Greece
Duration: Jul 4 2016 to Jul 6 2016

Publication series

Name: 2016 Digital Media Industry and Academic Forum, DMIAF 2016 - Proceedings

Other: 2016 Digital Media Industry and Academic Forum, DMIAF 2016

ASJC Scopus subject areas

  • Computer Science Applications
  • Media Technology


Research topics: 'Noise robust dysarthric speech classification using domain adaptation'.
