Speech and Acoustic Sound Analysis for Cochlear Implants/Hearing Aids System: Advancements With Machine Learning Environmental Sound Perception and Safety Assessment

Date

2022-08-01

Abstract

Cochlear Implant (CI) and Hearing Aid (HA) Research Platforms (RPs) are commonly used by the research community to propose new algorithms, conduct scientific field tests, and explore hearing/perception-based rehabilitation protocols. RPs are generally assumed to be safe for any customization or new algorithm development, and researchers typically perform baseline tests to verify functionality and address any safety concerns. In this thesis, two related research goals are addressed: (i) a CI research platform Burn-In safety protocol, and (ii) environmental acoustic sound classification/assessment for smart space CI systems. For the first task, a two-phase analysis and safety evaluation protocol that can be systematically applied to assess any RP is proposed, consisting of (i) an acoustic phase and (ii) an electric stimulation parameter phase. In the acoustic phase, the output electric/acoustic stimulation of the RP under diverse acoustic conditions is assessed for safety compliance and performance. In the stimulation parameter phase, the reliability of an RP in generating electrical stimuli that comply with established safety limits is explored. The proposed "Burn-In" evaluation protocol can be applied to any RP; in this work, the Costakis Cochlear Implant Mobile (CCi-MOBILE) RP is used for assessment. Additionally, guidelines are provided for handling experimental variability such as custom algorithms and stimulation techniques, along with best practices for subsampling the acoustic and parameter test spaces. Next, the second thesis goal, acoustic sound space analysis and knowledge characterization, is addressed. In general, non-linguistic sound (NLS) perception plays an important role in engaging listener response to various safety scenarios and in enabling user autonomy and environmental awareness.
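The parameter-phase idea can be illustrated with a minimal sketch. This is not the thesis's Burn-In protocol: the limit value, function names, and the charge-per-phase criterion below are assumptions chosen only to show what "compliance with established safety limits" could look like for a single biphasic pulse.

```python
# Illustrative sketch only -- the safety limit and parameter names here are
# hypothetical, not values from the Burn-In protocol or the CCi-MOBILE RP.

MAX_CHARGE_NC = 100.0  # hypothetical charge-per-phase safety limit (nC)

def charge_per_phase_nc(current_ua: float, phase_width_us: float) -> float:
    """Charge per phase in nC: current (uA) * phase width (us) / 1000."""
    return current_ua * phase_width_us / 1000.0

def is_pulse_safe(current_ua: float, phase_width_us: float,
                  limit_nc: float = MAX_CHARGE_NC) -> bool:
    """Flag pulses whose charge per phase exceeds the configured limit."""
    return charge_per_phase_nc(current_ua, phase_width_us) <= limit_nc

# Example: 500 uA at 25 us/phase -> 12.5 nC, within the hypothetical limit;
# 2000 uA at 100 us/phase -> 200 nC, exceeding it.
print(is_pulse_safe(500.0, 25.0))    # True
print(is_pulse_safe(2000.0, 100.0))  # False
```

In an actual evaluation, such a check would be applied across a subsampled grid of the full stimulation parameter space rather than to individual hand-picked pulses.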
However, most CI research efforts are speech driven, and improved perception of environmental sounds is largely assumed, with some studies suggesting no benefit in NLS perception following implantation. In this work, Convolutional Neural Network (CNN) based NLS classification models are used to compare classification performance on CI-simulated and normal hearing (NH) conditions with the performance of NH and CI human listeners. Furthermore, a novel NLS enhancement algorithm is proposed to improve NLS perception among CI listeners. The NLS enhancement algorithm focuses on estimating filter gains that optimally preserve the perceptually important spectro-temporal characteristics in CI processing. The proposed NLS enhancement algorithm is evaluated for improvement in identification using the NLS classification models. A competing NLS source model is also developed using a mixture of two sound sources: (i) a 'target' (of safety/threat interest) and (ii) an 'interference'. An end-to-end (E2E) deep source separation algorithm and the proposed NLS enhancement approaches are applied to assess improvement in 'target' perception. NLS perceptual characteristics are comparatively evaluated in baseline-mixed, source separated, and source separated + NLS enhanced modes among CI and NH listeners. Listener responses based on interference, audio quality, and distortion measures are recorded using a subjective scale and a forced-choice pairwise preference based on a comparative assessment of the source separation and source separation + NLS enhancement techniques. Taken collectively, the contributions from the CI research platform Burn-In safety assessment protocol, the first of its kind in the field, together with the machine learning/CNN-based non-linguistic sound source analysis, enhancement, and classification, advance the development of next-generation CI systems.
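The notion of estimating per-channel filter gains can be sketched generically. The code below is a stand-in, not the thesis's NLS enhancement algorithm: it uses a standard Wiener-style gain with a hypothetical gain floor, and all function names and parameters are assumptions for illustration.

```python
import math

# Generic per-channel gain sketch -- NOT the thesis's NLS enhancement
# algorithm. Wiener-style rule G = SNR / (1 + SNR), with a hypothetical
# gain floor to limit spectro-temporal distortion in low-SNR channels.

def wiener_gains(signal_power, noise_power, floor=0.1):
    """Estimate one gain per channel from per-channel power estimates."""
    gains = []
    for s, n in zip(signal_power, noise_power):
        snr = s / n if n > 0 else float("inf")
        g = snr / (1.0 + snr) if math.isfinite(snr) else 1.0
        gains.append(max(g, floor))  # floor preserves weak channel cues
    return gains

def apply_gains(envelopes, gains):
    """Scale each channel's envelope samples by that channel's gain."""
    return [[g * x for x in env] for env, g in zip(envelopes, gains)]

# A high-SNR channel keeps most of its energy; a zero-SNR channel is
# attenuated down to the floor rather than zeroed out.
print(wiener_gains([9.0, 0.0], [1.0, 1.0]))
```

Applying such gains to channel envelopes before CI electrodogram generation is one plausible place an NLS enhancement stage could sit in a vocoder-style processing chain.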

Keywords

Engineering, Electronics and Electrical, Computer Science, Artificial Intelligence
