Visual Feedback of Tongue Movement for Novel Speech Sound Learning

Publisher

Frontiers Media S. A.

Abstract

Pronunciation training studies have yielded important insights into the processing of audiovisual (AV) information. Second language (L2) learners rely more heavily on bottom-up, multimodal input for speech perception than monolingual individuals do. However, little is known about the role of viewing one's own speech articulation during speech training. The current study investigated whether real-time visual feedback for tongue movement can improve a speaker's learning of non-native speech sounds. An interactive 3D tongue visualization system based on electromagnetic articulography (EMA) was used in a speech training experiment. Native speakers of American English produced a novel speech sound (/ɖ/, a voiced, coronal, palatal stop) before, during, and after trials in which they viewed their own speech movements using the 3D model. Talkers' productions were evaluated with kinematic (tongue-tip spatial positioning) and acoustic (burst spectra) measures. The results indicated a rapid gain in accuracy associated with visual feedback training. The findings are discussed with respect to neural models of multimodal speech processing.
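
For readers who want a concrete sense of the two evaluation measures named in the abstract, the following minimal Python sketch computes a tongue-tip positional error from EMA-style sensor coordinates and a burst spectrum from a windowed stop-release segment. The function names, window length, target coordinates, and synthetic data are illustrative assumptions, not the study's actual analysis pipeline.

    import numpy as np

    def tongue_tip_error(positions_mm, target_mm):
        """Euclidean distance (mm) of each EMA tongue-tip sample from a
        target position; a hypothetical stand-in for the kinematic measure."""
        positions_mm = np.atleast_2d(positions_mm)
        return np.linalg.norm(positions_mm - target_mm, axis=1)

    def burst_spectrum(signal, fs, burst_onset_s, win_ms=20.0):
        """Magnitude spectrum (dB) of a short Hamming-windowed segment at the
        stop-release burst; a hypothetical stand-in for the acoustic measure."""
        start = int(burst_onset_s * fs)
        n = int(win_ms / 1000.0 * fs)
        segment = signal[start:start + n] * np.hamming(n)
        freqs = np.fft.rfftfreq(n, d=1.0 / fs)
        mag_db = 20.0 * np.log10(np.abs(np.fft.rfft(segment)) + 1e-12)
        return freqs, mag_db

    if __name__ == "__main__":
        rng = np.random.default_rng(0)
        # Fake EMA tongue-tip trajectory (x, y, z in mm) and a palatal target.
        traj = rng.normal(loc=[10.0, 55.0, 5.0], scale=1.5, size=(100, 3))
        target = np.array([12.0, 57.0, 5.0])
        print("mean tongue-tip error (mm):", tongue_tip_error(traj, target).mean())

        # Fake audio with a brief noise burst at 0.5 s.
        fs = 16000
        audio = np.zeros(fs)
        audio[int(0.5 * fs):int(0.5 * fs) + 320] = rng.normal(size=320)
        freqs, mag_db = burst_spectrum(audio, fs, burst_onset_s=0.5)
        print("spectral peak (Hz):", freqs[np.argmax(mag_db)])

In practice, pre-to-post training changes in such error distances and burst spectral shapes would be compared across trials; the synthetic data above exist only to make the sketch runnable.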

Keywords

Articulation disorders, Language and languages—Study and teaching—Audio-visual aids, Electromagnetic articulography, Second language acquisition, Speech

Sponsorship

Partially supported by NIH-SBIR 1 R43 DC013467

Rights

CC BY 4.0 (Attribution), ©2015 The Authors
