Visual articulatory feedback for phonetic correction in second language learning

Document record

Date: 22 September 2010

Collection: Archives ouvertes (open archives)

Related subjects (EN): Talking

Cite this document

Pierre Badin et al., « Visual articulatory feedback for phonetic correction in second language learning », HALSHS : archive ouverte en Sciences de l’Homme et de la Société, ID : 10670/1.7u70bw



Abstract (EN)

Orofacial clones can display speech articulation in an augmented mode, i.e. they can display all major speech articulators, including those usually hidden such as the tongue or the velum. In addition, a number of studies suggest that the visual articulatory feedback provided by ElectroPalatoGraphy or ultrasound echography is useful for speech therapy. This paper describes the latest developments in acoustic-to-articulatory inversion, based on statistical models, to drive orofacial clones from speech sound. It suggests that this technology could provide more elaborate feedback than was previously available, and that it would be useful in the domain of Computer Aided Pronunciation Training.
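The core idea of acoustic-to-articulatory inversion, learning a statistical mapping from acoustic features to articulator positions from parallel recordings, can be sketched as follows. This is a hypothetical minimal illustration, not the authors' method: the paper uses richer statistical models, whereas here a simple linear least-squares regression on synthetic data stands in for the mapping, and all variable names and dimensions are assumptions.

```python
import numpy as np

# Minimal sketch of statistical acoustic-to-articulatory inversion:
# learn a mapping from acoustic feature vectors (e.g. cepstral coefficients)
# to articulator positions (e.g. tongue/lip coordinates) from parallel data.

rng = np.random.default_rng(0)
n_frames, n_acoustic, n_articulatory = 500, 12, 6

# Synthetic parallel corpus: acoustic frames and the (noisy) articulator
# positions recorded alongside them.
A_true = rng.normal(size=(n_acoustic, n_articulatory))
X_acoustic = rng.normal(size=(n_frames, n_acoustic))
Y_artic = X_acoustic @ A_true + 0.01 * rng.normal(size=(n_frames, n_articulatory))

# Fit the inversion map by least squares, with a bias column.
X_aug = np.hstack([X_acoustic, np.ones((n_frames, 1))])
W, *_ = np.linalg.lstsq(X_aug, Y_artic, rcond=None)

def invert(acoustic_frame):
    """Predict articulator positions from a single acoustic frame."""
    return np.append(acoustic_frame, 1.0) @ W
```

The predicted articulatory trajectory, `invert` applied frame by frame, is what would then drive the orofacial clone's tongue, lip, and velum displays to give the learner visual feedback.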

