Perceptual learning and sensorimotor learning with cochlear-implant simulated speech feedback

Abstract

Cochlear implants (CIs) provide deaf individuals with access to auditory feedback from their own voice during speech production. This experiment investigated whether participants with typical hearing can use CI-simulated speech feedback for perceptual learning and sensorimotor control of speech. CI simulation was achieved via noise vocoding, a technique that degrades the spectral detail in a speech signal in a manner similar to a CI. Thirty-two participants took part in the experiment. First, participants were tested on their recognition of noise-vocoded sentences before and after a training task: either perception training, in which participants listened to noise-vocoded sentences while reading matching text, or production training, in which participants read sentences aloud while hearing their own voice noise-vocoded in real time. Both groups then completed a speech motor adaptation paradigm in which formants were perturbed in their real-time noise-vocoded auditory feedback. Both perception and production training produced significant improvements in recognition of noise-vocoded sentences, with no effect of training type. Speech motor adaptation, however, was not significant at the group level in response to the formant perturbations. This suggests that successful perceptual learning for degraded speech is not sufficient for successful sensorimotor learning with degraded auditory feedback.
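The abstract does not report the vocoder settings used in the study; the sketch below (Python with NumPy/SciPy) only illustrates the general noise-vocoding technique described above, in which band-specific amplitude envelopes of speech are used to modulate noise carriers, discarding within-band spectral detail. The channel count, band edges, and filter order are illustrative assumptions, not the authors' parameters.

```python
import numpy as np
from scipy.signal import butter, sosfiltfilt, hilbert

def noise_vocode(speech, fs, n_channels=8, f_lo=100.0, f_hi=8000.0):
    """Channel (noise) vocode a speech signal.

    Splits the signal into log-spaced frequency bands, extracts each band's
    amplitude envelope, and uses the envelopes to modulate bandpass-filtered
    white noise. Spectral detail within each band is discarded, roughly as
    with a cochlear implant. Note: f_hi must be below fs / 2.
    """
    edges = np.logspace(np.log10(f_lo), np.log10(f_hi), n_channels + 1)
    noise = np.random.default_rng(0).standard_normal(len(speech))
    out = np.zeros_like(speech, dtype=float)

    for lo, hi in zip(edges[:-1], edges[1:]):
        sos = butter(4, [lo, hi], btype="bandpass", fs=fs, output="sos")
        band = sosfiltfilt(sos, speech)      # speech content in this band
        env = np.abs(hilbert(band))          # band amplitude envelope
        carrier = sosfiltfilt(sos, noise)    # noise carrier in the same band
        out += env * carrier                 # envelope-modulated noise band

    # Roughly match the overall level of the original signal
    out *= np.sqrt(np.mean(speech**2) / (np.mean(out**2) + 1e-12))
    return out

# Example: vocode one second of a synthetic amplitude-modulated tone at 44.1 kHz
fs = 44100
t = np.arange(fs) / fs
demo = np.sin(2 * np.pi * 220 * t) * (0.5 + 0.5 * np.sin(2 * np.pi * 3 * t))
vocoded = noise_vocode(demo, fs, n_channels=8)
```

Offline filtering is used here for clarity; the production training and formant-perturbation paradigm in the study required real-time processing of the participants' own voices, which this sketch does not attempt to reproduce.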
