Visually-guided compensation of deafening-induced song deterioration
Abstract
Human language learning and maintenance depend primarily on auditory feedback but are also shaped by other sensory modalities. Individuals who become deaf after learning to speak (post-lingual deafness) experience a gradual decline in their language abilities. A similar process occurs in songbirds, where deafness leads to progressive song deterioration. However, songbirds can modify their songs using non-auditory cues, challenging the prevailing assumption that auditory feedback is essential for vocal control. In this study, we investigated whether deafened birds could use visual cues to prevent or limit song deterioration. We developed a new metric for assessing syllable deterioration, the spectrogram divergence score. We then trained deafened birds in a behavioral task in which the spectrogram divergence score of a target syllable was computed in real time and used to trigger a contingent visual stimulus. Birds exposed to the contingent visual stimulus, a brief light extinction, produced more stable song syllables than birds that received either no light extinction or randomly triggered light extinction. Notably, this effect was specific to the targeted syllable and did not influence other syllables. This study demonstrates that deafening-induced song deterioration in birds can be partially mitigated with visual cues.
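The abstract does not specify how the spectrogram divergence score is computed or how the contingency was implemented. The minimal Python sketch below is an illustration only: it assumes the score is a mean squared log-spectral difference between each rendition of the target syllable and a pre-deafening template, and that the light is extinguished when the score exceeds a threshold. All function names, the score definition, and the threshold rule are assumptions, not the authors' implementation.

    # Hypothetical sketch of a closed-loop contingency on a spectrogram-divergence
    # score. The score definition, names, and threshold rule are illustrative
    # assumptions, not the published method.
    import numpy as np
    from scipy.signal import spectrogram

    def log_spectrogram(audio, fs, nperseg=256, noverlap=192):
        """Log-magnitude spectrogram of one syllable rendition."""
        _, _, sxx = spectrogram(audio, fs=fs, nperseg=nperseg, noverlap=noverlap)
        return np.log(sxx + 1e-12)

    def divergence_score(rendition, template, fs):
        """Mean squared log-spectral difference between a rendition and a
        pre-deafening template, cropped to the shorter of the two."""
        s_r = log_spectrogram(rendition, fs)
        s_t = log_spectrogram(template, fs)
        n = min(s_r.shape[1], s_t.shape[1])
        return float(np.mean((s_r[:, :n] - s_t[:, :n]) ** 2))

    def maybe_trigger_light_off(rendition, template, fs, threshold, light_off):
        """Extinguish the light briefly when the target syllable drifts
        too far from its template; return the score for logging."""
        score = divergence_score(rendition, template, fs)
        if score > threshold:
            light_off()  # e.g. pulse a relay controlling the cage light
        return score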