Tracking Two Distributions at Once? Evidence from Bilingual Speech Perception
Abstract
Speech perception involves substantial acoustic variability, requiring listeners to adapt to the statistical properties of speech cues. While previous studies have shown that listeners adjust their categorization of speech sounds based on the variability of acoustic cues, most evidence for this effect comes from monolingual contexts. It remains unclear whether bilingual listeners can track distributional statistics across two languages simultaneously. The present study examined distributional learning in bilingual speech perception using a Visual World Paradigm with eye-tracking. Two experiments tested whether bilingual listeners adjust their perception of the /b/–/p/ contrast based on the variance of voice onset time (VOT) distributions. Spanish–Basque bilinguals (Experiment 1) and Spanish–English bilinguals (Experiment 2) were exposed to narrow or wide VOT distributions in each language, with either the same variance or mixed variances across languages. Distributional learning was assessed with two measures: the slope of categorization functions based on mouse-click responses and the proportion of fixations to referents of phonological competitors. Across experiments, exposure to wider VOT distributions generally produced shallower categorization slopes and increased competitor fixations, indicating sensitivity to distributional variance. However, the strength of the effect varied with the language pair and the perceptual measure. Overall, the findings demonstrate that distributional learning extends to bilingual contexts but is constrained by language structure, language experience, and how perception is indexed.
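The link between cue variance and categorization slope can be illustrated with a minimal ideal-observer sketch (this is a standard account of distributional learning, not the authors' model; the category means and variances below are illustrative, not the study's stimulus values). Under equal-variance Gaussian VOT distributions for /b/ and /p/ and equal priors, the posterior probability of /p/ is a logistic function of VOT whose slope at the category boundary decreases as the variance grows, which is the pattern of shallower categorization functions after wide-variance exposure:

```python
import math

def p_voiceless(vot, mu_b=0.0, mu_p=50.0, sigma=10.0):
    # Ideal-observer posterior for /p/ given a VOT value (ms), assuming
    # equal-variance Gaussian cue distributions and equal category priors.
    def pdf(x, mu, s):
        return math.exp(-((x - mu) ** 2) / (2 * s * s)) / (s * math.sqrt(2 * math.pi))
    like_p = pdf(vot, mu_p, sigma)
    like_b = pdf(vot, mu_b, sigma)
    return like_p / (like_p + like_b)

def slope_at_boundary(sigma, mu_b=0.0, mu_p=50.0, eps=0.01):
    # Numerical slope of the categorization function at the category boundary,
    # the midpoint between the two category means.
    mid = (mu_b + mu_p) / 2.0
    hi = p_voiceless(mid + eps, mu_b, mu_p, sigma)
    lo = p_voiceless(mid - eps, mu_b, mu_p, sigma)
    return (hi - lo) / (2.0 * eps)

narrow = slope_at_boundary(sigma=8.0)   # narrow VOT distribution
wide = slope_at_boundary(sigma=16.0)    # wide VOT distribution
print(narrow > wide)  # wider variance -> shallower categorization slope
```

Analytically, the slope at the boundary is (mu_p - mu_b) / (4 * sigma**2), so doubling the standard deviation quarters the slope, consistent with the abstract's prediction that wide-variance exposure yields shallower functions.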