Brain activity discriminates acoustic simulations of the same environment
Abstract
In complex acoustic environments, sound localization involves the integration of numerous interrelated auditory and cognitive cues, making it challenging to understand their relationship to brain activity. Here, we use virtual acoustics to probe the brain’s response to auditory distance cues in a realistic environment. We developed a system to record the acoustics of the actual MRI environment, simulated the same room with different degrees of accuracy, and then presented sounds at one of two locations in the room. We implemented a novel auditory fMRI sequence to record brain activity. Despite only minor acoustic differences between the auralizations, all three rooms could be decoded from brain activity. A systematic analysis revealed that the direct-to-reverberant energy ratio (DRR) drove the differences in brain activity between auralizations, centered on the posterior auditory cortex (AC). The results provide strong evidence that the posterior AC processes DRR for spatial auditory perception.
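For reference, the DRR is conventionally computed from the room impulse response h(t) as the ratio of direct-sound energy to reverberant-tail energy. The sketch below uses the standard textbook form; the direct-sound window boundary t_d is an assumption of this illustration and is not a value reported in the preprint.

$$\mathrm{DRR} = 10\,\log_{10}\!\left(\frac{\int_{0}^{t_d} h^{2}(t)\,dt}{\int_{t_d}^{\infty} h^{2}(t)\,dt}\right)\ \mathrm{dB}$$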
Impact statement
A novel fMRI sequence and recording technique are combined with virtual acoustics and multivariate analyses to decode room simulations from brain activity during distance perception and to identify the auditory factors that drive the pattern of activity in the brain.
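As an illustration of the class of multivariate analysis described above (a minimal sketch, not the authors' pipeline), one could train a cross-validated linear classifier on per-trial voxel patterns to predict which auralization was presented. The input files, array shapes, and the choice of a linear SVM are assumptions made for this example only.

```python
# Minimal sketch of decoding room auralizations from fMRI voxel patterns,
# assuming a (n_trials, n_voxels) feature matrix and one room label per trial.
import numpy as np
from sklearn.svm import LinearSVC
from sklearn.pipeline import make_pipeline
from sklearn.preprocessing import StandardScaler
from sklearn.model_selection import cross_val_score, StratifiedKFold

# Hypothetical inputs: trial-wise response estimates over auditory-cortex voxels
# and a label per trial giving the simulated room (0, 1, or 2).
X = np.load("voxel_patterns.npy")   # shape: (n_trials, n_voxels) -- assumed
y = np.load("room_labels.npy")      # shape: (n_trials,)          -- assumed

# Standardize each voxel, fit a linear classifier, and cross-validate.
# Chance level is 1/3 for three auralizations, so above-chance accuracy
# indicates that the rooms are decodable from the activity patterns.
clf = make_pipeline(StandardScaler(), LinearSVC(max_iter=10_000))
scores = cross_val_score(clf, X, y, cv=StratifiedKFold(n_splits=5))
print(f"Mean decoding accuracy: {scores.mean():.3f} (chance = {1/3:.3f})")
```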