Auditory motion perception emerges from successive sound localizations integrated over time.

with V. Roggerone, C. Tarlao and C. Guastavino.

    We provide a connection between sound localization and sound motion perception through the upper limit (UL) speed of sound direction discrimination. We propose a spectral cue that explains both front-back confusion rates and the variation of the UL speed with spectral content, when accounting for a minimal integration time in the auditory cortex.
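
    As a rough, back-of-envelope illustration of how a minimal integration time can set an upper limit (the symbols below, $\Delta t$ for the integration window and $\theta_{\max}$ for the largest arc over which the static cues remain informative, are illustrative and not taken from the paper): a source rotating at angular speed $\omega$ sweeps an arc $\Delta\theta = \omega\,\Delta t$ within one integrated snapshot, so the integrated cue stays usable only while this arc remains small, suggesting an upper limit of the order of

    $$
    \Delta\theta = \omega\,\Delta t \quad\Longrightarrow\quad \omega_{\mathrm{UL}} \sim \frac{\theta_{\max}}{\Delta t}.
    $$

    Under this toy relation, longer integration or less informative spectral cues (a smaller $\theta_{\max}$) would lower the UL, consistent with the dependence on spectral content described above.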

    1. Roggerone, V., Vacher, J., Tarlao, C. & Guastavino, C. Auditory motion perception emerges from successive sound localizations integrated over time. Scientific Reports 9, 16437 (2019).

    Abstract

    Humans rely on auditory information to estimate the path of moving sound sources. But unlike in vision, the existence of motion-sensitive mechanisms in audition is still open to debate. Psychophysical studies indicate that auditory motion perception emerges from successive localization, but existing models fail to predict experimental results. However, these models do not account for any temporal integration. We propose a new model tracking motion using successive localization snapshots but integrated over time. This model is derived from psychophysical experiments on the upper limit for circular auditory motion perception (UL), defined as the speed above which humans no longer identify the direction of sounds spinning around them. Our model predicts ULs measured with different stimuli using solely static localization cues. The temporal integration blurs these localization cues rendering them unreliable at high speeds, which results in the UL. Our findings indicate that auditory motion perception does not require motion-sensitive mechanisms.
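
    To make the mechanism in the abstract concrete, here is a minimal toy simulation (not the published model; the snapshot rate, integration window, noise level and decision rule are all assumptions chosen for illustration): noisy localization snapshots of a circularly moving source are averaged over a fixed window, and the direction of rotation is judged from the successive integrated estimates. As the speed increases, each window covers a larger arc, so the integrated estimates blur and wrap, and the judgement degrades (and can even reverse, wagon-wheel style, in this toy model), reproducing an upper-limit-like breakdown.

```python
import numpy as np

rng = np.random.default_rng(0)

FS = 500          # localization snapshot rate in Hz (assumed)
T_INT = 0.2       # integration window in s, the assumed "minimal integration time"
NOISE_DEG = 15.0  # std of per-snapshot localization noise in degrees (assumed)
DURATION = 2.0    # stimulus duration in s (assumed)

def wrap_deg(x):
    """Wrap angles to the interval (-180, 180] degrees."""
    return (x + 180.0) % 360.0 - 180.0

def p_correct(omega_deg_s, n_trials=300):
    """Proportion of trials on which the rotation direction is judged correctly."""
    n = int(DURATION * FS)
    win = int(T_INT * FS)
    t = np.arange(n) / FS
    hits = 0
    for _ in range(n_trials):
        true_sign = rng.choice([-1.0, 1.0])              # clockwise vs counter-clockwise
        azimuth = true_sign * omega_deg_s * t            # true source azimuth (deg)
        snaps = azimuth + rng.normal(0.0, NOISE_DEG, n)  # noisy localization snapshots
        # Temporal integration: circular mean of the snapshots falling in each
        # non-overlapping window (average the unit vectors, read back the angle).
        rad = np.deg2rad(snaps[: (n // win) * win]).reshape(-1, win)
        integrated = np.rad2deg(np.arctan2(np.sin(rad).mean(axis=1),
                                           np.cos(rad).mean(axis=1)))
        # Assumed decision rule: the judged direction is the sign of the mean
        # wrapped step between successive integrated estimates. At high speeds
        # the steps wrap and the blurred estimates become noisy, so this fails.
        steps = wrap_deg(np.diff(integrated))
        hits += int(np.sign(steps.mean()) == true_sign)
    return hits / n_trials

for speed in (90, 180, 360, 720, 1080, 1800):  # rotation speed in deg/s
    print(f"{speed:5d} deg/s -> proportion correct = {p_correct(speed):.2f}")
```

    Running the sketch shows near-perfect direction judgements at slow speeds and a collapse once the arc covered within one integration window becomes too large; the specific breakdown speed depends entirely on the assumed parameters above, not on values reported in the paper.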

