Abstract
The study of spatial processing in the auditory system usually requires complex experimental setups, such as arrays of speakers or speakers mounted on moving arms. These devices allow precise control over the spatial attributes of sound, but they are complex, expensive, and inflexible. Alternative approaches rely on virtual space sound delivery. In this paper, we describe a virtual space algorithm that enables accurate reconstruction of eardrum waveforms for arbitrary sound sources moving along arbitrary trajectories in space. A physical validation of the synthesis algorithm is performed by comparing waveforms recorded during real motion with waveforms synthesized by the algorithm. As a demonstration of possible applications of the algorithm, virtual motion stimuli are used to reproduce psychophysical results in humans and to study responses of barn owls to auditory motion stimuli.
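The kind of virtual-space synthesis the abstract describes is commonly built on block convolution with position-dependent head-related impulse responses (HRIRs), crossfading between positions as the source moves. The sketch below is a minimal illustration of that general technique under assumed inputs (a list of left/right HRIR pairs sampled along the trajectory, and a fixed hop between positions); it is not the authors' published algorithm, and the function and parameter names are hypothetical.

```python
import numpy as np

def synthesize_moving_source(signal, hrirs, hop):
    """Hypothetical sketch of virtual auditory motion synthesis.

    `signal` is the monaural source waveform; `hrirs` is a list of
    (hrir_left, hrir_right) pairs, one pair per position along the
    motion trajectory; `hop` is the number of samples between
    consecutive positions. Each overlapping signal block is windowed,
    convolved with the HRIR pair for its position, and overlap-added,
    so the binaural output crossfades smoothly between positions.
    """
    taps = len(hrirs[0][0])
    out_len = len(signal) + taps - 1
    left = np.zeros(out_len)
    right = np.zeros(out_len)
    window = np.hanning(2 * hop)  # 50%-overlap crossfade between positions
    for i, (hl, hr) in enumerate(hrirs):
        start = i * hop
        block = signal[start:start + 2 * hop]
        if block.size == 0:
            break
        block = block * window[:block.size]
        span = slice(start, start + block.size + taps - 1)
        left[span] += np.convolve(block, hl)
        right[span] += np.convolve(block, hr)
    return left, right
```

With delta-like HRIRs this reduces to a windowed copy of the input, which makes the crossfade logic easy to check; real HRIRs measured in the ear canal would instead impose the position-dependent interaural time and level differences the listener perceives.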
Original language | English |
---|---|
Pages (from-to) | 29-38 |
Number of pages | 10 |
Journal | Journal of Neuroscience Methods |
Volume | 106 |
Issue number | 1 |
DOIs | |
State | Published - 30 Mar 2001 |
Bibliographical note
Funding Information: The authors thank Professor Hermann Wagner and Nachum Ulanovsky for comments on the manuscript, and Yehoshua Yehuda for help with programming the motor. This work was supported by a grant from the German–Israeli Foundation (GIF).
Keywords
- Auditory motion
- Barn owl
- HRTF
- Physiology
- Psychophysics
- Sound localization
- Sound synthesis
- Virtual space