Context Sensitivity across Multiple Time Scales with a Flexible Frequency Bandwidth

Tamar I. Regev*, Geffen Markusfeld, Leon Y. Deouell, Israel Nelken

*Corresponding author for this work

Research output: Contribution to journal › Article › peer-review


Abstract

Everyday auditory streams are complex, including spectro-temporal content that varies at multiple timescales. Using EEG, we investigated the sensitivity of human auditory cortex to the content of past stimulation in unattended sequences of equiprobable tones. In 3 experiments including 82 participants overall, we found that neural responses measured at different latencies after stimulus onset were sensitive to frequency intervals computed over distinct timescales. Importantly, early responses were sensitive to a longer history of stimulation than later responses. To account for these results, we tested a model consisting of neural populations with frequency-specific but broad tuning that undergo adaptation with exponential recovery. We found that the coexistence of neural populations with distinct recovery rates can explain our results. Furthermore, the adaptation bandwidth of these populations depended on spectral context: it was wider when the stimulation sequence had a wider frequency range. Our results provide electrophysiological evidence as well as a possible mechanistic explanation for dynamic and multiscale context-dependent auditory processing in the human cortex.
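The modeling idea in the abstract — frequency-tuned populations that adapt when driven and recover exponentially between tones, with different populations recovering at different rates — can be illustrated with a minimal sketch. All parameter values here (tuning width `sigma_oct`, adaptation `depth`, the inter-tone interval, and the recovery time constants) are illustrative assumptions, not values from the paper:

```python
import math

def simulate_adaptation(tone_freqs_oct, iti_s, taus_s, sigma_oct=0.5, depth=0.5):
    """Toy model of frequency-tuned populations with adaptation and
    exponential recovery. Returns one response sequence per time constant.
    All numeric parameters are illustrative assumptions."""
    centers = sorted(set(tone_freqs_oct))  # one population per distinct frequency
    responses = []
    for tau in taus_s:
        # Adaptation state per population: 1.0 = fully recovered sensitivity.
        state = {c: 1.0 for c in centers}
        resp_seq = []
        for f in tone_freqs_oct:
            # Exponential recovery toward 1 during the inter-tone interval.
            for c in centers:
                state[c] = 1.0 - (1.0 - state[c]) * math.exp(-iti_s / tau)
            # Response: Gaussian frequency tuning scaled by current sensitivity.
            r = sum(state[c] * math.exp(-((f - c) ** 2) / (2 * sigma_oct ** 2))
                    for c in centers)
            resp_seq.append(r)
            # Adapt each population in proportion to how strongly it was driven.
            for c in centers:
                drive = math.exp(-((f - c) ** 2) / (2 * sigma_oct ** 2))
                state[c] *= (1.0 - depth * drive)
        responses.append(resp_seq)
    return responses
```

With a repeated tone, a population governed by a short time constant recovers almost fully between presentations, while one with a long time constant stays adapted — capturing how populations with distinct recovery rates retain the stimulation history over distinct timescales. A wider `sigma_oct` makes adaptation spread to more distant frequencies, loosely mirroring the context-dependent adaptation bandwidth described above.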

Original language: American English
Pages (from-to): 158-175
Number of pages: 18
Journal: Cerebral Cortex
Volume: 32
Issue number: 1
DOIs
State: Published - 1 Jan 2022

Bibliographical note

Publisher Copyright:
© The Author(s) 2021. Published by Oxford University Press. All rights reserved. For permissions, please e-mail: [email protected].

Keywords

  • EEG
  • ERP
  • adaptation
  • computational modeling
  • human auditory cortex
