Relevance-based selectivity: The case of implicit learning

Baruch Eitam*, Roy Shoval, Arit Glicksohn, Asher Cohen, Yaacov Schul, Ran R. Hassin

*Corresponding author for this work

Research output: Contribution to journal › Article › peer-review

Abstract

Learning the structure of the environment (e.g., what usually follows what) enables animals to behave effectively and to prepare for future events. Unintentional learning can efficiently produce such knowledge, as has been demonstrated with the Artificial Grammar Learning (AGL) paradigm, among others. It has been argued that selective attention is a necessary and sufficient condition for visual implicit learning. Experiment 1 shows that spatial attention is not sufficient for implicit learning: learning does not occur if the stimuli instantiating the structure are task-irrelevant. Experiment 2 demonstrates that this holds even with an abundance of available attentional resources. Together, these results challenge the current view of the relations between attention, resources, and implicit learning.

Original language: English
Pages (from-to): 1508-1515
Number of pages: 8
Journal: Journal of Experimental Psychology: Human Perception and Performance
Volume: 39
Issue number: 6
DOIs:
State: Published - Dec 2013

Keywords

  • Artificial Grammar Learning
  • Implicit learning
  • Perceptual load
  • Selective attention
  • Spatial attention
  • Task relevance
