Authors
Nicholas B Turk-Browne, Justin A Jungé, Brian J Scholl
Publication date
2005/11
Journal
Journal of Experimental Psychology: General
Volume
134
Issue
4
Pages
552
Publisher
American Psychological Association
Description
The visual environment contains massive amounts of information involving the relations between objects in space and time, and recent studies of visual statistical learning (VSL) have suggested that this information can be automatically extracted by the visual system. The experiments reported in this article explore the automaticity of VSL in several ways, using both explicit familiarity and implicit response-time measures. The results demonstrate that (a) the input to VSL is gated by selective attention, (b) VSL is nevertheless an implicit process because it operates during a cover task and without awareness of the underlying statistical patterns, and (c) VSL constructs abstracted representations that are then invariant to changes in extraneous surface features. These results fuel the conclusion that VSL both is and is not automatic: It requires attention to select the relevant population of stimuli, but the resulting learning …
Total citations
2006: 6, 2007: 13, 2008: 22, 2009: 29, 2010: 24, 2011: 30, 2012: 36, 2013: 33, 2014: 46, 2015: 51, 2016: 64, 2017: 86, 2018: 64, 2019: 77, 2020: 80, 2021: 62, 2022: 89, 2023: 83, 2024: 50
Scholar articles
NB Turk-Browne, JA Jungé, BJ Scholl - Journal of Experimental Psychology: General, 2005