Cross-modal information integration in category learning

Title: Cross-modal information integration in category learning
Publication Type: Journal Article
Year of Publication: 2014
Authors: Smith, J. D., Johnston, J. J. R., Musgrave, R. D., Zakrzewski, A. C., Boomer, J., Church, B. A., & Ashby, F. G.
Journal: Attention, Perception, & Psychophysics
Volume: 76
Issue: 5
Pagination: 1473-1484
Date Published: 2014 Jul
ISSN: 1943-393X
Keywords: Auditory Perception, Cognition, Discrimination Learning, Humans, Mental Processes, Pitch Discrimination, Reaction Time, Visual Perception
Abstract:

An influential theoretical perspective describes an implicit category-learning system that associates regions of perceptual space with response outputs by integrating information preattentionally and predecisionally across multiple stimulus dimensions. In this study, we tested whether this kind of implicit, information-integration category learning is possible across stimulus dimensions lying in different sensory modalities. Humans learned categories composed of conjoint visual-auditory category exemplars comprising a visual component (rectangles varying in the density of contained lit pixels) and an auditory component (in Exp. 1, auditory sequences varying in duration; in Exp. 2, pure tones varying in pitch). The categories had either a one-dimensional, rule-based solution or a two-dimensional, information-integration solution. Humans could solve the information-integration category tasks by integrating information across two stimulus modalities. The results demonstrated an important cross-modal form of sensory integration in the service of category learning, and they advance the field's knowledge about the sensory organization of systems for categorization.

DOI: 10.3758/s13414-014-0659-6
Alternate Journal: Atten Percept Psychophys
PubMed ID: 24671743
PubMed Central ID: PMC4096072
Grant List: P01 HD060563 / HD / NICHD NIH HHS / United States
P01 NS044393 / NS / NINDS NIH HHS / United States
HD-060563 / HD / NICHD NIH HHS / United States