Training Set Coherence and Set Size Effects on Concept Generalization and Recognition

Cited: 29
Authors
Bowman, Caitlin R. [1 ]
Zeithamova, Dagmar [1 ]
Affiliations
[1] Univ Oregon, Dept Psychol, Eugene, OR 97403 USA
Keywords
category learning; long-term memory; generalization; computational modeling; MULTIPLE MEMORY-SYSTEMS; EXEMPLAR-BASED ACCOUNTS; CATEGORICAL PERCEPTION; WORD SEGMENTATION; PROTOTYPE; ABSTRACTION; CATEGORIZATION; INFORMATION; STIMULUS; MODEL
DOI
10.1037/xlm0000824
Chinese Library Classification
B84 [Psychology]
Subject Classification Codes
04; 0402
Abstract
Building conceptual knowledge that generalizes to novel situations is a key function of human memory. Category-learning paradigms have long been used to understand the mechanisms of knowledge generalization. In the present study, we tested the conditions that promote formation of new concepts. Participants underwent 1 of 6 training conditions that differed in the number of examples per category (set size) and their relative similarity to the category average (set coherence). Performance metrics included rates of category learning, ability to generalize categories to new items of varying similarity to prototypes, and recognition memory for individual examples. In categorization, high set coherence led to faster learning and better generalization, while set size had little effect. Recognition did not differ reliably among conditions. We also tested the nature of memory representations used for categorization and recognition decisions using quantitative prototype and exemplar models fit to behavioral responses. Prototype models posit abstract category representations based on the category's central tendency, whereas exemplar models posit that categories are represented by individual category members. Prototype strategy use during categorization increased with increasing set coherence, suggesting that coherent training sets facilitate extraction of commonalities within a category. We conclude that learning from a coherent set of examples is an efficient means of forming abstract knowledge that generalizes broadly.
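To make the modeling comparison in the abstract concrete, the sketch below implements the standard formalization that prototype and exemplar models share: exponentially decaying similarity over feature space (Shepard-style) combined with a Luce choice rule. This is an illustrative reconstruction, not the authors' fitted code; the binary stimulus coding, the sensitivity parameter c, and equal attention weights are simplifying assumptions.

```python
import numpy as np

def similarity(x, y, c=1.0, w=None):
    """Exponentially decaying similarity between two feature vectors:
    s(x, y) = exp(-c * weighted city-block distance). Equal attention
    weights across features are assumed here for simplicity."""
    x, y = np.asarray(x, dtype=float), np.asarray(y, dtype=float)
    w = np.full(x.shape, 1.0 / x.size) if w is None else np.asarray(w, dtype=float)
    return float(np.exp(-c * np.sum(w * np.abs(x - y))))

def prototype_prob_a(stim, proto_a, proto_b, c=1.0):
    """Prototype model: compare the stimulus with each category's central
    tendency only, then map similarities to P(respond A) via the Luce rule."""
    s_a, s_b = similarity(stim, proto_a, c), similarity(stim, proto_b, c)
    return s_a / (s_a + s_b)

def exemplar_prob_a(stim, exemplars_a, exemplars_b, c=1.0):
    """Exemplar model: sum similarity to every stored training item of each
    category, then apply the same Luce choice rule."""
    s_a = sum(similarity(stim, e, c) for e in exemplars_a)
    s_b = sum(similarity(stim, e, c) for e in exemplars_b)
    return s_a / (s_a + s_b)

# Toy 4-feature binary stimuli (hypothetical, not the paper's stimulus set).
proto_a, proto_b = [0, 0, 0, 0], [1, 1, 1, 1]
train_a = [[0, 0, 0, 1], [0, 1, 0, 0]]  # each one feature away from prototype A
train_b = [[1, 1, 1, 0], [1, 0, 1, 1]]  # each one feature away from prototype B
new_item = [0, 0, 1, 0]                 # unseen generalization probe

print(prototype_prob_a(new_item, proto_a, proto_b))  # ~0.62: leans toward A
print(exemplar_prob_a(new_item, train_a, train_b))   # 0.50: exactly at chance
```

In actual model fitting, c (and, in fuller versions, per-feature attention weights) would be estimated per participant and the winning model identified by fit to behavioral responses; the toy probe above shows how the two accounts can diverge on novel items even when they agree on the training items.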
Pages: 1442-1464
Page count: 23
Related Papers (50 records in total)
  • [21] The training set and generalization in grammatical evolution for autonomous agent navigation
    Naredo, Enrique
    Urbano, Paulo
    Trujillo, Leonardo
    SOFT COMPUTING, 2017, 21 (15) : 4399 - 4416
  • [22] Stratified Learning for Reducing Training Set Size
    Hastings, Peter
    Hughes, Simon
    Blaum, Dylan
    Wallace, Patricia
    Britt, M. Anne
    INTELLIGENT TUTORING SYSTEMS, ITS 2016, 2016, 9684 : 341 - 346
  • [23] LEARNING COEFFICIENT DEPENDENCE ON TRAINING SET SIZE
    EATON, HAC
    OLIVIER, TL
    NEURAL NETWORKS, 1992, 5 (02) : 283 - 288
  • [24] On the size of training set and the benefit from ensemble
    Zhou, ZH
    Wei, D
    Li, G
    Dai, HH
    ADVANCES IN KNOWLEDGE DISCOVERY AND DATA MINING, PROCEEDINGS, 2004, 3056 : 298 - 307
  • [25] Training radial basis function neural networks: effects of training set size and imbalanced training sets
    Al-Haddad, L
    Morris, CW
    Boddy, L
    JOURNAL OF MICROBIOLOGICAL METHODS, 2000, 43 (01) : 33 - 44
  • [26] EFFECTS OF ORTHOGRAPHIC SET SIZE AND CONGRUENCY ON WORD-FRAGMENT COMPLETION AND RECOGNITION
    FLEXSER, AJ
    BULLETIN OF THE PSYCHONOMIC SOCIETY, 1987, 25 (05) : 327 - 327
  • [27] RESPONSE SET EFFECTS IN RECOGNITION MEMORY
CORBALLIS, MC
    ROLDAN, CE
    ZBRODOFF, J
    MEMORY & COGNITION, 1974, 2 (03) : 501 - 508
  • [28] A Generalization of Odd Set Inequalities for the Set Packing Problem
    Heismann, Olga
    Borndoerfer, Ralf
    OPERATIONS RESEARCH PROCEEDINGS 2013, 2014, : 193 - 199
  • [29] Multiple classifier systems in offline handwritten word recognition -: On the influence of training set and vocabulary size
    Günter, S
    Bunke, H
    INTERNATIONAL JOURNAL OF PATTERN RECOGNITION AND ARTIFICIAL INTELLIGENCE, 2004, 18 (07) : 1303 - 1320
  • [30] Face Recognition Improvement with Distortions of Images in Training Set
    Kussul, Ernst
    Baidyk, Tatiana
    Conde, Cristina
    Martin de Diego, Isaac
    Cabello, Enrique
2013 INTERNATIONAL JOINT CONFERENCE ON NEURAL NETWORKS (IJCNN), 2013