Conceptual complexity and the bias/variance tradeoff

Cited by: 95
Authors
Briscoe, Erica [2 ]
Feldman, Jacob [1 ]
Affiliations
[1] Rutgers State Univ, Dept Psychol, Ctr Cognit Sci, Piscataway, NJ 08854 USA
[2] Georgia Tech Res Inst, Aerosp Transportat & Adv Syst Lab, Atlanta, GA USA
Keywords
Concept learning; Complexity; Bias/variance; SIMPLICITY PRINCIPLE; MODEL; EXEMPLAR; CLASSIFICATION; SIMILARITY; CATEGORIZATION; PROTOTYPES; DISTANCE; STIMULUS; RULES;
DOI
10.1016/j.cognition.2010.10.004
Chinese Library Classification
B84 [Psychology]
Discipline codes
04; 0402
Abstract
In this paper we propose that the conventional dichotomy between exemplar-based and prototype-based models of concept learning is helpfully viewed as an instance of what is known in the statistical learning literature as the bias/variance tradeoff. The bias/variance tradeoff can be thought of as a sliding scale that modulates how closely any learning procedure adheres to its training data. At one end of the scale (high variance), models can entertain very complex hypotheses, allowing them to fit a wide variety of data very closely but as a result can generalize poorly, a phenomenon called overfitting. At the other end of the scale (high bias), models make relatively simple and inflexible assumptions, and as a result may fit the data poorly, called underfitting. Exemplar and prototype models of category formation are at opposite ends of this scale: prototype models are highly biased, in that they assume a simple, standard conceptual form (the prototype), while exemplar models have very little bias but high variance, allowing them to fit virtually any combination of training data. We investigated human learners' position on this spectrum by confronting them with category structures at variable levels of intrinsic complexity, ranging from simple prototype-like categories to much more complex multimodal ones. The results show that human learners adopt an intermediate point on the bias/variance continuum, inconsistent with either of the poles occupied by most conventional approaches. We present a simple model that adjusts (regularizes) the complexity of its hypotheses in order to suit the training data, which fits the experimental data better than representative exemplar and prototype models. (C) 2010 Elsevier B.V. All rights reserved.
Pages: 2-16 (15 pages)
Related Articles (showing 10 of 50)
  • [1] The Bias-Variance Tradeoff in Cognitive Science
    Doroudi, Shayan
    Rastegar, Seyed Ali
    COGNITIVE SCIENCE, 2023, 47 (01)
  • [2] The bias-variance tradeoff and the randomized GACV
    Wahba, G
    Lin, XW
    Gao, FY
    Xiang, D
    Klein, R
    Klein, B
    ADVANCES IN NEURAL INFORMATION PROCESSING SYSTEMS 11, 1999, 11 : 620 - 626
  • [3] On the Bias-Variance-Cost Tradeoff of Stochastic Optimization
    Hu, Yifan
    Chen, Xin
    He, Niao
    ADVANCES IN NEURAL INFORMATION PROCESSING SYSTEMS 34 (NEURIPS 2021), 2021, 34
  • [4] Bias-Variance Tradeoff of Graph Laplacian Regularizer
    Chen, Pin-Yu
    Liu, Sijia
    IEEE SIGNAL PROCESSING LETTERS, 2017, 24 (08) : 1118 - 1122
  • [5] Performance analysis of the adaptive algorithm for bias-to-variance tradeoff
    Stankovic, L
    IEEE TRANSACTIONS ON SIGNAL PROCESSING, 2004, 52 (05) : 1228 - 1234
  • [6] Asset Pricing Model Uncertainty: A Tradeoff between Bias and Variance
    Zhou, Qing
    INTERNATIONAL REVIEW OF FINANCE, 2017, 17 (02) : 289 - 324
  • [7] Genetic programming, ensemble methods and the bias/variance tradeoff - Introductory investigations
    Keijzer, M
    Babovic, V
    GENETIC PROGRAMMING, PROCEEDINGS, 2000, 1802 : 76 - 90
  • [8] Prefrontal solution to the bias-variance tradeoff during reinforcement learning
    Kim, Dongjae
    Jeong, Jaeseung
    Lee, Sang Wan
    CELL REPORTS, 2021, 37 (13):
  • [9] Bandit Smooth Convex Optimization: Improving the Bias-Variance Tradeoff
    Dekel, Ofer
    Eldan, Ronen
    Koren, Tomer
    ADVANCES IN NEURAL INFORMATION PROCESSING SYSTEMS 28 (NIPS 2015), 2015, 28
  • [10] High-Dimensional Data and the Bias Variance Tradeoff in Model Selection
    Menna, Eligo Workineh
    INTERNATIONAL JOURNAL OF APPLIED MATHEMATICS & STATISTICS, 2024, 63 : 34 - 56