Visual Statistical Learning Based on Time Information

Cited by: 1
Authors
Otsuka, Sachio [1 ,2 ]
Affiliations
[1] Doshisha Univ, Fac Culture & Informat Sci, Kyoto, Japan
[2] Kyoto Univ, Grad Sch Human & Environm Studies, Kyoto, Japan
Keywords
object recognition; perceived control of time; time management; time perception; visual statistical learning; TEMPORAL STRUCTURE; IMPLICIT; ATTENTION; DURATION; MANAGEMENT; CHILDREN; CONTEXT; REGULARITIES; COMPRESSION; EXPECTANCY;
DOI
10.1037/xge0001276
Chinese Library Classification
B84 [Psychology]
Subject Classification Code
04; 0402
Abstract
People can extract and learn statistical regularities from various aspects of everyday life. The current study examined whether people have a mechanism to learn regularity based on time information and investigated whether sensitivity to time information is modulated by individual time management. In the familiarization phase, participants were required to observe a visual sequence of objects. Although the objects were presented in a random order, the durations for which the objects were presented were organized into successive triplets (e.g., 850-1,000-700 ms). In the subsequent test phase, two three-object sequences were presented. One sequence was a timing triplet that had temporal regularities. The other was a foil created from three different triplets. Participants were required to judge which sequence was more familiar based on the familiarization phase. The results showed that the triplets were successfully discriminated from the foils. These results were also observed for blank intervals. The current findings also revealed that although visual statistical learning was expressed when participants observed shapes tied to the corresponding durations during familiarization, this object-specific learning overshadowed the generic timing regularities, which participants failed to express when the durations were untied from the objects. Furthermore, participants with high scores on the Time Management Scale showed a greater extent of visual statistical learning of object durations than those with low scores. These results suggest that people extract and learn regularities based on time information and that statistical learning based on time information is correlated with individual time management.
Pages: 363-388 (26 pages)
Related Papers (50 total)
  • [21] The infant motor system predicts actions based on visual statistical learning
    Monroy, Claire D.
    Meyer, Marlene
    Schroer, Lisanne
    Gerson, Sarah A.
    Hunnius, Sabine
    NEUROIMAGE, 2019, 185 : 947 - 954
  • [22] Visual tracking based on transfer learning of deep salience information
    Zuo, Haorui
    Xu, Zhiyong
    Zhang, Jianlin
    Jia, Ge
    OPTO-ELECTRONIC ADVANCES, 2020, 3 (09) : 33 - 43
  • [23] Information Theoretic Learning for Pixel-Based Visual Agents
    Gori, Marco
    Melacci, Stefano
    Lippi, Marco
    Maggini, Marco
    COMPUTER VISION - ECCV 2012, PT VI, 2012, 7577 : 864 - 875
  • [24] Visual statistical learning requires attention
    Duncan, Dock H.
    van Moorselaar, Dirk
    Theeuwes, Jan
    PSYCHONOMIC BULLETIN & REVIEW, 2024
  • [25] Effects of tDCS on visual statistical learning
    Nydam, Abbey S.
    Sewell, David K.
    Dux, Paul E.
    NEUROPSYCHOLOGIA, 2020, 148
  • [27] How Implicit Is Visual Statistical Learning?
    Bertels, Julie
    Franco, Ana
    Destrebecqz, Arnaud
    JOURNAL OF EXPERIMENTAL PSYCHOLOGY-LEARNING MEMORY AND COGNITION, 2012, 38 (05) : 1425 - 1431
  • [28] Reward impacts visual statistical learning
    Park, Su Hyoun
    Rogers, Leeland L.
    Johnson, Matthew R.
    Vickery, Timothy J.
    COGNITIVE AFFECTIVE & BEHAVIORAL NEUROSCIENCE, 2021, 21 (06) : 1176 - 1195
  • [29] Visual statistical learning in the newborn infant
    Bulf, Hermann
    Johnson, Scott P.
    Valenza, Eloisa
    COGNITION, 2011, 121 (01) : 127 - 132