Calibration to tool use during visually-guided reaching

Cited by: 17
Authors
Day, Brian [1 ]
Ebrahimi, Elham [2 ]
Hartman, Leah S. [3 ]
Pagano, Christopher C. [3 ]
Babu, Sabarish V. [2 ]
Affiliations
[1] Butler Univ, Dept Psychol, 282 Jordan Hall, Indianapolis, IN 46208 USA
[2] Clemson Univ, Sch Comp, Clemson, SC 29631 USA
[3] Clemson Univ, Dept Psychol, Clemson, SC 29631 USA
Keywords
Calibration; Vision; Distance perception; Reaching; Tool use; Embodied action schema; Monocular distance perception; Inertia tensor; Size; Length; Representation; Perturbation; Necessity; Ability; Task; Hand
DOI
10.1016/j.actpsy.2017.09.014
Chinese Library Classification (CLC)
B84 [Psychology]
Subject classification
04; 0402
Abstract
Because humans frequently use tools that alter their capabilities, researchers studying human perception and performance must understand how the body schema is modified to accurately represent one's capabilities when tools are used. The present work tested the idea that calibration is responsible for modifying an embodied action schema during tool use. We investigated calibration in the context of manual activity in near space using a behavioral measure. Participants made blind reaches to various visual distances in pre- and post-test phases using a short tool that did not extend their reach. During an intervening calibration phase they received visual feedback about the accuracy of their reaches, with half of the participants reaching with a tool that extended their reach by 30 cm. Both groups calibrated in a manner appropriate to the type of tool they used during the calibration phase, and this calibration carried over to reaches made in the post-test. These results inform discussions of the proposed embodied action schema and have applications to virtual reality, specifically the development of self-avatars.
Pages: 27-39
Page count: 13