Simulated Visually-Guided Paw Placement During Quadruped Locomotion

Cited by: 1
Authors
Oliveira, Miguel [1 ]
Santos, Cristina P. [1 ]
Ferreira, Manuel [1 ]
Affiliations
[1] Univ Minho, Sch Engn, Dept Ind Elect, Guimaraes, Portugal
Keywords
MODEL
DOI
10.1109/IECON.2009.5415420
CLC classification
TM [Electrical engineering]; TN [Electronics and communication technology]
Discipline codes
0808; 0809
Abstract
Autonomous, adaptive locomotion over irregular terrain is an important topic in robotics research. In this article, we focus on the development of a quadruped locomotion controller able to generate locomotion while reaching visually acquired markers. The controller is modeled as discrete, sensory-driven corrections of a basic rhythmic motor pattern for locomotion, according to visual information and proprioceptive data, enabling the robot to reach the markers while only slightly perturbing the locomotion movement. This task involves closed-loop control, and we therefore focus in particular on the essential issue of modeling the interaction between the central nervous system and peripheral information in the locomotion context. This issue is crucial for autonomous and adaptive control and has received little attention so far. Trajectories are modulated online according to these feedback pathways, thus achieving paw placement. The modeling is based on the concept of dynamical systems, whose intrinsic robustness against perturbations allows for an easy integration of sensory-motor feedback and thus for closed-loop control. The system is demonstrated on a simulated quadruped robot that acquires the visual markers online and achieves paw placement while locomoting.
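The abstract's core idea, a stable rhythmic pattern generated by a dynamical system and shifted online by sensory feedback, can be illustrated with a minimal sketch. This is not the paper's actual controller: it uses a generic Hopf oscillator for the rhythmic motor pattern and a hypothetical first-order offset dynamic standing in for the visually driven correction, just to show how a discrete sensory target can modulate a limit cycle without destroying it.

```python
import numpy as np

def step(state, offset, target, dt=0.01, mu=1.0, omega=2 * np.pi, k=2.0):
    """One Euler step: Hopf limit-cycle oscillator plus a slow offset
    dynamic that relaxes toward a sensed target (illustrative only)."""
    x, y = state
    r2 = x * x + y * y
    dx = (mu - r2) * x - omega * y      # radial term stabilizes the cycle
    dy = (mu - r2) * y + omega * x      # omega sets the stepping frequency
    doff = k * (target - offset)        # correction drifts toward the target
    return (x + dt * dx, y + dt * dy), offset + dt * doff

state, offset = (1.0, 0.0), 0.0
target = 0.3                            # hypothetical visually sensed shift
for _ in range(2000):                   # integrate for 20 s of simulated time
    state, offset = step(state, offset, target)
paw = state[0] + offset                 # rhythmic pattern plus correction
```

After convergence the oscillation keeps its limit-cycle amplitude (about sqrt(mu)) while the whole trajectory is shifted by the offset toward the marker, mirroring the paper's claim that corrections only slightly perturb the locomotion movement.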
Pages: 2235 / +
Number of pages: 2
Related papers
50 records
  • [31] PLAYBOT - A visually-guided robot for physically disabled children
    Tsotsos, JK
    Verghese, G
    Dickinson, S
    Jenkin, M
    Jepson, A
    Milios, E
    Nuflo, F
    Stevenson, S
    Black, M
    Metaxas, D
    Culhane, S
    Ye, Y
    Mann, R
    IMAGE AND VISION COMPUTING, 1998, 16 (04) : 275 - 292
  • [32] Numeric comparison in a visually-guided manual reaching task
    Song, Joo-Hyun
    Nakayama, Ken
    COGNITION, 2008, 106 (02) : 994 - 1003
  • [33] Characterization of visually-guided behaviors by the nudibranch, Berghia stephanieae
    Quinlan, P. D.
    Cho, A. K.
    Katz, P. S.
    INTEGRATIVE AND COMPARATIVE BIOLOGY, 2021, 61 : E725 - E725
  • [34] Auditory contributions to visually-guided eye and hand movements
    Schroeger, Anna
    Kreyenmeier, Philipp
    Raab, Markus
    Canal-Bruland, Rouwen
    Spering, Miriam
    PERCEPTION, 2022, 51 : 78 - 78
  • [35] Neural model of visually-guided navigation in a cluttered world
    Mingolla, E.
    PERCEPTION, 2009, 38 : 2 - 2
  • [36] Kinematics and the neurophysiological study of visually-guided eye movements
    Goffart, Laurent
    MATHEMATICAL MODELLING IN MOTOR NEUROSCIENCE: STATE OF THE ART AND TRANSLATION TO THE CLINIC. GAZE ORIENTING MECHANISMS AND DISEASE, 2019, 249 : 375 - 384
  • [37] Natural landmark detection for visually-guided robot navigation
    Celaya, Enric
    Albarral, Jose-Luis
    Jimenez, Pablo
    Torras, Carme
    AI*IA 2007: ARTIFICIAL INTELLIGENCE AND HUMAN-ORIENTED COMPUTING, 2007, 4733 : 555 - 566
  • [38] Visually-guided behavior of homonymous hemianopes in a naturalistic task
    Martin, Tim
    Riley, Meghan E.
    Kelly, Kristin N.
    Hayhoe, Mary
    Huxlin, Krystel R.
    VISION RESEARCH, 2007, 47 (28) : 3434 - 3446
  • [39] Does Visually-Guided Placement of Contiguous Ablation Lesions Result in Reliable and Persistent Pulmonary Vein Isolation?
    Reddy, Vivek
    Neuzil, Petr
    Doshi, Shephal K.
    Dukkipati, Srinivas
    d'Avila, Andre
    Ahmed, Humer
    Henault, Kathryn
    CIRCULATION, 2009, 120 (18) : S706 - S706
  • [40] Visually-guided grasping while walking on a humanoid robot
    Mansard, Nicolas
    Stasse, Olivier
    Chaumette, Francois
    Yokoi, Kazuhito
    PROCEEDINGS OF THE 2007 IEEE INTERNATIONAL CONFERENCE ON ROBOTICS AND AUTOMATION, VOLS 1-10, 2007, : 3041 - +