AstroPose: Astronaut pose estimation using a monocular camera during extravehicular activities

Cited by: 0
Authors
LIU ZiBin [1]
LI You [2]
WANG ChunHui [2]
LIU Liang [2]
GUAN BangLei [1]
SHANG Yang [1]
YU QiFeng [1]
Affiliations
[1] College of Aerospace Science and Engineering, National University of Defense Technology
[2] Key Laboratory of Human Factors Engineering, China Astronaut Research and Training Center
Keywords
DOI
None available
CLC Classification Numbers
TP391.41; V445.8 [Photographic instruments and equipment]
Subject Classification Numbers
080203; 082504
Abstract
With the completion of the Chinese space station, astronauts will carry out an increasing number of extravehicular activities (EVAs), which are regarded as among the most dangerous undertakings in human space exploration. To guarantee the safety of astronauts and the successful accomplishment of missions, it is vital to determine the pose of astronauts during EVAs. This article presents a monocular vision-based pose estimation method for astronauts during EVAs that makes full use of the available observation resources. First, the camera is calibrated using objects of known structure, such as the spacesuit backpack or the circular handrail outside the space station. Subsequently, pose estimation is performed using the feature points on the spacesuit. The proposed methods are validated in both synthetic and semi-physical simulation experiments, demonstrating the high precision of the camera calibration and pose estimation. To further evaluate the performance of the methods in real-world scenarios, we use image sequences of Shenzhou-13 astronauts during EVAs. The experiments confirm that camera calibration and pose estimation can be accomplished solely with the existing observation resources, without requiring additional complicated equipment. The recovered motion parameters lay the technological foundation for subsequent applications such as mechanical analysis, task planning, and ground training of astronauts.
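The calibrate-then-estimate pipeline summarized above can be illustrated with a short, self-contained sketch. This is not the authors' implementation: the spacesuit feature coordinates, the intrinsic matrix K, the ground-truth pose, and the zero-distortion assumption below are all placeholder assumptions, and OpenCV's generic cv2.solvePnP stands in for whatever solver the paper actually uses.

    # Minimal sketch of monocular pose estimation from known spacesuit
    # feature points, assuming the camera intrinsics were already obtained
    # in a prior calibration step. All numeric values are placeholders.
    import numpy as np
    import cv2

    # Hypothetical 3D feature points (metres) on the spacesuit backpack,
    # expressed in the suit's body frame.
    object_points = np.array([
        [0.00, 0.00, 0.00],
        [0.40, 0.00, 0.00],
        [0.40, 0.60, 0.00],
        [0.00, 0.60, 0.00],
        [0.20, 0.30, 0.15],
        [0.20, 0.10, 0.10],
    ], dtype=np.float64)

    # Intrinsics assumed known from the calibration step (placeholders).
    K = np.array([[1200.0, 0.0, 640.0],
                  [0.0, 1200.0, 512.0],
                  [0.0, 0.0, 1.0]])
    dist = np.zeros(5)  # negligible lens distortion assumed for the sketch

    # Simulate one frame: project the points with a known ground-truth
    # pose, then check that the PnP solve recovers it.
    rvec_true = np.array([[0.10], [-0.20], [0.05]])
    tvec_true = np.array([[0.30], [-0.10], [4.00]])  # ~4 m from the camera
    image_points, _ = cv2.projectPoints(object_points, rvec_true,
                                        tvec_true, K, dist)

    # Solve the perspective-n-point problem for the suit-to-camera pose.
    ok, rvec, tvec = cv2.solvePnP(object_points, image_points, K, dist,
                                  flags=cv2.SOLVEPNP_ITERATIVE)
    assert ok
    R, _ = cv2.Rodrigues(rvec)  # axis-angle -> 3x3 rotation matrix
    print("Recovered rotation:\n", R)
    print("Recovered translation (m):", tvec.ravel())

In the paper's real setting, the 2D points would come from features detected on the spacesuit in the EVA imagery rather than from projectPoints, and the intrinsics from the backpack- or handrail-based calibration step described in the abstract.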
Pages: 1933-1945 (13 pages)
Related papers
50 records in total
  • [1] AstroPose: Astronaut pose estimation using a monocular camera during extravehicular activities
    Liu, Zibin
    Li, You
    Wang, Chunhui
    Liu, Liang
    Guan, Banglei
    Shang, Yang
    Yu, Qifeng
    SCIENCE CHINA-TECHNOLOGICAL SCIENCES, 2024, 67(06): 1933-1945
  • [2] Spacecraft pose estimation using a monocular camera
    INTERNATIONAL ASTRONAUTICAL FEDERATION, IAF
  • [3] Estimation of Vehicle Pose with Monocular Camera
    Zubov, Ilya G.
    PROCEEDINGS OF THE 2019 IEEE CONFERENCE OF RUSSIAN YOUNG RESEARCHERS IN ELECTRICAL AND ELECTRONIC ENGINEERING (EICONRUS), 2019: 395-397
  • [4] Direct pose estimation with a monocular camera
    Burschka, Darius
    Mair, Elmar
    ROBOT VISION, PROCEEDINGS, 2008, 4931: 440-453
  • [5] SoftPOSIT Enhancements for Monocular Camera Spacecraft Pose Estimation
    Shi, Jian-Feng
    Ulrich, Steve
    2016 21ST INTERNATIONAL CONFERENCE ON METHODS AND MODELS IN AUTOMATION AND ROBOTICS (MMAR), 2016: 30-35
  • [6] An anthropomorphic hand exoskeleton to prevent astronaut hand fatigue during extravehicular activities
    Shields, BL
    Main, JA
    Peterson, SW
    Strauss, AM
    IEEE TRANSACTIONS ON SYSTEMS MAN AND CYBERNETICS PART A-SYSTEMS AND HUMANS, 1997, 27(05): 668-673
  • [7] 3D Head pose estimation and camera mouse implementation using a monocular video camera
    Nabati, Masoomeh
    Behrad, Alireza
    SIGNAL IMAGE AND VIDEO PROCESSING, 2015, 9(01): 39-44
  • [8] Unsupervised monocular visual odometry with decoupled camera pose estimation
    Lin, Lili
    Wang, Weisheng
    Luo, Wan
    Song, Lesheng
    Zhou, Wenhui
    DIGITAL SIGNAL PROCESSING, 2021, 114
  • [9] An Improved Orthogonal Iterative Algorithm for Monocular Camera Pose Estimation
    Shi, De-cai
    Dong, Xiu-cheng
    Zheng, Yu
    INTERNATIONAL CONFERENCE ON ARTIFICIAL INTELLIGENCE AND COMPUTER SCIENCE (AICS 2016), 2016: 233-242