Autonomous Imaging and Mapping of Small Bodies Using Deep Reinforcement Learning

Cited by: 20
Authors
Chan, David M. [1 ]
Agha-mohammadi, Ali-akbar [2 ]
Affiliations
[1] Univ Calif Berkeley, 253 Cory Hall, Berkeley, CA 94720 USA
[2] CALTECH, Jet Prop Lab, 4800 Oak Grove Dr, Pasadena, CA 91109 USA
Funding
NASA (National Aeronautics and Space Administration)
Keywords
MARKOV-PROCESSES;
DOI
10.1109/aero.2019.8742147
Chinese Library Classification
V [Aeronautics, Astronautics]
Discipline codes
08; 0825
Abstract
Mapping and navigation around small, unknown bodies remains an exciting and challenging problem in space exploration. Traditionally, spacecraft trajectories for mapping missions are designed by human experts, who spend hundreds of hours supervising the navigation and orbit-selection process. While this methodology has performed adequately for previous missions (such as Rosetta, Hayabusa and Deep Space), as the demand for mapping missions grows, additional autonomy in the mapping and navigation process will become necessary. In this work we provide a framework that formulates the autonomous imaging and mapping problem as a Partially Observable Markov Decision Process (POMDP). In addition, we introduce a new simulation environment that models the orbital mapping of small bodies, and we demonstrate that policies trained with our POMDP formulation maximize map quality while autonomously selecting orbits and supervising imaging tasks. We conclude with a discussion of integrating Deep Reinforcement Learning modules with classical flight software systems, and of the challenges that could be encountered when using Deep RL in flight-ready systems.
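
The abstract describes formulating orbital imaging and mapping as a POMDP and training policies in a purpose-built simulator. As a rough illustration of what such an environment interface could look like, the Python sketch below models a toy version: a hidden per-facet coverage map, orbit-selection and imaging actions, and a noisy coverage observation. The class name MappingPOMDP, the action/observation layout, and every numeric parameter are assumptions made for this sketch; they are not taken from the paper or its simulator.

# Hypothetical sketch of a POMDP-style environment for autonomous
# small-body mapping. All names and parameters are illustrative
# assumptions, not the authors' code.
import numpy as np


class MappingPOMDP:
    """Toy partially observable environment for orbital mapping.

    State  : current orbit index + per-facet surface coverage (hidden).
    Action : 0..n_orbits-1 selects an orbit; n_orbits triggers an imaging pass.
    Obs    : noisy estimate of total coverage plus the current orbit (one-hot).
    Reward : increase in true map coverage minus a small cost per orbit change.
    """

    def __init__(self, n_orbits=4, n_facets=64, obs_noise=0.05, seed=0):
        self.rng = np.random.default_rng(seed)
        self.n_orbits, self.n_facets, self.obs_noise = n_orbits, n_facets, obs_noise
        # Each orbit can only ever image a fixed subset of surface facets.
        self.visibility = self.rng.random((n_orbits, n_facets)) < 0.4
        self.reset()

    def reset(self):
        self.orbit = 0
        self.coverage = np.zeros(self.n_facets)          # hidden true map state
        return self._observe()

    def _observe(self):
        noisy = self.coverage.mean() + self.rng.normal(0.0, self.obs_noise)
        orbit_onehot = np.eye(self.n_orbits)[self.orbit]
        return np.concatenate(([np.clip(noisy, 0.0, 1.0)], orbit_onehot))

    def step(self, action):
        before = self.coverage.mean()
        if action < self.n_orbits:                       # orbit-change maneuver
            cost = 0.02 if action != self.orbit else 0.0
            self.orbit = action
        else:                                            # imaging pass
            cost = 0.0
            visible = self.visibility[self.orbit]
            self.coverage[visible] = np.minimum(1.0, self.coverage[visible] + 0.25)
        reward = (self.coverage.mean() - before) - cost
        done = self.coverage.mean() > 0.95
        return self._observe(), reward, done


if __name__ == "__main__":
    env = MappingPOMDP()
    obs, total = env.reset(), 0.0
    for _ in range(200):                                 # random policy rollout
        obs, r, done = env.step(np.random.randint(env.n_orbits + 1))
        total += r
        if done:
            break
    print(f"episode return under a random policy: {total:.3f}")

A Deep RL agent would be trained against step() as with any episodic environment; the paper's actual simulator models the orbital mapping problem in far greater detail than this toy interface.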
Pages: 12
Related papers (50 in total)
  • [1] Deep Reinforcement Learning-based policy for autonomous imaging planning of small celestial bodies mapping
    Piccinin, Margherita
    Lunghi, Paolo
    Lavagna, Michele
    AEROSPACE SCIENCE AND TECHNOLOGY, 2022, 120
  • [2] Integrating Neural Radiance Fields With Deep Reinforcement Learning for Autonomous Mapping of Small Bodies
    Wei, Xiaodong
    Cui, Linyan
    Liu, Chuankai
    Yin, Jihao
    IEEE TRANSACTIONS ON GEOSCIENCE AND REMOTE SENSING, 2024, 62
  • [3] Autonomous imaging scheduling networks of small celestial bodies flyby based on deep reinforcement learning
    Hu, Hang
    Wu, Weiren
    Song, Yuqi
    Tao, Wenjian
    Song, Jianing
    Zhang, Jinxiu
    Wang, Jihe
    COMPLEX & INTELLIGENT SYSTEMS, 2024, 10 (03) : 3181 - 3195
  • [4] Autonomous Highway Driving using Deep Reinforcement Learning
    Nageshrao, Subramanya
    Tseng, H. Eric
    Filev, Dimitar
    2019 IEEE INTERNATIONAL CONFERENCE ON SYSTEMS, MAN AND CYBERNETICS (SMC), 2019, : 2326 - 2331
  • [5] ON THE DEVELOPMENT OF AUTONOMOUS AGENTS USING DEEP REINFORCEMENT LEARNING
    Barbu, Clara
    Mocanu, Stefan Alexandru
    UNIVERSITY POLITEHNICA OF BUCHAREST SCIENTIFIC BULLETIN SERIES C-ELECTRICAL ENGINEERING AND COMPUTER SCIENCE, 2021, 83 (03): : 97 - 116
  • [6] Molecular Autonomous Pathfinder Using Deep Reinforcement Learning
    Nomura, Ken-ichi
    Mishra, Ankit
    Sang, Tian
    Kalia, Rajiv K.
    Nakano, Aiichiro
    Vashishta, Priya
    JOURNAL OF PHYSICAL CHEMISTRY LETTERS, 2024, 15 (19): : 5288 - 5294
  • [7] Deep reinforcement learning in autonomous manipulation for celestial bodies exploration: Applications and challenges
    Gao X.
    Tang L.
    Huang H.
    Hangkong Xuebao/Acta Aeronautica et Astronautica Sinica, 2023, 44 (06):
  • [8] Autonomous Navigation & Mapping of Small Bodies
    Pesce, Vincenzo
    Agha-mohammadi, Ali-akbar
    Lavagna, Michele
    2018 IEEE AEROSPACE CONFERENCE, 2018,