Real-Time Visual Localization of the Picking Points for a Ridge-Planting Strawberry Harvesting Robot

Cited by: 93
|
Authors
Yu, Yang [1 ]
Zhang, Kailiang [1 ]
Liu, Hui [1 ]
Yang, Li [1 ]
Zhang, Dongxing [1 ]
Affiliations
[1] China Agr Univ, Coll Engn, Beijing 100083, Peoples R China
Source
IEEE ACCESS | 2020 / Vol. 8 / Issue 08
Funding
National Natural Science Foundation of China;
Keywords
Ridge-planting; harvesting robot; R-YOLO; fruit detection; rotated bounding box; CONVOLUTIONAL NETWORKS; FIELD-EVALUATION; APPLE DETECTION; VISION; FRUIT; RECOGNITION; SYSTEM;
DOI
10.1109/ACCESS.2020.3003034
Chinese Library Classification (CLC)
TP [Automation Technology, Computer Technology];
Discipline Code
0812;
Abstract
At present, the primary technical barrier to the adoption of strawberry harvesting robots is their low harvest rate, so the accuracy and real-time performance of the algorithms that localize the picking point on the strawberry stem must be improved. Estimating the pose of the fruit target (the direction of the fruit axis) can improve the accuracy of the localization algorithm. This study proposes a novel harvesting robot for ridge-planted strawberries together with a fruit pose estimator called rotated YOLO (R-YOLO), which significantly improves the localization precision of the picking points. First, the lightweight network MobileNet-V1 replaced the original convolutional neural network as the backbone for feature extraction; the simplified network structure substantially increased the operating speed. Second, a rotation angle parameter alpha was used to label the training set and to set the anchors, and the rotation of the target fruits' bounding boxes was predicted by logistic regression with the rotated anchors. Tests on a set of 100 strawberry images showed an average recognition rate of 94.43% and a recall rate of 93.46%. The robot's embedded controller processed 18 frames per second (FPS), demonstrating good real-time performance. Compared with several other target detection methods used in fruit harvesting robots, the proposed model exhibited better real-time detection performance and higher localization accuracy of the picking points. Field tests showed that the harvesting success rate reached 84.35% in modified situations. These results provide technical support for improving target detection on the embedded controllers of harvesting robots.
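The abstract describes YOLO-style box decoding extended with a rotation angle parameter alpha predicted by logistic regression against rotated anchors. The sketch below illustrates one plausible form of such a decoder; it is a minimal illustration under stated assumptions, not the authors' implementation, and the function name and the bounded angle-offset formulation are assumptions.

```python
import math

def decode_rotated_box(tx, ty, tw, th, ta,
                       cx, cy, pw, ph, anchor_angle,
                       angle_range=math.pi / 2):
    """Decode raw network outputs into a rotated box (x, y, w, h, alpha).

    Follows standard YOLO decoding for centre and size, with one extra
    output `ta` that is squashed by a sigmoid (logistic regression) to
    a bounded angle offset around the rotated anchor's angle.
    """
    sigmoid = lambda v: 1.0 / (1.0 + math.exp(-v))
    x = cx + sigmoid(tx)            # box centre, offset within the grid cell
    y = cy + sigmoid(ty)
    w = pw * math.exp(tw)           # size scaled from the anchor dimensions
    h = ph * math.exp(th)
    # sigmoid keeps the predicted angle within +/- angle_range/2 of the anchor
    alpha = anchor_angle + (sigmoid(ta) - 0.5) * angle_range
    return x, y, w, h, alpha
```

With zero raw outputs, the decoded box sits half a cell from the grid corner, keeps the anchor's width and height, and inherits the anchor's angle exactly, which is the usual sanity check for this style of parameterization.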
Pages: 116556 - 116568
Page count: 13
Related Papers
50 records in total
  • [41] Real-Time Seam Tracking Technology of Welding Robot with Visual Sensing
    Shen, Hongyuan
    Lin, Tao
    Chen, Shanben
    Li, Laiping
    JOURNAL OF INTELLIGENT & ROBOTIC SYSTEMS, 2010, 59 (3-4) : 283 - 298
  • [42] System for real-time mobile robot control through visual telepresence
    Zalud, L.
Annals of DAAAM for 2004 & Proceedings of the 15th International DAAAM Symposium: INTELLIGENT MANUFACTURING & AUTOMATION: GLOBALISATION - TECHNOLOGY - MEN - NATURE, 2004, : 495 - 496
  • [43] Real-Time Recognition and Localization of Apples for Robotic Picking Based on Structural Light and Deep Learning
    Zhang, Quan
    Su, Wen-Hao
    SMART CITIES, 2023, 6 (06): : 3393 - 3410
  • [44] Cascaded CNN for Real-time Tongue Segmentation Based on Key Points Localization
    Yuan, Wei
    Liu, Changsong
    2019 4TH IEEE INTERNATIONAL CONFERENCE ON BIG DATA ANALYTICS (ICBDA 2019), 2019, : 303 - 307
  • [45] Real-Time Visual-Inertial Localization for Aerial and Ground Robots
    Oleynikova, Helen
    Burri, Michael
    Lynen, Simon
    Siegwart, Roland
    2015 IEEE/RSJ INTERNATIONAL CONFERENCE ON INTELLIGENT ROBOTS AND SYSTEMS (IROS), 2015, : 3079 - 3085
  • [46] Real-Time Visual Place Recognition for Personal Localization on a Mobile Device
    Michał R. Nowicki
    Jan Wietrzykowski
    Piotr Skrzypczyński
    Wireless Personal Communications, 2017, 97 : 213 - 244
  • [47] Application of a Real-Time Visualization Method of AUVs in Underwater Visual Localization
    Wang, Ran
    Wang, Xin
    Zhu, Mingming
    Lin, Yinfu
    APPLIED SCIENCES-BASEL, 2019, 9 (07):
  • [48] Real-Time Visual Place Recognition for Personal Localization on a Mobile Device
    Nowicki, Michal R.
    Wietrzykowski, Jan
    Skrzypczynski, Piotr
    WIRELESS PERSONAL COMMUNICATIONS, 2017, 97 (01) : 213 - 244
  • [49] Real-time gesture recognition by learning and selective control of visual interest points
    Kirishima, T
    Sato, K
    Chihara, K
    IEEE TRANSACTIONS ON PATTERN ANALYSIS AND MACHINE INTELLIGENCE, 2005, 27 (03) : 351 - 364
  • [50] Real-time Vineyard Trunk Detection for a Grapes Harvesting Robot via Deep Learning
    Badeka, Eftichia
    Kalampokas, Theofanis
    Vrochidou, Eleni
    Tziridis, Konstantinos
    Papakostas, George A.
    Pachidis, Theodore
    Kaburlasos, Vassilis G.
    THIRTEENTH INTERNATIONAL CONFERENCE ON MACHINE VISION (ICMV 2020), 2021, 11605