Real-Time Visual Localization of the Picking Points for a Ridge-Planting Strawberry Harvesting Robot

Cited by: 93
Authors:
Yu, Yang [1 ]
Zhang, Kailiang [1 ]
Liu, Hui [1 ]
Yang, Li [1 ]
Zhang, Dongxing [1 ]
Affiliation:
[1] China Agr Univ, Coll Engn, Beijing 100083, Peoples R China
Source:
IEEE ACCESS | 2020, Vol. 8, Issue 08
Funding:
National Natural Science Foundation of China;
Keywords:
Ridge-planting; harvesting robot; R-YOLO; fruit detection; rotated bounding box; CONVOLUTIONAL NETWORKS; FIELD-EVALUATION; APPLE DETECTION; VISION; FRUIT; RECOGNITION; SYSTEM;
DOI
10.1109/ACCESS.2020.3003034
CLC classification:
TP [Automation technology, computer technology];
Discipline classification code:
0812;
Abstract:
At present, the primary technical deterrent to the use of strawberry harvesting robots is their low harvest rate, and there is a need to improve the accuracy and real-time performance of the localization algorithm that detects the picking point on the strawberry stem. Estimating the pose of the fruit target (the direction of the fruit axis) can improve the accuracy of the localization algorithm. This study proposes a novel harvesting robot for ridge-planted strawberries as well as a fruit pose estimator called rotated YOLO (R-YOLO), which significantly improves the localization precision of the picking points. First, the lightweight network MobileNet-V1 replaced the original convolutional neural network as the backbone for feature extraction; the simplified network structure substantially increased the operating speed. Second, the rotation angle parameter alpha was used to label the training set and to define the anchors, and the rotation of the bounding boxes of the target fruits was predicted using logistic regression with the rotated anchors. Tests on a set of 100 strawberry images showed that the proposed model achieved an average recognition rate of 94.43% and a recall rate of 93.46%. The model processed 18 frames per second (FPS) on the robot's embedded controller, demonstrating good real-time performance. Compared with several other target detection methods used for fruit harvesting robots, the proposed model exhibited better performance in terms of real-time detection and localization accuracy of the picking points. Field test results showed that the harvesting success rate reached 84.35% in modified situations. The results of this study provide technical support for improving target detection on the embedded controllers of harvesting robots.
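To make the rotated-box idea in the abstract concrete, the following Python sketch shows how a rotated bounding box and a stem-side picking point could be decoded from YOLO-style offsets against a rotated anchor. This is an illustrative sketch, not the authors' implementation: the offset layout, the logistic activation on the angle offset, the angle convention, and the stem_ratio value are all assumptions made here for clarity.

```python
import math

def sigmoid(x):
    """Logistic function used for the bounded offsets (assumed activation)."""
    return 1.0 / (1.0 + math.exp(-x))

def decode_rotated_box(t, anchor, cell_xy, stride, angle_range=math.pi):
    """Decode one prediction t = (tx, ty, tw, th, ta) against a rotated anchor.

    anchor:  (anchor_w, anchor_h, anchor_alpha) in pixels / radians (assumed layout).
    cell_xy: (col, row) indices of the grid cell; stride: pixels per grid cell.
    alpha is measured counter-clockwise from the image's upward vertical.
    """
    tx, ty, tw, th, ta = t
    aw, ah, a_alpha = anchor
    cx = (cell_xy[0] + sigmoid(tx)) * stride              # box centre x (pixels)
    cy = (cell_xy[1] + sigmoid(ty)) * stride              # box centre y (pixels)
    w = aw * math.exp(tw)                                 # box width
    h = ah * math.exp(th)                                 # box height
    alpha = a_alpha + (sigmoid(ta) - 0.5) * angle_range   # rotation = fruit-axis direction
    return cx, cy, w, h, alpha

def picking_point(cx, cy, h, alpha, stem_ratio=0.15):
    """Place the picking point on the stem, slightly beyond the top of the fruit
    along the fruit axis. stem_ratio (fraction of box height) is an assumption."""
    offset = (0.5 + stem_ratio) * h
    px = cx + math.sin(alpha) * offset                    # image x of picking point
    py = cy - math.cos(alpha) * offset                    # image y (y grows downward)
    return px, py

# Example: a fruit tilted about 20 degrees from vertical in grid cell (7, 5).
cx, cy, w, h, alpha = decode_rotated_box(
    t=(0.2, -0.1, 0.05, 0.1, 0.6),
    anchor=(40.0, 60.0, math.radians(20)),
    cell_xy=(7, 5),
    stride=32,
)
print(picking_point(cx, cy, h, alpha))
```

Under these assumptions, the predicted angle both orients the bounding box and gives the fruit-axis direction along which the gripper approaches the stem.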
Pages: 116556 - 116568
Number of pages: 13