Real-time plant phenomics under robotic farming setup: A vision-based platform for complex plant phenotyping tasks

Cited by: 11
Authors
Arunachalam, Ajay [1]
Andreasson, Henrik [1]
Affiliations
[1] Orebro Univ, Ctr Appl Autonomous Sensor Syst AASS, Orebro, Sweden
Keywords
Phenotype; Image processing; Spectral; Robotics; Object localization; Precision agriculture; Plant science; Pattern recognition; Computer vision; Automation; Perception; LEAF PIGMENT CONTENT; SYSTEM; GROWTH;
DOI
10.1016/j.compeleceng.2021.107098
Chinese Library Classification (CLC)
TP3 [computing technology, computer technology]
Subject classification code
0812
Abstract
Plant phenotyping generally refers to the quantitative estimation of a plant's anatomical, ontogenetical, physiological, and biochemical properties. Analyzing the resulting large volumes of data is challenging and non-trivial given the complexities involved, and efficient processing and analysis pipelines are increasingly needed as phenotyping technologies and sensors gain popularity. In this work, we primarily address the problem of segmenting and localizing overlapping objects. We further examine multi-plant pipelines, where detection and multi-object tracking over a single frame or a set of frames become critical for uniform tagging and visual feature extraction. A plant phenotyping tool named RTPP (Real-Time Plant Phenotyping) is presented that supports the detection of single- and multi-plant traits, modeling, and visualization in agricultural settings. We compare our system with the plantCV platform, and discuss the relationship between the digital estimations and the measured plant traits, which provides a roadmap towards precision farming and/or plant breeding.
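To make the segmentation-and-localization task in the abstract concrete, the sketch below shows one common way such a step can be approached. It is a hypothetical OpenCV/NumPy pipeline, not the RTPP or plantCV implementation: it assumes a top-down RGB image of a tray or plot, builds a vegetation mask from an excess-green index, splits touching or overlapping plants with a distance-transform watershed, and returns one bounding box per plant.

# Hypothetical sketch (not the RTPP/plantCV code): segment vegetation,
# split overlapping plants with a watershed, and localize each plant.
import cv2
import numpy as np

def segment_and_localize(bgr_image: np.ndarray):
    """Return watershed labels and per-plant bounding boxes (x0, y0, x1, y1)."""
    # 1. Vegetation mask from the excess-green index (2G - R - B) + Otsu threshold.
    b, g, r = cv2.split(bgr_image.astype(np.float32))
    exg = 2.0 * g - r - b
    exg = cv2.normalize(exg, None, 0, 255, cv2.NORM_MINMAX).astype(np.uint8)
    _, mask = cv2.threshold(exg, 0, 255, cv2.THRESH_BINARY + cv2.THRESH_OTSU)
    mask = cv2.morphologyEx(mask, cv2.MORPH_OPEN, np.ones((5, 5), np.uint8))

    # 2. Distance-transform watershed to separate touching/overlapping plants.
    dist = cv2.distanceTransform(mask, cv2.DIST_L2, 5)
    _, sure_fg = cv2.threshold(dist, 0.5 * dist.max(), 255, cv2.THRESH_BINARY)
    sure_fg = sure_fg.astype(np.uint8)
    unknown = cv2.subtract(mask, sure_fg)
    n_labels, markers = cv2.connectedComponents(sure_fg)
    markers = markers + 1              # background -> 1, plant seeds -> 2..n_labels
    markers[unknown == 255] = 0        # region left for the watershed to resolve
    markers = cv2.watershed(bgr_image, markers)

    # 3. One axis-aligned bounding box per separated plant.
    boxes = []
    for label in range(2, n_labels + 1):   # skip background (1) and ridges (-1)
        ys, xs = np.where(markers == label)
        if xs.size:
            boxes.append((int(xs.min()), int(ys.min()), int(xs.max()), int(ys.max())))
    return markers, boxes

# Usage: markers, boxes = segment_and_localize(cv2.imread("tray.jpg"))

The excess-green index and the watershed seeds are illustrative choices; a real multi-plant pipeline such as the one described in the paper would also need frame-to-frame tracking to keep plant identities consistent for trait extraction.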
Pages: 12
Related papers (50 in total)
  • [1] Deep Plant Phenomics: A Deep Learning Platform for Complex Plant Phenotyping Tasks
    Ubbens, Jordan R.
    Stavness, Ian
    FRONTIERS IN PLANT SCIENCE, 2017, 8
  • [2] Deep Plant Phenomics: A Deep Learning Platform for Complex Plant Phenotyping Tasks (vol 8, 1190, 2017)
    Ubbens, Jordan R.
    Stavness, Ian
    FRONTIERS IN PLANT SCIENCE, 2018, 8
  • [3] A Telemetric, Gravimetric Platform for Real-Time Physiological Phenotyping of Plant Environment Interactions
    Dalal, Ahan
    Shenhar, Itamar
    Bourstein, Ronny
    Mayo, Amir
    Grunwald, Yael
    Averbuch, Nir
    Attia, Ziv
    Wallach, Rony
    Moshelion, Menachem
    JOVE-JOURNAL OF VISUALIZED EXPERIMENTS, 2020, (162): 1 - 28
  • [4] An effective strategy of real-time vision-based control for a Stewart platform
    Rossell, Josep M.
    Vicente-Rodrigo, Jesus
    Rubio-Massegu, Josep
    Barcons, Victor
    2018 IEEE INTERNATIONAL CONFERENCE ON INDUSTRIAL TECHNOLOGY (ICIT), 2018, : 75 - 80
  • [5] A real-time computer vision-based platform for fabric inspection part 2: platform design and real-time implementation
    Zhou, Jian
    Li, Guanzhi
    Wan, Xianfu
    Wang, Jun
    JOURNAL OF THE TEXTILE INSTITUTE, 2016, 107 (02) : 264 - 272
  • [6] Vision-based SLAM in real-time
    Davison, Andrew J.
    Pattern Recognition and Image Analysis, Pt 1, Proceedings, 2007, 4477 : 9 - 12
  • [7] A study on vision-based real-time seam tracking in robotic arc welding
    Shen, H. Y.
    Lin, T.
    Chen, S. B.
    ROBOTIC WELDING, INTELLIGENCE AND AUTOMATION, 2007, 362 : 311 - +
  • [8] Vision-based integrated mobile robotic system for real-time applications in construction
    Asadi, Khashayar
    Ramshankar, Hariharan
    Pullagurla, Harish
    Bhandare, Aishwarya
    Shanbhag, Suraj
    Mehta, Pooja
    Kundu, Spondon
    Han, Kevin
    Lobaton, Edgar
    Wu, Tianfu
    AUTOMATION IN CONSTRUCTION, 2018, 96 : 470 - 482
  • [9] Real-time vision-based object tracking from a moving platform in the air
    Ding, Wei
    Gong, Zhenbang
    Xie, Shaorong
    Zou, Hairong
    2006 IEEE/RSJ INTERNATIONAL CONFERENCE ON INTELLIGENT ROBOTS AND SYSTEMS, VOLS 1-12, 2006, : 681 - +
  • [10] Vision-Based Deep Learning Approach for Real-Time Detection of Weeds in Organic Farming
    Czymmek, Vitali
    Harders, Leif O.
    Knoll, Florian J.
    Hussmann, Stephan
    2019 IEEE INTERNATIONAL INSTRUMENTATION AND MEASUREMENT TECHNOLOGY CONFERENCE (I2MTC), 2019, : 585 - 589