Krishiq-BERT: A Few-Shot Setting BERT Model to Answer Agricultural-Related Questions in the Kannada Language

Cited by: 1
Authors
Ajawan P. [1 ]
Desai V. [1 ]
Kale S. [1 ]
Patil S. [1 ]
Affiliation
[1] Department of ECE, KLSGIT, Belagavi, Karnataka
Keywords
Agriculture; DistilBERT; Few-shot setting; IndicBERT; Krishiq-BERT; NLP; Question and answering; Reader; Retriever
DOI
10.1007/s40031-023-00952-6
Abstract
The integration of data-driven technologies with traditional agricultural techniques has the potential to transform farming practices. Precise and trustworthy data are essential for applying data-driven technologies in the Indian agriculture sector, yet there is currently no structured, unified source from which farmers can obtain reliable answers to their agricultural queries. To address this challenge, this paper introduces (a) the Krishiq Agricultural dataset and (b) Krishiq-BERT. The Krishiq Agricultural dataset is a structured Kannada-language document curated from crop information obtained from the University of Agricultural Sciences, Dharwad; it covers 24 widely cultivated crops along with government schemes and practices relevant to farmers. Krishiq-BERT is a Kannada-language, few-shot, closed-domain question-answering system that returns answer spans for agricultural queries. It is built as a retriever–reader pipeline: a Haystack Elasticsearch retriever filters the corpus down to the 10 documents most relevant to the question asked, and a fine-tuned HuggingFace DistilBERT reader searches the top-ranked of these k filtered documents to produce the final answer span. For the agricultural-domain question-answering task in a few-shot setting, the Krishiq-BERT model achieves an average F1 score of 61.16. © The Institution of Engineers (India) 2024.
Pages: 285 - 296
Page count: 11
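The retriever–reader pipeline described in the abstract above can be assembled from off-the-shelf components. Below is a minimal sketch using the Haystack 1.x API; the Elasticsearch host, the index name `krishiq_docs`, the multilingual DistilBERT checkpoint, and the example Kannada query are illustrative assumptions and are not taken from the paper.

```python
# Minimal retriever-reader sketch (Haystack 1.x API).
# Assumptions not from the paper: host/index names, the DistilBERT
# checkpoint, and the example query are placeholders for illustration.
from haystack.document_stores import ElasticsearchDocumentStore
from haystack.nodes import BM25Retriever, FARMReader
from haystack.pipelines import ExtractiveQAPipeline

# Elasticsearch-backed store holding the Krishiq Agricultural documents
document_store = ElasticsearchDocumentStore(host="localhost", index="krishiq_docs")

# Sparse Elasticsearch retriever: narrows the corpus to the k most
# relevant documents for the query
retriever = BM25Retriever(document_store=document_store)

# Extractive reader: a DistilBERT checkpoint that returns answer spans
# (the paper fine-tunes its own Kannada model; this checkpoint is a stand-in)
reader = FARMReader(model_name_or_path="distilbert-base-multilingual-cased")

# Wire retriever and reader into a single extractive QA pipeline
pipeline = ExtractiveQAPipeline(reader=reader, retriever=retriever)

# The retriever keeps the top 10 documents and the reader extracts
# the best answer span from them
result = pipeline.run(
    query="ಭತ್ತದ ಬೆಳೆಗೆ ಯಾವ ಮಣ್ಣು ಸೂಕ್ತ?",  # e.g., "Which soil suits paddy?"
    params={"Retriever": {"top_k": 10}, "Reader": {"top_k": 1}},
)
print(result["answers"][0].answer)
```

This mirrors the two-stage design reported in the abstract: a lexical retriever keeps inference cheap by pruning the document collection before the transformer reader, which only runs over the handful of retrieved passages.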