Patent classification with pre-trained Bert model

Cited: 0
Authors
Kahraman, Selen Yücesoy [1]
Durmuşoğlu, Alptekin [2]
Dereli, Türkay [3]
Affiliations
[1] Gaziantep Univ, Fac Engn, Dept Ind Engn, TR-27310 Şehitkamil, Gaziantep, Türkiye
[2] Samsun Univ, Fac Engn & Nat Sci, Dept Ind Engn, TR-55420 Ondokuzmayıs, Samsun, Türkiye
[3] Hasan Kalyoncu Univ, Off President, TR-27310 Şahinbey, Gaziantep, Türkiye
Keywords
Patent classification; BERT; deep learning; text classification; hyperparameter test
DOI
10.17341/gazimmfd.1292543
Chinese Library Classification (CLC)
T [Industrial Technology]
Discipline Classification Code
08
Abstract
Patents are documents that protect innovations in information technologies and grant their creators exclusive rights for a limited period. These rights allow the patent owner to exploit the innovation commercially while preventing others from using it without permission. Radical innovations and ground-breaking technological advances build on the technical information contained in existing patents. An automatic classification system that assigns patents to the technical classes to which they belong can therefore pave the way for researchers and provide an environment in which new inventions can emerge. This study presents an automatic patent classification analysis using the BERT algorithm. Hyperparameter analyses are also carried out to achieve higher prediction accuracy in the automatic patent classification problem. The results obtained are competitive with those reported in the literature; an accuracy of 58% was achieved at the subclass level.
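
The abstract outlines fine-tuning a pre-trained BERT model for subclass-level patent classification together with a hyperparameter search. Below is a minimal sketch of that setup, assuming the Hugging Face transformers library; the bert-base-uncased checkpoint, the subclass count of 600, the hyperparameter grid, and the example abstract are illustrative assumptions rather than details taken from the paper.

# A minimal sketch of BERT-based patent classification with Hugging Face
# transformers. Checkpoint, label count, grid values, and the sample text
# are illustrative assumptions, not values from the paper.
import itertools

import torch
from transformers import BertForSequenceClassification, BertTokenizer

NUM_SUBCLASSES = 600  # hypothetical count of IPC subclass labels

tokenizer = BertTokenizer.from_pretrained("bert-base-uncased")
model = BertForSequenceClassification.from_pretrained(
    "bert-base-uncased", num_labels=NUM_SUBCLASSES
)

# Encode one patent abstract and predict its subclass with the
# (still untrained) classification head.
abstract = "A rotor blade assembly for a wind turbine, comprising ..."
inputs = tokenizer(abstract, truncation=True, max_length=512, return_tensors="pt")
with torch.no_grad():
    logits = model(**inputs).logits
print("predicted subclass id:", logits.argmax(dim=-1).item())

# An illustrative hyperparameter grid of the kind the abstract alludes to:
# each combination would drive one fine-tuning run, and the setting with
# the best validation accuracy would be kept.
grid = {
    "learning_rate": [2e-5, 3e-5, 5e-5],
    "batch_size": [16, 32],
    "epochs": [2, 3, 4],
}
for lr, bs, ep in itertools.product(*grid.values()):
    pass  # fine-tune with (lr, bs, ep) and record validation accuracy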
Pages: 2485-2496
Number of pages: 12
Related Papers
50 records
  • [1] Leveraging Pre-trained BERT for Audio Captioning
    Liu, Xubo
    Mei, Xinhao
    Huang, Qiushi
    Sun, Jianyuan
    Zhao, Jinzheng
    Liu, Haohe
    Plumbley, Mark D.
    Kilic, Volkan
    Wang, Wenwu
    2022 30TH EUROPEAN SIGNAL PROCESSING CONFERENCE (EUSIPCO 2022), 2022: 1145-1149
  • [2] Research on Chinese Intent Recognition Based on BERT pre-trained model
    Zhang, Pan
    Huang, Li
    2020 5TH INTERNATIONAL CONFERENCE ON MATHEMATICS AND ARTIFICIAL INTELLIGENCE (ICMAI 2020), 2020: 128-132
  • [3] miProBERT: identification of microRNA promoters based on the pre-trained model BERT
    Wang, Xin
    Gao, Xin
    Wang, Guohua
    Li, Dan
    BRIEFINGS IN BIOINFORMATICS, 2023, 24 (03)
  • [4] On Cognitive Level Classification of Assessment-items Using Pre-trained BERT-based Model
    Dipto, Adnan Saif
    Limon, Md. Mahmudur Rahman
    Tuba, Fatima Tanjum
    Uddin, Md Mohsin
    Khan, M. Saddam Hossain
    Tuhin, Rashedul Amin
    PROCEEDINGS OF 2023 7TH INTERNATIONAL CONFERENCE ON NATURAL LANGUAGE PROCESSING AND INFORMATION RETRIEVAL, NLPIR 2023, 2023: 245-251
  • [5] BERT-siRNA: siRNA target prediction based on BERT pre-trained interpretable model
    Xu, Jiayu
    Xu, Nan
    Xie, Weixin
    Zhao, Chengkui
    Yu, Lei
    Feng, Weixing
    GENE, 2024, 910
  • [6] Session Search with Pre-trained Graph Classification Model
    Ma, Shengjie
    Chen, Chong
    Mao, Jiaxin
    Tian, Qi
    Jiang, Xuhui
    PROCEEDINGS OF THE 46TH INTERNATIONAL ACM SIGIR CONFERENCE ON RESEARCH AND DEVELOPMENT IN INFORMATION RETRIEVAL, SIGIR 2023, 2023: 953-962
  • [7] The Lottery Ticket Hypothesis for Pre-trained BERT Networks
    Chen, Tianlong
    Frankle, Jonathan
    Chang, Shiyu
    Liu, Sijia
    Zhang, Yang
    Wang, Zhangyang
    Carbin, Michael
    ADVANCES IN NEURAL INFORMATION PROCESSING SYSTEMS 33, NEURIPS 2020, 2020, 33
  • [8] Modeling essay grading with pre-trained BERT features
    Sharma, Annapurna
    Jayagopi, Dinesh Babu
    APPLIED INTELLIGENCE, 2024, 54 (06): 4979-4993
  • [9] A Comparative Study on Pre-Trained Models Based on BERT
    Zhang, Minghua
    2024 6TH INTERNATIONAL CONFERENCE ON NATURAL LANGUAGE PROCESSING, ICNLP 2024, 2024: 326-330
  • [10] Sharing Pre-trained BERT Decoder for a Hybrid Summarization
    Wei, Ran
    Huang, Heyan
    Gao, Yang
    CHINESE COMPUTATIONAL LINGUISTICS, CCL 2019, 2019, 11856: 169-180