A Transformer-based Prior Legal Case Retrieval Method

Cited: 0
Authors
Ozturk, Ceyhun E. [1 ]
Ozcelik, S. Baris [2 ]
Koc, Aykut [3 ]
Affiliations
[1] Bilkent University, Department of Electrical and Electronics Engineering, ASELSAN Research Center, Ankara, Turkey
[2] Bilkent University, Faculty of Law, Ankara, Turkey
[3] Bilkent University, Department of Electrical and Electronics Engineering, National Magnetic Resonance Research Center, Ankara, Turkey
Keywords
Natural language processing; legal tech; deep learning; prior legal case retrieval; legal NLP; Turkish NLP
DOI
10.1109/SIU59756.2023.10223938
Chinese Library Classification
TP18 [Artificial Intelligence Theory]
Subject Classification Codes
081104; 0812; 0835; 1405
Abstract
In this work, BERTurk-Legal, a transformer-based language model, is introduced for retrieving prior legal cases. BERTurk-Legal is pre-trained on a dataset from the Turkish legal domain that contains no labels related to the prior court case retrieval task; masked language modeling is used to train it in a self-supervised manner. Used with zero-shot classification, BERTurk-Legal achieves state-of-the-art results on a dataset of legal cases from the Court of Cassation of Turkey. The experimental results demonstrate the necessity of developing language models specific to the Turkish legal domain.
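
The abstract describes two stages: self-supervised masked language modeling (MLM) pre-training on unlabeled Turkish legal text, followed by zero-shot use of the resulting model to find relevant prior cases. The sketch below is a hedged illustration of that pipeline, not the paper's code: the checkpoint dbmdz/bert-base-turkish-cased (BERTurk) stands in for the paper's actual pre-training setup, the corpus and hyperparameters are placeholders, and the retrieval step ranks candidates by cosine similarity of mean-pooled embeddings, which is one common way to realize zero-shot ranking (the paper itself reports zero-shot classification).

```python
# Hedged sketch of the pipeline described in the abstract, using Hugging Face
# transformers/datasets. Checkpoint, corpus, and hyperparameters are placeholders.
import torch
from datasets import Dataset
from transformers import (AutoModel, AutoModelForMaskedLM, AutoTokenizer,
                          DataCollatorForLanguageModeling, Trainer, TrainingArguments)

base_ckpt = "dbmdz/bert-base-turkish-cased"   # BERTurk, as a stand-in base model
tokenizer = AutoTokenizer.from_pretrained(base_ckpt)

# --- Stage 1: self-supervised MLM pre-training on unlabeled legal text ---
legal_texts = ["...", "..."]                  # placeholder Turkish legal corpus
ds = Dataset.from_dict({"text": legal_texts}).map(
    lambda batch: tokenizer(batch["text"], truncation=True, max_length=512),
    batched=True, remove_columns=["text"])
collator = DataCollatorForLanguageModeling(tokenizer, mlm=True, mlm_probability=0.15)

mlm_model = AutoModelForMaskedLM.from_pretrained(base_ckpt)
trainer = Trainer(
    model=mlm_model,
    args=TrainingArguments(output_dir="berturk-legal-mlm",
                           per_device_train_batch_size=8, num_train_epochs=1),
    train_dataset=ds,
    data_collator=collator,
)
trainer.train()
trainer.save_model("berturk-legal-mlm")

# --- Stage 2: zero-shot retrieval by embedding similarity (illustrative) ---
encoder = AutoModel.from_pretrained("berturk-legal-mlm")
encoder.eval()

def embed(text: str) -> torch.Tensor:
    """Mean-pool the last hidden states into a single case embedding."""
    inputs = tokenizer(text, truncation=True, max_length=512, return_tensors="pt")
    with torch.no_grad():
        hidden = encoder(**inputs).last_hidden_state        # (1, seq_len, dim)
    mask = inputs["attention_mask"].unsqueeze(-1)            # (1, seq_len, 1)
    return (hidden * mask).sum(dim=1) / mask.sum(dim=1)      # (1, dim)

def retrieve(query_case: str, candidate_cases: list[str], top_k: int = 5) -> list[int]:
    """Rank candidate prior cases by cosine similarity to the query case."""
    q = embed(query_case)
    scores = [torch.cosine_similarity(q, embed(c)).item() for c in candidate_cases]
    return sorted(range(len(scores)), key=lambda i: scores[i], reverse=True)[:top_k]
```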
Pages: 4
Related Papers
50 records in total
  • [41] Docalog: Multi-document Dialogue System using Transformer-based Span Retrieval
    Alavian, Sayed Hesam
    Satvaty, Ali
    Sabouri, Sadra
    Asgari, Ehsaneddin
    Sameti, Hossein
    PROCEEDINGS OF THE SECOND DIALDOC WORKSHOP ON DOCUMENT-GROUNDED DIALOGUE AND CONVERSATIONAL QUESTION ANSWERING (DIALDOC 2022), 2022, : 142 - 147
  • [42] ShapeFormer: Shape Prior Visible-to-Amodal Transformer-based Amodal Instance Segmentation
    Tran, Minh
    Institute of Electrical and Electronics Engineers Inc.
  • [43] Transformer-Based Learned Optimization
    Gartner, Erik
    Metz, Luke
    Andriluka, Mykhaylo
    Freeman, C. Daniel
    Sminchisescu, Cristian
    2023 IEEE/CVF CONFERENCE ON COMPUTER VISION AND PATTERN RECOGNITION (CVPR), 2023, : 11970 - 11979
  • [44] Transformer-based Image Compression
    Lu, Ming
    Guo, Peiyao
    Shi, Huiqing
    Cao, Chuntong
    Ma, Zhan
    DCC 2022: 2022 DATA COMPRESSION CONFERENCE (DCC), 2022, : 469 - 469
  • [45] Transformer-Based Microbubble Localization
    Gharamaleki, Sepideh K.
    Helfield, Brandon
    Rivaz, Hassan
    2022 IEEE INTERNATIONAL ULTRASONICS SYMPOSIUM (IEEE IUS), 2022
  • [46] Transformer-Based Discriminative and Strong Representation Deep Hashing for Cross-Modal Retrieval
    Zhou, Suqing
    Han, Yu
    Chen, Ning
    Huang, Siyu
    Igorevich, Kostromitin Konstantin
    Luo, Jia
    Zhang, Peiying
    IEEE ACCESS, 2023, 11 : 140041 - 140055
  • [47] A Hybrid Transformer-Based Framework for Multi-Document Summarization of Turkish Legal Documents
    Albayati, Maha Ahmed Abdullah
    Findik, Oguz
    IEEE ACCESS, 2025, 13 : 37165 - 37181
  • [48] Transformer-Based Approaches for Legal Text Processing JNLP Team-COLIEE 2021
    Nguyen, Ha-Thanh
    Nguyen, Minh-Phuong
    Vuong, Thi-Hai-Yen
    Bui, Minh-Quan
    Nguyen, Minh-Chau
    Dang, Tran-Binh
    Tran, Vu
    Nguyen, Le-Minh
    Satoh, Ken
    REVIEW OF SOCIONETWORK STRATEGIES, 2022, 16 (01) : 135 - 155
  • [49] A Novel Transformer-Based Object Detection Method With Geometric and Object Co-Occurrence Prior Knowledge for Remote Sensing Images
    Mo, Nan
    Zhu, Ruixi
    IEEE JOURNAL OF SELECTED TOPICS IN APPLIED EARTH OBSERVATIONS AND REMOTE SENSING, 2025, 18 : 2383 - 2400
  • [50] Joint Learning Method Based On Transformer For Image Retrieval
    Wei, Hongxi
    He, Chao
    2022 INTERNATIONAL JOINT CONFERENCE ON NEURAL NETWORKS (IJCNN), 2022