Enhanced YOLOv5: An Efficient Road Object Detection Method

Cited by: 13
Authors
Chen, Hao [1]
Chen, Zhan [1]
Yu, Hang [1]
Affiliations
[1] Tianjin Chengjian Univ, Sch Comp & Informat Engn, Tianjin 300384, Peoples R China
Keywords
intelligent traffic; enhanced YOLOv5; multi-scale; road object detection;
DOI
10.3390/s23208355
Chinese Library Classification (CLC)
O65 [Analytical Chemistry];
Subject Classification Codes
070302; 081704;
Abstract
Accurate identification of road objects is crucial for intelligent traffic systems, yet building efficient and accurate road object detectors for complex traffic scenarios remains challenging. The objective of this study is to improve road object detection by strengthening the detector's ability to fuse features across different scales and levels, and thereby to identify objects more reliably in complex road scenes. We propose Enhanced YOLOv5, an improved road object detection method. Introducing the Bidirectional Feature Pyramid Network (BiFPN) into YOLOv5 addresses multi-scale, multi-level feature fusion and improves detection of objects of different sizes. We further integrate the Convolutional Block Attention Module (CBAM) into the YOLOv5 model to strengthen its feature representation, and we replace standard non-maximum suppression with Distance Intersection over Union (DIoU) based suppression, which mitigates misjudgments and duplicate detections when bounding boxes overlap heavily. Mean Average Precision (mAP) and Precision (P) serve as evaluation metrics. Experimental results on the BDD100K dataset show that the improved YOLOv5 increases object detection mAP by 1.6% and Precision by 5.3%, effectively improving the accuracy and robustness of road object recognition.
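The DIoU-based suppression step described above can be sketched concisely. The following NumPy code is a minimal, illustrative re-implementation rather than the authors' code: DIoU subtracts the normalized squared distance between box centres from the ordinary IoU, and a lower-scoring box is discarded only when its DIoU with an already-kept box exceeds a threshold. The (x1, y1, x2, y2) box format and the 0.5 threshold are assumptions made for this example.

import numpy as np

def diou(box, boxes):
    # DIoU = IoU - d^2 / c^2, where d is the distance between box centres and
    # c is the diagonal of the smallest box enclosing both boxes.
    x1 = np.maximum(box[0], boxes[:, 0]); y1 = np.maximum(box[1], boxes[:, 1])
    x2 = np.minimum(box[2], boxes[:, 2]); y2 = np.minimum(box[3], boxes[:, 3])
    inter = np.clip(x2 - x1, 0, None) * np.clip(y2 - y1, 0, None)
    area_a = (box[2] - box[0]) * (box[3] - box[1])
    area_b = (boxes[:, 2] - boxes[:, 0]) * (boxes[:, 3] - boxes[:, 1])
    iou = inter / (area_a + area_b - inter + 1e-9)
    # Squared centre distance.
    cxa, cya = (box[0] + box[2]) / 2, (box[1] + box[3]) / 2
    cxb, cyb = (boxes[:, 0] + boxes[:, 2]) / 2, (boxes[:, 1] + boxes[:, 3]) / 2
    d2 = (cxa - cxb) ** 2 + (cya - cyb) ** 2
    # Squared diagonal of the enclosing box.
    ex1 = np.minimum(box[0], boxes[:, 0]); ey1 = np.minimum(box[1], boxes[:, 1])
    ex2 = np.maximum(box[2], boxes[:, 2]); ey2 = np.maximum(box[3], boxes[:, 3])
    c2 = (ex2 - ex1) ** 2 + (ey2 - ey1) ** 2 + 1e-9
    return iou - d2 / c2

def diou_nms(boxes, scores, threshold=0.5):
    # Greedy NMS that suppresses by DIoU instead of plain IoU.
    order = scores.argsort()[::-1]
    keep = []
    while order.size > 0:
        i = order[0]
        keep.append(i)
        if order.size == 1:
            break
        rest = order[1:]
        order = rest[diou(boxes[i], boxes[rest]) <= threshold]
    return keep

In practice, two heavily overlapping detections of the same vehicle collapse to the higher-scoring one, while two nearby but distinct vehicles can both survive because the centre-distance term lowers their DIoU below the threshold.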
Pages: 21
Related papers (50 in total)
  • [31] DDVC-YOLOv5: An Improved YOLOv5 Model for Road Defect Detection. Zhong, Shihao; Chen, Chunlin; Luo, Wensheng; Chen, Siyuan. IEEE ACCESS, 2024, 12: 134008-134019
  • [32] Improved YOLOv5 Method for Fall Detection. Peng, Jun; He, Yuanmin; Jin, Shangzhu; Dai, Haojun; Peng, Fei; Zhang, Yuhao. 2022 IEEE 17TH CONFERENCE ON INDUSTRIAL ELECTRONICS AND APPLICATIONS (ICIEA), 2022: 504-509
  • [33] YOLOv4 Vs YOLOv5: Object Detection on Surveillance Videos. Mohod, Nikita; Agrawal, Prateek; Madaan, Vishu. ADVANCED NETWORK TECHNOLOGIES AND INTELLIGENT COMPUTING, ANTIC 2022, PT II, 2023, 1798: 654-665
  • [34] Enhanced Multi-scale Target Detection Method Based on YOLOv5. Hui, K.; Yang, W.; Liu, H.; Zhang, Z.; Zheng, J.; Bai, X. Binggong Xuebao/Acta Armamentarii, 2023, 44 (09): 2600-2610
  • [35] An Improved YOLOv5 Algorithm for Vulnerable Road User Detection. Yang, Wei; Tang, Xiaolin; Jiang, Kongming; Fu, Yang; Zhang, Xinling. SENSORS, 2023, 23 (18)
  • [36] Lightweight Road Damage Detection Network Based on YOLOv5. Zhao, Jingwei; Tao, Ye; Zhang, Zhixian; Huang, Chao; Cui, Wenhua. ENGINEERING LETTERS, 2024, 32 (08): 1708-1720
  • [37] Improved YOLOv5 Detection Algorithm of Road Disease Image. Yang, Jie; Jia, Xinyu; Guo, Xiaoyan; Zang, Chuanyan; Zhao, Haiyan; Sun, Jiacheng; Xu, Yan. 2023 IEEE 6th International Conference on Pattern Recognition and Artificial Intelligence (PRAI 2023), 2023: 193-198
  • [38] A Workpiece-Dense Scene Object Detection Method Based on Improved YOLOv5. Liu, Jiajia; Zhang, Shun; Ma, Zhongli; Zeng, Yuehan; Liu, Xueyin. ELECTRONICS, 2023, 12 (13)
  • [39] An Efficient and Intelligent Detection Method for Fabric Defects based on Improved YOLOv5. Lin, Guijuan; Liu, Keyu; Xia, Xuke; Yan, Ruopeng. SENSORS, 2023, 23 (01)
  • [40] Object Detection Method in Open-pit Mine Based on Improved YOLOv5. Qin, X.; Huang, Q.; Chang, D.; Liu, J.; Hu, M.; Xu, B.; Xie, G. Hunan Daxue Xuebao/Journal of Hunan University Natural Sciences, 2023, 50 (02): 23-30