Novel feature extraction of underwater targets by encoding hydro-acoustic signatures as image

Citations: 4
Authors
Zare, Mehdi [1 ]
Nouri, Nowrouz Mohammad [1 ]
Affiliations
[1] Iran Univ Sci & Technol, Dept Mech Engn, Tehran, Iran
Keywords
Hydro-acoustic signature; Feature extraction; Gramian angular field; Gray-level co-occurrence matrix; Image texture measure; Second-order image statistics; Co-occurrence texture statistics; Permutation entropy; Noise; Classification; Identification; Equation; Representation; Models
DOI
10.1016/j.apor.2023.103627
CLC classification (Chinese Library Classification)
P75 [Ocean Engineering];
Discipline classification codes
0814; 081505; 0824; 082401
Abstract
Underwater vessel-radiated acoustic noise (UVRAN) is a primary cue for sonar-based target classification at sea. Because the maritime environment is unsteady and complex, analyzing underwater sound signals is challenging and has recently attracted attention in the marine field. Conventional feature extraction methods reduce the effect of ocean noise by de-noising the signal with mode decomposition techniques before measuring its complexity. In contrast, we propose, for the first time, a signal-to-image conversion that identifies the objects generating underwater noise from their hydro-acoustic signatures without any noise removal. After pre-processing, the spectral amplitude mean difference function is encoded into an image with the Gramian angular field (GAF) technique. Image texture analysis is then performed by subjecting the GAF images to the gray-level co-occurrence matrix (GLCM). Finally, a second-order image statistic, the two-dimensional (2-D) permutation entropy, is calculated. Compared with existing methods, the proposed approach separates the various classes of underwater targets with greater margin and stability, and the model is robust to noise. The approach may open an alternative path for UVRAN discrimination.
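The pipeline described in the abstract (GAF encoding, GLCM texture analysis, 2-D permutation entropy) can be sketched end-to-end. The Python sketch below is illustrative only: the synthetic input series stands in for the spectral amplitude mean difference function, the quantisation level, GLCM offset, and 2x2 ordinal-pattern size are assumed values not taken from the paper, and applying the entropy to the GLCM (rather than to the GAF image directly) follows one reading of the abstract.

```python
import math
from itertools import permutations

import numpy as np


def gramian_angular_field(series, summation=True):
    """Encode a 1-D series as a Gramian angular field image (GASF by default)."""
    x = np.asarray(series, dtype=float)
    # Rescale to [-1, 1] so that arccos is defined for every sample.
    x_min, x_max = x.min(), x.max()
    x_tilde = 2.0 * (x - x_min) / (x_max - x_min + 1e-12) - 1.0
    phi = np.arccos(np.clip(x_tilde, -1.0, 1.0))
    if summation:                                   # GASF: cos(phi_i + phi_j)
        return np.cos(phi[:, None] + phi[None, :])
    return np.sin(phi[:, None] - phi[None, :])      # GADF variant


def glcm(image, levels=16, offset=(0, 1)):
    """Normalised gray-level co-occurrence matrix for a single pixel offset."""
    # Quantise the real-valued image into `levels` gray levels.
    span = image.max() - image.min() + 1e-12
    q = np.floor((image - image.min()) / span * (levels - 1)).astype(int)
    dr, dc = offset
    rows, cols = q.shape
    mat = np.zeros((levels, levels), dtype=float)
    for r in range(max(0, -dr), rows - max(0, dr)):
        for c in range(max(0, -dc), cols - max(0, dc)):
            mat[q[r, c], q[r + dr, c + dc]] += 1.0
    return mat / mat.sum()


def permutation_entropy_2d(image, dx=2, dy=2):
    """Normalised 2-D permutation entropy over dx-by-dy ordinal patterns."""
    counts = {p: 0 for p in permutations(range(dx * dy))}
    rows, cols = image.shape
    for r in range(rows - dx + 1):
        for c in range(cols - dy + 1):
            patch = image[r:r + dx, c:c + dy].ravel()
            counts[tuple(np.argsort(patch))] += 1
    freq = np.array([v for v in counts.values() if v > 0], dtype=float)
    p = freq / freq.sum()
    # Normalise by the maximum attainable entropy, log((dx*dy)!).
    return float(-(p * np.log(p)).sum() / math.log(math.factorial(dx * dy)))


# Toy comparison on synthetic series (illustrative only, not ship-noise data).
rng = np.random.default_rng(0)
tonal = np.sin(np.linspace(0.0, 8.0 * np.pi, 128)) + 0.2 * rng.standard_normal(128)
broadband = rng.standard_normal(128)
for name, sig in (("tonal-like", tonal), ("broadband-like", broadband)):
    entropy = permutation_entropy_2d(glcm(gramian_angular_field(sig)))
    print(name, round(entropy, 3))
```

The GLCM is computed here for a single offset; since the keyword list mentions co-occurrence texture statistics more generally, averaging over several offsets and angles would be a natural extension, though that choice is an assumption rather than something stated in the record.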
Pages: 16