Separability-based multiscale basis selection and feature extraction for signal and image classification

Cited by: 62
Authors
Etemad, K [1 ]
Chellappa, R
Affiliations
[1] Hughes Network Syst Inc, Germantown, MD 20876 USA
[2] Univ Maryland, Dept Elect Engn, College Pk, MD 20742 USA
[3] Univ Maryland, Ctr Automat Res, College Pk, MD 20742 USA
Keywords
basis selection; dimensionality reduction; document images; radar signatures; segmentation; separability; textures; wavelet packets
DOI
10.1109/83.718485
Chinese Library Classification (CLC)
TP18 [Theory of Artificial Intelligence]
Subject classification codes
081104; 0812; 0835; 1405
Abstract
Algorithms for multiscale basis selection and feature extraction for pattern classification problems are presented. The basis selection algorithm is based on class separability measures rather than energy or entropy. At each level, the "accumulated" tree-structured class separabilities obtained from the tree that includes a parent node and the tree that includes its children are compared. The decomposition of the node (or subband) is performed (creating the children) if it provides a larger combined separability. The suggested feature extraction algorithm focuses on dimensionality reduction of a multiscale feature space subject to maximum preservation of information useful for classification. At each level of decomposition, an optimal linear transform that preserves class separability and results in a reduced-dimensional feature space is obtained. Classification and feature extraction are then performed at each scale, and the resulting "soft decisions" obtained for each area are integrated across scales. The suggested algorithms have been tested for classification and segmentation of one-dimensional (1-D) radar signals and two-dimensional (2-D) texture and document images. The same idea can be used for other tree-structured local bases, e.g., local trigonometric basis functions, and even for nonorthogonal, redundant, and composite basis dictionaries.
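The basis selection rule described in the abstract can be illustrated with a short sketch. The Python code below is not the authors' implementation; it applies the split-or-keep decision to a set of labeled 1-D signals, splitting a subband into its two children only when the children's class separability exceeds the parent's. The Haar analysis filters, the two-class Fisher ratio of log subband energies, and taking the better of the two children as the "combined" child separability are simplifying assumptions made for illustration, standing in for the paper's accumulated, tree-wide separability measure.

import numpy as np


def haar_split(x):
    # Split a 1-D signal into Haar low-pass (approximation) and
    # high-pass (detail) halves, each half the original length.
    x = x[: len(x) // 2 * 2]                   # truncate to even length
    a = (x[0::2] + x[1::2]) / np.sqrt(2.0)
    d = (x[0::2] - x[1::2]) / np.sqrt(2.0)
    return a, d


def fisher_separability(signals, labels):
    # Two-class Fisher ratio of log subband energies:
    # (mean_0 - mean_1)^2 / (var_0 + var_1).
    feats = np.array([np.log(np.sum(s ** 2) + 1e-12) for s in signals])
    g0, g1 = feats[labels == 0], feats[labels == 1]
    return (g0.mean() - g1.mean()) ** 2 / (g0.var() + g1.var() + 1e-12)


def select_basis(signals, labels, depth=3, path="root"):
    # Recursively decide whether splitting a subband improves separability.
    # Returns (path, separability) pairs for the retained leaf subbands.
    parent_sep = fisher_separability(signals, labels)
    if depth == 0 or len(signals[0]) < 4:
        return [(path, parent_sep)]

    approx = [haar_split(s)[0] for s in signals]
    detail = [haar_split(s)[1] for s in signals]
    # "Combined" child separability taken as the better of the two children
    # (an illustrative choice, not the paper's accumulated measure).
    child_sep = max(fisher_separability(approx, labels),
                    fisher_separability(detail, labels))

    if child_sep > parent_sep:                 # children separate better: split
        return (select_basis(approx, labels, depth - 1, path + "/a")
                + select_basis(detail, labels, depth - 1, path + "/d"))
    return [(path, parent_sep)]                # otherwise keep the parent as a leaf


if __name__ == "__main__":
    rng = np.random.default_rng(0)
    t = np.arange(128)
    # Two toy classes differing mainly in frequency content.
    class0 = [np.sin(2 * np.pi * 4 * t / 128) + 0.1 * rng.standard_normal(128)
              for _ in range(20)]
    class1 = [np.sin(2 * np.pi * 40 * t / 128) + 0.1 * rng.standard_normal(128)
              for _ in range(20)]
    labels = np.array([0] * 20 + [1] * 20)
    for node, sep in select_basis(class0 + class1, labels):
        print(f"kept subband {node:>10s}  separability = {sep:.2f}")

The same recursion extends to 2-D wavelet-packet trees by splitting along rows and columns, and the Fisher ratio can be swapped for any multiclass separability measure; the separability-preserving linear transform the paper uses for feature reduction is not reproduced here.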
Pages: 1453-1465
Number of pages: 13
Related papers
50 records in total
  • [31] Maximum Relevance and Class Separability for Hyperspectral Feature Selection and Classification
    Jahanshahi, Saeed
    2016 IEEE 10TH INTERNATIONAL CONFERENCE ON APPLICATION OF INFORMATION AND COMMUNICATION TECHNOLOGIES (AICT), 2016, : 202 - 205
  • [32] Feature Selection Based on Relaxed Linear Separability
    Bobrowski, Leon
    Lukaszuk, Tomasz
    BIOCYBERNETICS AND BIOMEDICAL ENGINEERING, 2009, 29 (02) : 43 - 58
  • [33] Feature selection based on the circle window in image classification
    Zhang, Xiang
    Xiao, Xiaoling
    DCABES 2007 Proceedings, Vols I and II, 2007, : 1072 - 1074
  • [34] Iris feature extraction and matching based on multiscale and directional image representation
    Park, CH
    Lee, JJ
    Oh, SK
    Song, YC
    Choi, DH
    Park, KH
    SCALE SPACE METHODS IN COMPUTER VISION, PROCEEDINGS, 2003, 2695 : 576 - 583
  • [35] A clustering-based feature selection via feature separability
    Jiang, Shengyi
    Wang, Lianxi
    JOURNAL OF INTELLIGENT & FUZZY SYSTEMS, 2016, 31 (02) : 927 - 937
  • [36] Adaptive Spectral-Spatial Multiscale Contextual Feature Extraction for Hyperspectral Image Classification
    Wang, Di
    Du, Bo
    Zhang, Liangpei
    Xu, Yonghao
    IEEE TRANSACTIONS ON GEOSCIENCE AND REMOTE SENSING, 2021, 59 (03): 2461 - 2477
  • [37] A Novel Multiscale Attention Feature Extraction Block for Aerial Remote Sensing Image Classification
    Sitaula, C.
    Aryal, J.
    Bhattacharya, A.
    IEEE Geoscience and Remote Sensing Letters, 2023, 20
  • [38] On local feature extraction for signal classification
    Saito, N.
    Coifman, R.R.
    Zeitschrift für Angewandte Mathematik und Mechanik (ZAMM) / Applied Mathematics and Mechanics, 76 (Suppl 2)
  • [39] FEATURE EXTRACTION AND CLASSIFICATION OF EEG SIGNAL
    Padhy, Prabin Kumar
    Kumar, Avinash
    Chandra, Vivek
    Thumula, Kalyan Rao
    2011 INTERNATIONAL CONFERENCE ON MECHANICAL ENGINEERING AND TECHNOLOGY (ICMET 2011), 2011, : 237 - 240
  • [40] Consecutive multiscale feature learning-based image classification model
    Bekhzod Olimov
    Barathi Subramanian
    Rakhmonov Akhrorjon Akhmadjon Ugli
    Jea-Soo Kim
    Jeonghong Kim
    Scientific Reports, 13 (1)