Modulation Recognition of Digital Multimedia Signal Based on Data Feature Selection

Hui Wang, Li Li Guo, Yun Lin
DOI: 10.4018/IJMCMC.2017070107

Abstract

Automatic modulation recognition is very important for receiver design in broadband multimedia communication systems, and reasonable signal feature extraction and selection algorithms are the key technology for digital multimedia signal recognition. In this paper, information entropy is used to extract single features, namely power spectrum entropy, wavelet energy spectrum entropy, singular spectrum entropy, and Renyi entropy. Then, feature selection algorithms based on a distance measure and Sequential Feature Selection (SFS) are presented to select the optimal feature subset. Finally, a BP neural network is used to classify the signal modulation. The simulation results show that the four different information entropies can be used to classify different signal modulations, and that the feature selection algorithms successfully choose the optimal feature subset and achieve the best performance.
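To make the feature-extraction step concrete, the following is a minimal sketch of one of the four features, power spectrum entropy, computed as the Shannon entropy of a signal's normalized power spectrum. The function name and the noisy BPSK-like test signal are illustrative assumptions, not taken from the paper, whose exact definitions may differ.

```python
import numpy as np

def power_spectrum_entropy(x, eps=1e-12):
    """Shannon entropy of the normalized power spectrum of x (sketch only)."""
    spectrum = np.abs(np.fft.fft(x)) ** 2   # power spectrum via FFT
    p = spectrum / (spectrum.sum() + eps)   # normalize to a probability distribution
    return -np.sum(p * np.log2(p + eps))    # Shannon entropy in bits

# Illustrative usage on a hypothetical noisy BPSK-like signal
rng = np.random.default_rng(0)
symbols = rng.choice([-1.0, 1.0], size=1024)
signal = symbols + 0.1 * rng.standard_normal(1024)
print(power_spectrum_entropy(signal))
```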

Introduction

In recent years, automatic modulation recognition has been an important research field in signal analysis and processing for many communication applications. Traditionally, automatic modulation recognition has focused on non-cooperative communication systems, such as signal monitoring and military communication. However, with the recent development of broadband multimedia communication and broadcasting systems, the potential applications of automatic modulation recognition have grown to include adaptive transmission system design and power conservation.

A feature is a measurable characteristic of the signal being observed. Given a set of features, any classifier can be used to recognize the modulation type of a signal. Over the past thirty years, in pattern recognition applications, the number of features used in modulation signal recognition has expanded from tens to hundreds. Redundant and irrelevant features are a burden on modulation recognition; therefore, several techniques have been developed to reduce the feature dimension, and they are collectively referred to as feature selection.

Feature selection selects a subset of features from the input set that can efficiently describe the input data while reducing irrelevant and redundant variables. It is therefore useful for understanding the signal, improving recognition performance, reducing the computational requirements, and mitigating the curse of dimensionality. In modulation recognition applications, the input data can contain hundreds of variables, many of which are highly correlated with one another. The dependent variables provide no extra information about the classes and thus act as interference for the classifier. This means that the total information content can be obtained from a smaller set of unique features that carries the maximum discriminative information about the classes. Therefore, by eliminating the dependent variables, the number of features can be reduced and the recognition performance can be improved. In other words, applying feature selection provides insight into the recognition process while further reducing the computational requirements and improving the recognition accuracy.
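As an illustration of how eliminating dependent variables works in practice, the sketch below greedily drops any feature that is highly correlated with a feature already kept. This is a generic redundancy filter, not the distance-measure or SFS algorithm the paper presents; the function name and threshold are assumptions made for illustration.

```python
import numpy as np

def drop_correlated_features(X, threshold=0.95):
    """Keep features greedily; drop any whose |correlation| with an
    already-kept feature exceeds threshold. X: (n_samples, n_features).
    Sketch only."""
    corr = np.abs(np.corrcoef(X, rowvar=False))  # feature-feature |correlation|
    kept = []
    for j in range(X.shape[1]):
        if all(corr[j, k] < threshold for k in kept):
            kept.append(j)
    return kept  # indices of the retained features
```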

To eliminate an irrelevant or redundant feature, a feature selection method is required that can calculate the relevance of each feature to the output class. The process is shown in Figure 1. Removing irrelevant and redundant features is not the same as dimension-reduction methods such as Principal Component Analysis (PCA), because a good feature can be independent of the rest of the data (Guyon & Elisseeff, 2003; Guyon et al., 2002), and feature elimination does not create new features. Once a feature selection method is chosen, a procedure must be used to find the subset of useful features. In the work of Chandrashekar and Sahin (2014), feature selection methods are classified into filter and wrapper methods. Filter methods act as a pre-processing step that ranks the features, and the highly ranked features are selected and passed to a classifier; useful filter methods include the correlation criterion and mutual information (Kwak & Choi, 2002; Lazar et al., 2012). Wrapper methods use the classifier within a search algorithm to find the subset that achieves the highest recognition performance; useful wrapper methods include Sequential Feature Selection (SFS), Sequential Backward Selection (SBS), Sequential Floating Forward Selection (SFFS), and the Genetic Algorithm (GA) (Pudil et al., 1994; Reunanen, 2003; Xue et al., 2016). Embedded methods treat feature selection as part of the training process without splitting the data into training and testing sets (Langley, 2014; Sheikhpour et al., 2017). Ensemble feature selection is a newer technique for obtaining a stable feature subset (Abeel, Helleputte, de Peer, Dupont, & Saeys, 2010): a single feature selection algorithm is applied to different subsets of the data samples obtained by bootstrapping, and the results are aggregated into a final feature set. Abeel et al. (2010) use filter methods to rank the features and use different aggregation methods, such as ensemble mean, weighted aggregation, and linear aggregation, to obtain the final feature subset.
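Since SFS is the wrapper method the paper adopts, the following is a minimal sketch of the generic sequential forward selection loop: greedily add the feature that most improves a classifier score until the desired subset size is reached. The score_fn callback (for example, cross-validated accuracy of the BP network) and all names here are illustrative assumptions rather than the paper's exact procedure.

```python
import numpy as np

def sequential_forward_selection(X, y, score_fn, n_select):
    """Generic wrapper-style SFS (sketch): score_fn(X_sub, y) -> float."""
    selected, remaining = [], list(range(X.shape[1]))
    while len(selected) < n_select and remaining:
        best_j, best_score = None, -np.inf
        for j in remaining:                           # try each candidate feature
            score = score_fn(X[:, selected + [j]], y)
            if score > best_score:
                best_j, best_score = j, score
        selected.append(best_j)                       # keep the best addition
        remaining.remove(best_j)
    return selected
```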
