Principal Component Extraction and Its Feature Selection
Post on 30-May-2018
-
8/14/2019 Principal Component Extraction and Its Feature Selection
1/13
Principal Component Extraction
and Its Feature Selection for ECG Beats
Student: M. Y. Li
Advisor: S. N. Yu
Date: 2008/10/24
-
Outline
Introduction
Principal component extraction
Feature selection: Fisher linear discriminant, correlation coefficients, selection concept
Experiment results
Conclusions
-
Introduction
The ECG is noninvasive in nature and valuable in the diagnosis of heart diseases.
Heart diseases (e.g., arrhythmias) carry a high mortality rate, so faithful detection and classification of ECG beats are important.
In recent years, many algorithms have been developed for the detection and classification of ECG signals.
Feature extraction: PCA.
Feature selection: entropy selection, Fisher linear discriminant.
-
Principal Component Extraction
PCA can be used for dimensionality reduction in a data set by keeping only the lower-order principal components; these low-order components often contain the "most important" aspects of the data.
The PCA algorithm:
Zero-mean the data.
Eigendecompose the covariance matrix.
The eigenvectors with the largest eigenvalues correspond to the dimensions that have the strongest correlation in the data set.
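The steps above can be sketched in code. This is a minimal illustration with NumPy, not the authors' implementation; the toy array `X` and component count `m` are made up for demonstration:

```python
import numpy as np

def pca_extract(X, m):
    """Project the rows of X onto the m principal directions."""
    # Step 1: zero-mean the data.
    Xc = X - X.mean(axis=0)
    # Step 2: eigendecompose the covariance matrix.
    eigvals, eigvecs = np.linalg.eigh(np.cov(Xc, rowvar=False))
    # Step 3: keep the m eigenvectors with the largest eigenvalues --
    # the directions of strongest correlation in the data.
    order = np.argsort(eigvals)[::-1][:m]
    # Step 4: project the centered data onto those directions.
    return Xc @ eigvecs[:, order]

# Toy data that varies mostly along one diagonal direction.
X = np.array([[2.0, 1.9], [0.0, 0.1], [1.0, 1.1], [3.0, 2.9]])
Z = pca_extract(X, 1)  # each sample reduced to a single coefficient
print(Z.shape)         # (4, 1)
```

In the ECG setting, each row of `X` would be one beat, and `m` the number of retained components.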
-
Principal Component Extraction
Properties and limitations:
PCA assumes linearity, and assumes the mean and covariance are the statistically important quantities.
Because it works from the eigenvectors of the covariance matrix, PCA only finds the independent axes of the data under the Gaussian assumption.
When PCA is used for clustering, it does not account for class separability, since it makes no use of the class labels of the feature vectors.
-
Feature Selection (Fisher Linear Discriminant)
The Fisher discriminability of feature k is

$$S_k = \frac{S_B}{S_W}, \qquad S_B = \sum_{i=1}^{c} n_i \left(\bar{f}_{ki} - \bar{f}_k\right)^2, \qquad S_W = \sum_{i=1}^{c} \sum_{f_{ki} \in D_i} \left(f_{ki} - \bar{f}_{ki}\right)^2$$

where $n_i$ is the number of features in class $i$, $D_i$ is the feature set associated with class $i$, and $\bar{f}_{ki}$ and $\bar{f}_k$ are the mean of the $k$-th feature in class $D_i$ and in the entire feature set, respectively.
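As a sketch, the score $S_k$ for a single feature could be computed like this (plain Python; the feature values and class labels below are made-up illustrations, not the ECG data):

```python
def fisher_score(values, labels):
    """Fisher discriminability S = S_B / S_W for one feature."""
    overall = sum(values) / len(values)
    s_b = s_w = 0.0
    for c in set(labels):
        xs = [v for v, l in zip(values, labels) if l == c]
        m = sum(xs) / len(xs)
        s_b += len(xs) * (m - overall) ** 2   # between-class scatter
        s_w += sum((x - m) ** 2 for x in xs)  # within-class scatter
    return s_b / s_w

print(fisher_score([1.0, 1.1, 5.0, 5.1], [0, 0, 1, 1]))  # large: classes separate well
print(fisher_score([1.0, 5.0, 1.1, 5.1], [0, 0, 1, 1]))  # small: classes overlap
```

Features with large scores separate the classes well; ranking features by this score is the FLD stage of the FLD+CC selection.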
-
Correlation coefficients
The correlation coefficient (CC) is used to evaluate the dependency between two random variables:

$$\rho_{kl} = \frac{\sigma_{kl}}{\sigma_k \sigma_l}$$

where $\sigma_{kl}$ is the covariance, and $\sigma_k$ and $\sigma_l$ are the standard deviations of the $k$-th and $l$-th features, respectively.
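A minimal sketch of this computation in plain Python (the sample feature vectors are hypothetical):

```python
import math

def corr_coef(f_k, f_l):
    """Pearson correlation coefficient between two feature vectors."""
    n = len(f_k)
    mk, ml = sum(f_k) / n, sum(f_l) / n
    cov = sum((a - mk) * (b - ml) for a, b in zip(f_k, f_l)) / n
    sk = math.sqrt(sum((a - mk) ** 2 for a in f_k) / n)  # std of feature k
    sl = math.sqrt(sum((b - ml) ** 2 for b in f_l) / n)  # std of feature l
    return cov / (sk * sl)

print(corr_coef([1.0, 2.0, 3.0, 4.0], [2.0, 4.0, 6.0, 8.0]))  # ~1.0: redundant pair
print(corr_coef([1.0, 2.0, 3.0, 4.0], [3.0, 1.0, 4.0, 2.0]))  # ~0.0: uncorrelated pair
```

In feature selection, a pair of features with |CC| near 1 is redundant, so one of the two can be dropped.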
-
Experiment results: methods compared
Methods: ICA (Chu), PCA, ICA+FLD+CC, PCA+FLD+CC.
Database: MIT-BIH arrhythmia database, lead II.
Feature order: m = 1 to m = 17.
Beat types: Normal, LBBB, RBBB, PB, PVC, APB, VFW, VEB.
Beats: 4900 for training, 4900 for testing.
-
Experiment results
Overall accuracy (%) vs. number of retained components m:

Method        m=1     m=4     m=9     m=13    m=15    m=17
ICA           52.59   82.85   94.52   97.26   97.80   98.19
PCA           59.61   92.34   98.02   98.77   98.77   98.73
ICA+FLD+CC    55.53   66.33   77.31   85.95   88.41   90.40
PCA+FLD+CC    59.61   91.98   97.89   98.34   98.59   98.73
-
Experiment results
Per-disease accuracy (%) for m = 4, 9, and 17:

PB:
Method        m=4     m=9     m=17
ICA           98.90   99.65   99.85
PCA           97.00   100.00  100.00
ICA+FLD+CC    55.47   82.95   99.37
PCA+FLD+CC    98.50   99.75   100.00

VFW:
Method        m=4     m=9     m=17
ICA           68.47   86.99   90.63
PCA           79.23   84.74   87.71
ICA+FLD+CC    40.80   59.23   78.43
PCA+FLD+CC    73.72   86.86   88.13
-
Experiment results
VEB:
Method        m=4     m=9     m=17
ICA           5.57    78.46   91.73
PCA           0.00    92.30   94.23
ICA+FLD+CC    0.00    5.385   32.69
PCA+FLD+CC    80.76   92.30   94.23
-
Conclusions