TY - GEN
T1 - Non-parametric bounds on the nearest neighbor classification accuracy based on the Henze-Penrose metric
AU - Ghanem, Sally
AU - Skau, Erik
AU - Krim, Hamid
AU - Clouse, Hamilton Scott
AU - Sakla, Wesam
N1 - Publisher Copyright:
© 2016 IEEE.
PY - 2016/8/3
Y1 - 2016/8/3
N2 - Analysis procedures for high-dimensional data are generally computationally costly, which justifies the strong research interest in the area. Entropy-based divergence measures have proven effective in many areas of computer vision and pattern recognition. However, the complexity of their implementation may be prohibitive in resource-limited applications, as they require estimates of probability densities, which are very difficult to compute directly for high-dimensional data. In this paper, we investigate the use of a non-parametric, distribution-free metric, known as the Henze-Penrose test statistic, to estimate the divergence between different classes of vehicles. In this regard, we apply several common feature extraction techniques to further characterize the distributional separation relative to the original data. Moreover, we employ the Henze-Penrose metric to obtain bounds on the Nearest Neighbor (NN) classification accuracy. Simulation results demonstrate the effectiveness and reliability of this metric in estimating inter-class separability. In addition, the proposed bounds are exploited to select the smallest number of features that retains sufficient discriminative information.
AB - Analysis procedures for high-dimensional data are generally computationally costly, which justifies the strong research interest in the area. Entropy-based divergence measures have proven effective in many areas of computer vision and pattern recognition. However, the complexity of their implementation may be prohibitive in resource-limited applications, as they require estimates of probability densities, which are very difficult to compute directly for high-dimensional data. In this paper, we investigate the use of a non-parametric, distribution-free metric, known as the Henze-Penrose test statistic, to estimate the divergence between different classes of vehicles. In this regard, we apply several common feature extraction techniques to further characterize the distributional separation relative to the original data. Moreover, we employ the Henze-Penrose metric to obtain bounds on the Nearest Neighbor (NN) classification accuracy. Simulation results demonstrate the effectiveness and reliability of this metric in estimating inter-class separability. In addition, the proposed bounds are exploited to select the smallest number of features that retains sufficient discriminative information.
KW - Classification
KW - Dimensionality reduction
KW - Divergence measures
KW - Nearest neighbor graph
KW - Pattern recognition
UR - http://www.scopus.com/inward/record.url?scp=85006833165&partnerID=8YFLogxK
U2 - 10.1109/ICIP.2016.7532581
DO - 10.1109/ICIP.2016.7532581
M3 - Conference contribution
AN - SCOPUS:85006833165
T3 - Proceedings - International Conference on Image Processing, ICIP
SP - 1364
EP - 1368
BT - 2016 IEEE International Conference on Image Processing, ICIP 2016 - Proceedings
PB - IEEE Computer Society
T2 - 23rd IEEE International Conference on Image Processing, ICIP 2016
Y2 - 25 September 2016 through 28 September 2016
ER -