TY - CONF
T1 - Nearest neighbor projective fuser for function estimation
AU - Rao, Nageswara S. V.
PY - 2002
Y1 - 2002
AB - There is currently a wide choice of function estimators, and it is often more effective and practical to fuse them rather than choosing a "best" one. An optimal projective fuser was proposed earlier based on the lower envelope of error regressions of the estimators. In most practical cases, however, the error regressions are not available and only a finite sample is given. Consequently, this optimal fuser is hard to implement and, furthermore, guarantees only asymptotic consistency. In this paper, we propose a projective fuser based on the nearest neighbor concept, which is easy to implement. Under fairly general smoothness and non-smoothness conditions on the individual estimators, we show that this fuser's expected error is close to optimal with high probability, for a finite sample and irrespective of the underlying distributions. This performance guarantee is stronger than previous ones for projective fusers and also implies asymptotic consistency. The required smoothness condition, namely Lipschitz continuity, is satisfied by sigmoid neural networks and certain radial-basis functions. The non-smoothness condition requires bounded variation, which is satisfied by k-nearest neighbor, regressogram, regression tree, Nadaraya-Watson, and feedforward threshold network estimators.
KW - Finite-sample guarantees
KW - Nearest neighbor
KW - Projective fusers
KW - Sensor fusion
UR - http://www.scopus.com/inward/record.url?scp=84901395943&partnerID=8YFLogxK
U2 - 10.1109/ICIF.2002.1020943
DO - 10.1109/ICIF.2002.1020943
M3 - Paper
AN - SCOPUS:84901395943
SP - 1154
EP - 1161
T2 - 5th International Conference on Information Fusion, FUSION 2002
Y2 - 8 July 2002 through 11 July 2002
ER -