Abstract
We address the problem of estimating a function f : [0, 1]^d → [-L, L] using feedforward sigmoidal networks with a single hidden layer and bounded weights. The only information about the function is provided by an independent and identically distributed (i.i.d.) sample generated according to an unknown distribution. The quality of the estimate is quantified by the expected cost functional and depends on the sample size. We use Lipschitz properties of the cost functional and of the neural networks to derive the relationship between performance bounds and sample sizes within the framework of Valiant's probably approximately correct (PAC) learning.
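As an illustration of the setting described in the abstract, the sketch below fits a single-hidden-layer sigmoidal network with weights projected onto a bounded set to an i.i.d. sample, and measures the empirical quadratic cost. All specifics (the target function, the bound B, the sampling distribution, the number of hidden units, and gradient descent as the fitting procedure) are illustrative assumptions, not details taken from the paper.

```python
import numpy as np

rng = np.random.default_rng(0)

# Hypothetical target f : [0, 1]^d -> [-L, L] with d = 1, L = 1 (illustrative choice).
def f(x):
    return np.sin(2 * np.pi * x[:, 0])

def sigmoid(z):
    return 1.0 / (1.0 + np.exp(-z))

class SigmoidalNet:
    """Single-hidden-layer sigmoidal network with all weights bounded by B."""

    def __init__(self, d, n_hidden, B, L):
        self.B, self.L = B, L
        self.W = rng.uniform(-B, B, size=(n_hidden, d))  # input-to-hidden weights
        self.b = rng.uniform(-B, B, size=n_hidden)       # hidden biases
        self.v = rng.uniform(-B, B, size=n_hidden)       # hidden-to-output weights

    def predict(self, X):
        out = sigmoid(X @ self.W.T + self.b) @ self.v
        return np.clip(out, -self.L, self.L)  # keep outputs in [-L, L]

    def fit(self, X, y, lr=0.1, epochs=2000):
        n = len(y)
        for _ in range(epochs):
            H = sigmoid(X @ self.W.T + self.b)
            r = H @ self.v - y                    # residuals on the sample
            # gradients of the empirical quadratic cost (up to a constant factor)
            gv = H.T @ r / n
            gz = np.outer(r, self.v) * H * (1 - H)  # gradient w.r.t. pre-activations
            gW = gz.T @ X / n
            gb = gz.sum(axis=0) / n
            self.v -= lr * gv
            self.W -= lr * gW
            self.b -= lr * gb
            # project each weight array back onto the bounded set [-B, B]
            for p in (self.v, self.W, self.b):
                np.clip(p, -self.B, self.B, out=p)

# i.i.d. sample from an unknown distribution (here: uniform on [0, 1])
X = rng.uniform(0.0, 1.0, size=(200, 1))
y = f(X)

net = SigmoidalNet(d=1, n_hidden=8, B=5.0, L=1.0)
cost_before = np.mean((net.predict(X) - y) ** 2)  # empirical cost at initialization
net.fit(X, y)
cost_after = np.mean((net.predict(X) - y) ** 2)   # empirical cost after training
```

The paper's bounds concern how the gap between this empirical cost and the expected cost shrinks with the sample size; the sketch only shows the estimator class and the empirical cost being minimized.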
| Original language | English |
| --- | --- |
| Pages (from-to) | 125-131 |
| Number of pages | 7 |
| Journal | Neural Processing Letters |
| Volume | 7 |
| Issue number | 3 |
| DOIs | |
| State | Published - 1998 |
Funding
This research was sponsored by the Engineering Research Program of the Office of Basic Energy Sciences of the U.S. Department of Energy, under Contract No. DE-AC05-96OR22464 with Lockheed Martin Energy Research Corp.
Keywords
- Feedforward sigmoid networks
- Function estimation
- PAC learning