Simple sample bound for feedforward sigmoid networks with bounded weights

Research output: Contribution to journal › Article › peer-review

12 Scopus citations

Abstract

We consider learning a function f: [0, 1]^d → [0, 1] using feedforward sigmoid networks with a single hidden layer and bounded weights. Using the Lipschitz property of such networks, we derive a simple sample size bound within the framework of probably approximately correct (PAC) learning. The sample size is linear in the number of network parameters and logarithmic in the bound on the weights. This brief note thus exhibits a non-trivial function estimation problem, solved with sigmoid neural networks, for which the sample size grows only linearly in the number of network parameters.
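For illustration only, a generic PAC covering-number bound of this shape (an assumed generic form, not the paper's exact statement) reads: for a network with W parameters, each weight bounded in magnitude by B, target accuracy ε, and confidence 1 − δ, a sample size of

    m = O( (1/ε²) · ( W · ln(W·B/ε) + ln(1/δ) ) )

suffices. The leading term is linear in W and only logarithmic in B, matching the scaling described in the abstract.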

Original language: English
Pages (from-to): 115-122
Number of pages: 8
Journal: Neurocomputing
Volume: 29
Issue number: 1-3
DOIs
State: Published - Nov 1999

Funding

The critical and helpful comments of the anonymous reviewers, which improved the perspective and presentation of this paper, are gratefully acknowledged. This research is sponsored by the Engineering Research Program of the Office of Basic Energy Sciences, of the US Department of Energy, under Contract No. DE-AC05-96OR22464 with Lockheed Martin Energy Research Corp., the Seed Money Program of Oak Ridge National Laboratory, and the Office of Naval Research under order N00014-96-F-0415.

Funders and funder numbers:

• Lockheed Martin Energy Research Corp.
• Office of Basic Energy Sciences
• US Department of Energy (DE-AC05-96OR22464)
• Office of Naval Research (N00014-96-F-0415)
• Oak Ridge National Laboratory

Keywords

• Neural networks
• Probably approximately correct
• Sample size
• Sigmoid feedforward networks
