Library

  • 1
    Electronic Resource
    Springer
    Mathematics of Control, Signals, and Systems 12 (1999), pp. 62-79
    ISSN: 1435-568X
    Keywords: Recurrent neural networks; folding networks; computational learning theory; PAC learning; VC dimension
    Source: Springer Online Journal Archives 1860-2000
    Topics: Electrical Engineering, Measurement and Control Technology; Mathematics; Technology
    Notes: Abstract. We establish several general results concerning PAC learning: we characterize the property that every consistent algorithm is PAC, show that the shrinking-width property is equivalent to PUAC learnability, and show by counterexample that PAC and PUAC learning are different concepts. We also give conditions ensuring that any nearly consistent algorithm is PAC or PUAC, respectively. The VC dimension of recurrent neural networks and folding networks is infinite; for restricted inputs, however, bounds exist, and these bounds are transferred to folding networks. We give conditions on the probability measure on the input space that ensure polynomial learnability: the probability of sequences or trees must converge to zero sufficiently fast with increasing length or height. Finally, we give an example of a concept class that requires exponentially growing sample sizes for accurate generalization.
    Type of Medium: Electronic Resource
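For readers skimming the abstract, the contrast it draws between polynomial and exponential sample sizes can be made concrete with the standard distribution-free sample-size bound in terms of the VC dimension. The sketch below is general background using Blumer et al.'s classic constants; it is not a formula taken from this paper, whose results concern restricted inputs and input distributions:

```python
from math import ceil, log2

def pac_sample_bound(vc_dim: int, epsilon: float, delta: float) -> int:
    """Sufficient sample size for distribution-free PAC learning of a
    concept class with finite VC dimension `vc_dim`, to accuracy `epsilon`
    with confidence 1 - `delta` (classic Blumer et al. bound).

    Illustrative background only; the paper's setting (infinite VC
    dimension, restricted inputs) is not covered by this bound.
    """
    return ceil((4 * log2(2 / delta) + 8 * vc_dim * log2(13 / epsilon)) / epsilon)

# The bound grows only polynomially in vc_dim, 1/epsilon, and 1/delta;
# the abstract's final result exhibits a concept class where the
# required sample size instead grows exponentially.
print(pac_sample_bound(5, 0.1, 0.05))
```

Note how the bound is monotone in the VC dimension: since recurrent and folding networks have infinite VC dimension on unrestricted inputs, no such finite bound applies there, which is why the paper works with input restrictions.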