
On the Learnability of Recursive Data

  • Published in: Mathematics of Control, Signals and Systems

Abstract.

We establish some general results concerning PAC learning: we characterize the property that every consistent algorithm is PAC, show that the shrinking width property is equivalent to PUAC learnability, and demonstrate by counterexample that PAC and PUAC learning are different concepts. We also find conditions ensuring that every nearly consistent algorithm is PAC or PUAC, respectively.

The VC dimension of recurrent neural networks and folding networks is infinite. For restricted inputs, however, bounds exist, and these bounds are transferred to folding networks.

We find conditions on the probability of the input space that ensure polynomial learnability: the probability of sequences or trees has to converge to zero sufficiently fast with increasing length or height, respectively.

Finally, we give an example of a concept class that requires exponentially growing sample sizes for accurate generalization.
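The abstract does not reproduce the formal definitions; as a sketch based on the standard notions (the notation, in particular the error measure d_P, the hypothesis map h_m, and the product measure P^m on m-samples, is assumed here rather than taken from the article), the difference between PAC and PUAC (probably uniformly approximately correct) learnability lies in where the quantification over the concept class \mathcal{F} is placed:

\text{PAC:} \quad \forall \varepsilon, \delta > 0 \;\exists m_0 \;\forall m \ge m_0 \;\forall f \in \mathcal{F}: \quad P^m\bigl(\{x : d_P(f, h_m(f, x)) > \varepsilon\}\bigr) < \delta

\text{PUAC:} \quad \forall \varepsilon, \delta > 0 \;\exists m_0 \;\forall m \ge m_0: \quad P^m\bigl(\{x : \sup_{f \in \mathcal{F}} d_P(f, h_m(f, x)) > \varepsilon\}\bigr) < \delta

In the PUAC criterion the supremum over \mathcal{F} sits inside the probability, so a single sample size must work for all concepts simultaneously; this stronger requirement is what the counterexample mentioned in the abstract separates from plain PAC learnability.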


Author information

B. Hammer

Additional information

Date received: September 5, 1997. Date revised: May 29, 1998.


Cite this article

Hammer, B. On the Learnability of Recursive Data. Math. Control Signals Systems 12, 62–79 (1999). https://doi.org/10.1007/PL00009845
