Library

  • 1
    Electronic Resource
    Springer
    Algorithmica 8 (1992), pp. 103-117
    ISSN: 1432-0541
    Keywords: Satisfiability ; Combinatorial optimization ; Stable roommates problem ; Approximation algorithm
    Source: Springer Online Journal Archives 1860-2000
    Topics: Computer Science , Mathematics
    Notes: Abstract Given a satisfiable Boolean formula in 2-CNF, it is NP-hard to find a satisfying assignment that contains a minimum number of true variables. A polynomial-time approximation algorithm is given that finds an assignment with at most twice as many true variables as necessary. The algorithm also works for a weighted generalization of the problem. An application to the optimal stable roommates problem is given in detail, and other applications are mentioned.
    Type of Medium: Electronic Resource
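The record above states the minimization problem only in words. As a concrete illustration of the objective being approximated (explicitly not the paper's polynomial-time 2-approximation), the following exhaustive-search sketch finds a satisfying assignment of a 2-CNF formula with the fewest true variables; the DIMACS-style clause encoding and the function name are assumptions made for this example.

```python
from itertools import product

def min_ones_2cnf(num_vars, clauses):
    """Exhaustively find a satisfying assignment with the fewest true variables.
    Exponential time; illustration only, not the paper's 2-approximation.

    clauses: list of 2-tuples of non-zero ints; literal k means variable abs(k)
    must be true if k > 0 and false if k < 0 (DIMACS-style encoding).
    """
    best = None
    for bits in product([False, True], repeat=num_vars):
        # A clause is satisfied if at least one of its two literals is true.
        if all(any(bits[abs(l) - 1] == (l > 0) for l in clause)
               for clause in clauses):
            if best is None or sum(bits) < sum(best):
                best = bits
    return best  # None if the formula is unsatisfiable

# Example: (x1 or x2) and (not x1 or x3); the optimum sets only x2 true.
print(min_ones_2cnf(3, [(1, 2), (-1, 3)]))
```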
  • 2
    Electronic Resource
    Springer
    Machine learning 19 (1995), pp. 183-208
    ISSN: 0885-6125
    Keywords: DNF formulas ; PAC learning ; concept learning ; computational learning theory ; greedy heuristic
    Source: Springer Online Journal Archives 1860-2000
    Topics: Computer Science
    Notes: Abstract We present two related results about the learnability of disjunctive normal form (DNF) formulas. First we show that a common approach for learning arbitrary DNF formulas requires exponential time. We then contrast this with a polynomial time algorithm for learning “most” (rather than all) DNF formulas. A natural approach for learning boolean functions involves greedily collecting the prime implicants of the hidden function. In a seminal paper of learning theory, Valiant demonstrated the efficacy of this approach for learning monotone DNF, and suggested this approach for learning DNF. Here we show that no algorithm using such an approach can learn DNF in polynomial time. We show this by constructing a counterexample DNF formula which would force such an algorithm to take exponential time. This counterexample seems to capture much of what makes DNF hard to learn, and thus is useful to consider when evaluating the run-time of a proposed DNF learning algorithm. This hardness result, as well as other hardness results for learning DNF, relies on the construction of particular hard-to-learn formulas, formulas that appear to be relatively rare. This raises the question of whether most DNF formulas are learnable. For certain natural definitions of “most DNF formulas,” we answer this question affirmatively.
    Type of Medium: Electronic Resource
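The greedy prime-implicant approach contrasted in the abstract above is easiest to see in the monotone case that Valiant showed tractable: take an uncovered positive example, flip its 1-bits to 0 while the hidden function stays true (each check is a membership query), and add the resulting minimal true point as a DNF term. A minimal sketch under assumed names (the oracle f, the example source, and the helper names are illustrative, not from the paper):

```python
def minimize_positive(x, f):
    """Greedily turn off bits of a positive example x while f stays true,
    yielding a minimal true point, i.e. a prime implicant of a monotone f."""
    x = list(x)
    for i in range(len(x)):
        if x[i]:
            x[i] = 0
            if not f(tuple(x)):   # membership query
                x[i] = 1          # this bit was necessary; restore it
    return tuple(x)

def term_covers(term, x):
    """A monotone term (minimal true point) covers x if x dominates it bitwise."""
    return all(xi >= ti for ti, xi in zip(term, x))

def learn_monotone_dnf(f, positives):
    """Collect prime implicants from a stream of positive examples,
    in the style of the greedy approach; 'positives' stands in for
    whatever example source the learner is given."""
    terms = []
    for x in positives:
        if not any(term_covers(t, x) for t in terms):
            terms.append(minimize_positive(x, f))
    return terms

# Example target: x1*x2 OR x3 over 3 variables.
f = lambda x: (x[0] and x[1]) or x[2]
print(learn_monotone_dnf(f, [(1, 1, 0), (1, 1, 1), (0, 0, 1)]))
```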
  • 3
    Electronic Resource
    Springer
    Machine learning 2 (1988), pp. 371-396
    ISSN: 0885-6125
    Keywords: Cluster analysis ; conceptual clustering ; analysis of algorithms
    Source: Springer Online Journal Archives 1860-2000
    Topics: Computer Science
    Notes: Abstract Research in cluster analysis has resulted in a large number of algorithms and similarity measurements for clustering scientific data. Machine learning researchers have published a number of methods for conceptual clustering, in which observations are grouped into clusters that have “good” descriptions in some language. In this paper we investigate the general properties that similarity metrics, objective functions, and concept description languages must have to guarantee that a (conceptual) clustering problem is polynomial-time solvable by a simple and widely used clustering technique, the agglomerative-hierarchical algorithm. We show that under fairly general conditions, the agglomerative-hierarchical method may be used to find an optimal solution in polynomial time.
    Type of Medium: Electronic Resource
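The agglomerative-hierarchical method named in the abstract above begins with singleton clusters and repeatedly merges the closest pair; whether this greedy process reaches an optimal clustering is exactly what the paper's conditions on the similarity measure and objective govern. A minimal single-link sketch (the distance function and the stop-at-k rule are assumptions for the example, not the paper's formulation):

```python
def single_link(a, b, dist):
    """Single-link distance between clusters: distance of the closest pair."""
    return min(dist(x, y) for x in a for y in b)

def agglomerative(points, dist, k):
    """Merge the two closest clusters until only k clusters remain."""
    clusters = [[p] for p in points]
    while len(clusters) > k:
        # Pick the pair of clusters with the smallest single-link distance.
        i, j = min(
            ((i, j) for i in range(len(clusters))
                    for j in range(i + 1, len(clusters))),
            key=lambda ij: single_link(clusters[ij[0]], clusters[ij[1]], dist),
        )
        clusters[i].extend(clusters.pop(j))
    return clusters

# Example: 1-D points with absolute-difference distance, stopping at 2 clusters.
print(agglomerative([1.0, 1.2, 5.0, 5.1], lambda x, y: abs(x - y), 2))
```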
  • 4
    Electronic Resource
    Springer
    Machine learning 19 (1995), pp. 183-208
    ISSN: 0885-6125
    Keywords: DNF formulas ; PAC learning ; concept learning ; computational learning theory ; greedy heuristic
    Source: Springer Online Journal Archives 1860-2000
    Topics: Computer Science
    Notes: Abstract We present two related results about the learnability of disjunctive normal form (DNF) formulas. First we show that a common approach for learning arbitrary DNF formulas requires exponential time. We then contrast this with a polynomial time algorithm for learning “most” (rather than all) DNF formulas. A natural approach for learning boolean functions involves greedily collecting the prime implicants of the hidden function. In a seminal paper of learning theory, Valiant demonstrated the efficacy of this approach for learning monotone DNF, and suggested this approach for learning DNF. Here we show that no algorithm using such an approach can learn DNF in polynomial time. We show this by constructing a counterexample DNF formula which would force such an algorithm to take exponential time. This counterexample seems to capture much of what makes DNF hard to learn, and thus is useful to consider when evaluating the run-time of a proposed DNF learning algorithm. This hardness result, as well as other hardness results for learning DNF, relies on the construction of particular hard-to-learn formulas, formulas that appear to be relatively rare. This raises the question of whether most DNF formulas are learnable. For certain natural definitions of “most DNF formulas,” we answer this question affirmatively.
    Type of Medium: Electronic Resource
  • 5
    Electronic Resource
    Springer
    Machine learning 2 (1988), pp. 371-396
    ISSN: 0885-6125
    Keywords: Cluster analysis ; conceptual clustering ; analysis of algorithms
    Source: Springer Online Journal Archives 1860-2000
    Topics: Computer Science
    Notes: Abstract Research in cluster analysis has resulted in a large number of algorithms and similarity measurements for clustering scientific data. Machine learning researchers have published a number of methods for conceptual clustering, in which observations are grouped into clusters that have “good” descriptions in some language. In this paper we investigate the general properties that similarity metrics, objective functions, and concept description languages must have to guarantee that a (conceptual) clustering problem is polynomial-time solvable by a simple and widely used clustering technique, the agglomerative-hierarchical algorithm. We show that under fairly general conditions, the agglomerative-hierarchical method may be used to find an optimal solution in polynomial time.
    Type of Medium: Electronic Resource
  • 6
    Electronic Resource
    Springer
    Machine learning 5 (1990), pp. 117-120
    ISSN: 0885-6125
    Source: Springer Online Journal Archives 1860-2000
    Topics: Computer Science
    Type of Medium: Electronic Resource
  • 7
    Electronic Resource
    Springer
    Machine learning 25 (1996), pp. 151-193
    ISSN: 0885-6125
    Keywords: Description logic ; polynomial-time learning ; Classic ; subsumption ; queries ; knowledge acquisition
    Source: Springer Online Journal Archives 1860-2000
    Topics: Computer Science
    Notes: Abstract Description logics, also called terminological logics, are commonly used in knowledge-based systems to describe objects and their relationships. We investigate the learnability of a typical description logic, Classic, and show that Classic sentences are learnable in polynomial time in the exact learning model using equivalence queries and membership queries (which are in essence, “subsumption queries”—we show a prediction hardness result for the more traditional membership queries that convey information about specific individuals). We show that membership queries alone are insufficient for polynomial time learning of Classic sentences. Combined with earlier negative results (Cohen & Hirsh, 1994a) showing that, given standard complexity theoretic assumptions, equivalence queries alone are insufficient (or random examples alone in the PAC setting are insufficient), this shows that both sources of information are necessary for efficient learning in that neither type alone is sufficient. In addition, we show that a modification of the algorithm deals robustly with persistent malicious two-sided classification noise in the membership queries with the probability of a misclassification bounded below 1/2. Other extensions are considered.
    Type of Medium: Electronic Resource
  • 8
    Electronic Resource
    Springer
    Machine learning 4 (1989), pp. 41-65
    ISSN: 0885-6125
    Keywords: concept learning ; classification ; pac-learning ; Boolean formulas ; polynomial-time identification
    Source: Springer Online Journal Archives 1860-2000
    Topics: Computer Science
    Notes: Abstract The distribution-independent model of (supervised) concept learning due to Valiant (1984) is extended to that of semi-supervised learning (ss-learning), in which a collection of disjoint concepts is to be simultaneously learned with only partial information concerning concept membership available to the learning algorithm. It is shown that many learnable concept classes are also ss-learnable. A new technique of learning, using an intermediate oracle, is introduced. Sufficient conditions for a collection of concept classes to be ss-learnable are given.
    Type of Medium: Electronic Resource
  • 9
    Electronic Resource
    Springer
    Machine learning 37 (1999), pp. 89-110
    ISSN: 0885-6125
    Keywords: dualization ; monotone CNF/DNF ; read-restricted formulas ; output polynomial-time algorithms ; incremental algorithms ; learning monotone formulas ; membership queries
    Source: Springer Online Journal Archives 1860-2000
    Topics: Computer Science
    Notes: Abstract We consider exact learning monotone CNF formulas in which each variable appears at most some constant k times (“read-k” monotone CNF). Let f : {0,1}^n → {0,1} be expressible as a read-k monotone CNF formula for some natural number k. We give an incremental output polynomial time algorithm for exact learning both the read-k CNF and (not necessarily read restricted) DNF descriptions of f. The algorithm's only method of obtaining information about f is through membership queries, i.e., by inquiring about the value f(x) for points x ∈ {0,1}^n. The algorithm yields an incremental polynomial output time solution to the (read-k) monotone CNF/DNF dualization problem. The unrestricted versions remain open problems of importance.
    Type of Medium: Electronic Resource
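The membership-query primitive underlying the abstract above can be illustrated in isolation: for a monotone function, greedily turning on bits of a negative example while the function stays false yields a maximal false point, and the variables still set to 0 in that point form one clause of the monotone CNF. A sketch of this building block under assumed names (not the paper's incremental output-polynomial algorithm):

```python
def maximize_negative(x, f):
    """Greedily turn on bits of a negative example x while f stays false.
    The result is a maximal false point of a monotone f."""
    x = list(x)
    for i in range(len(x)):
        if not x[i]:
            x[i] = 1
            if f(tuple(x)):       # membership query
                x[i] = 0          # turning this bit on satisfied f; keep it off
    return tuple(x)

def clause_from_maximal_false_point(p):
    """1-based indices of the variables forming the clause x_i1 OR ... OR x_im:
    any satisfying assignment must set some variable that is 0 in p."""
    return [i + 1 for i, bit in enumerate(p) if not bit]

# Example target: (x1 OR x2) AND x3 over 3 variables.
f = lambda x: (x[0] or x[1]) and x[2]
p = maximize_negative((0, 0, 0), f)
print(p, clause_from_maximal_false_point(p))   # recovers the clause x3
```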
  • 10
    Electronic Resource
    Springer
    Machine learning 4 (1989), pp. 41-65
    ISSN: 0885-6125
    Keywords: concept learning ; classification ; pac-learning ; Boolean formulas ; polynomial-time identification
    Source: Springer Online Journal Archives 1860-2000
    Topics: Computer Science
    Notes: Abstract The distribution-independent model of (supervised) concept learning due to Valiant (1984) is extended to that of semi-supervised learning (ss-learning), in which a collection of disjoint concepts is to be simultaneously learned with only partial information concerning concept membership available to the learning algorithm. It is shown that many learnable concept classes are also ss-learnable. A new technique of learning, using an intermediate oracle, is introduced. Sufficient conditions for a collection of concept classes to be ss-learnable are given.
    Type of Medium: Electronic Resource