Library

  • 1
Electronic Resource
    Springer
Mathematical Methods of Operations Research 32 (1988), pp. 71-93
    ISSN: 1432-5217
Keywords: value iteration method; undiscounted multichain Markov decision process; ε-optimal policy; decomposition into undiscounted Markov decision sub-processes; aggregated Markov decision process; optimal stopping problem; sufficient conditions for ε-optimality
    Source: Springer Online Journal Archives 1860-2000
Topics: Mathematics, Economics
Description / Table of Contents: Abstract (translated from the German): This paper proposes a value iteration method that yields an ε-optimal policy for an undiscounted non-irreducible Markov decision process (MDP) in finitely many steps. The undiscounted non-irreducible MDP is reduced to an aggregated MDP, which uses the maximal gains of undiscounted sub-MDPs and is formulated as an optimal stopping problem. As a preliminary, sufficient conditions for the ε-optimality of a policy are given.
Notes: Abstract: This paper proposes a value iteration method which finds an ε-optimal policy of an undiscounted multichain Markov decision process in a finite number of iterations. The undiscounted multichain Markov decision process is reduced to an aggregated Markov decision process, which utilizes maximal gains of undiscounted Markov decision sub-processes and is formulated as an optimal stopping problem. As a preliminary, sufficient conditions are presented under which a policy is ε-optimal. (A generic value-iteration sketch follows this record.)
    Type of Medium: Electronic Resource
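
For orientation only: the abstract above concerns value iteration for undiscounted multichain MDPs, where the paper's contribution is an aggregation and optimal-stopping construction. The sketch below shows only textbook value iteration on a toy discounted MDP, not the paper's algorithm; the function name, the toy transition and reward data, and the tolerance are assumptions of this sketch. The stopping rule used is the classical one that makes the resulting greedy policy ε-optimal in the discounted setting, which connects to the "ε-optimal policy" keyword of the record.

import numpy as np

def value_iteration(P, r, gamma=0.95, eps=1e-6, max_iter=10_000):
    """Textbook value iteration on a discounted MDP (a sketch, not
    the paper's undiscounted multichain method).

    P[a] is the |S| x |S| transition matrix of action a,
    r[a] is the length-|S| one-step reward vector of action a.
    """
    n_states = r[0].shape[0]
    v = np.zeros(n_states)
    for _ in range(max_iter):
        # Bellman backup: for each action, expected one-step return,
        # then maximize over actions.
        q = np.array([r[a] + gamma * (P[a] @ v) for a in range(len(P))])
        v_new = q.max(axis=0)
        # Classical stopping rule: once successive iterates differ by
        # less than eps*(1-gamma)/(2*gamma) in sup norm, the greedy
        # policy w.r.t. v_new is eps-optimal (discounted case).
        if np.max(np.abs(v_new - v)) < eps * (1 - gamma) / (2 * gamma):
            v = v_new
            break
        v = v_new
    return v, q.argmax(axis=0)

# Hypothetical toy chain with two states and two actions.
P = [np.array([[0.9, 0.1], [0.2, 0.8]]),   # action 0
     np.array([[0.5, 0.5], [0.0, 1.0]])]   # action 1
r = [np.array([1.0, 0.0]), np.array([0.0, 2.0])]
v, pi = value_iteration(P, r)
print("values:", v, "greedy policy:", pi)

The undiscounted multichain case treated in the paper is harder because gains can differ across recurrent classes, which is why the authors reduce the problem to an aggregated MDP formulated as an optimal stopping problem rather than iterating a single discounted Bellman operator as above.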