
Stochastic controls

Book rating

4.7 (3 ratings)

Details

  • 438 pages
  • 16 hours of reading

More about the book

Pontryagin's maximum principle and Bellman's dynamic programming are the two principal approaches to stochastic optimal control problems. The two methods developed largely independently, which raises the question of how they are related in this setting. Work on this question before the 1980s was often heuristic and relied on restrictive assumptions of limited applicability. The Pontryagin-type maximum principle involves an adjoint equation, an ordinary differential equation (ODE) in the deterministic case and a stochastic differential equation (SDE) in the stochastic case; together with the original state equation and the maximum condition, it forms what is known as an (extended) Hamiltonian system. Bellman's dynamic programming, by contrast, leads to a partial differential equation (PDE) for the value function, first-order in the deterministic case and second-order in the stochastic case, known as the Hamilton-Jacobi-Bellman (HJB) equation. Examining the two frameworks side by side clarifies their foundational roles in stochastic optimal control, despite their independent development in the literature.
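For orientation, here is a minimal sketch of the objects this description refers to, written in generic notation with one common sign convention; the notation and conventions need not match the book's. Consider a controlled diffusion dX_t = b(t, X_t, u_t) dt + \sigma(t, X_t, u_t) dW_t with cost J(u) = E[\int_0^T f(t, X_t, u_t) dt + h(X_T)] to be minimized.

% Hamiltonian (one common sign convention)
H(t, x, u, p, q) = \langle p, b(t, x, u) \rangle + \operatorname{tr}\!\big(q^{\top} \sigma(t, x, u)\big) - f(t, x, u)

% Adjoint equation of the maximum principle: a backward SDE for the pair (p_t, q_t)
dp_t = -H_x(t, X_t, u_t, p_t, q_t)\, dt + q_t\, dW_t, \qquad p_T = -h_x(X_T)

% Hamilton-Jacobi-Bellman equation for the value function V(t, x)
-\partial_t V(t, x) = \inf_{u \in U} \Big\{ \tfrac{1}{2} \operatorname{tr}\!\big(\sigma \sigma^{\top}(t, x, u)\, V_{xx}(t, x)\big) + \langle b(t, x, u), V_x(t, x) \rangle + f(t, x, u) \Big\}, \qquad V(T, x) = h(x)

The state equation, the adjoint backward SDE, and the condition that an optimal control maximize H form the (extended) Hamiltonian system; under smoothness assumptions the two approaches are formally linked by p_t = -V_x(t, X_t) along an optimal trajectory.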

Buy the book

Stochastic controls, Jiongmin Yong

Language
Year of publication
1999
Binding
Hardcover

