
Pontryagin's maximum principle and Bellman's dynamic programming are the two main approaches to solving stochastic optimal control problems. Interestingly, the two methods developed largely independently, which raises the question of how they are related in this context. Prior research on this relationship, before the 1980s, often stated results in heuristic terms and relied on restrictive assumptions that limited their applicability. The Pontryagin-type maximum principle involves an adjoint equation, which is an ordinary differential equation (ODE) in the deterministic case and a stochastic differential equation (SDE) in the stochastic case. The adjoint equation, the original state equation, and the maximum condition together form what is known as an (extended) Hamiltonian system. By contrast, Bellman's dynamic programming leads to a partial differential equation (PDE)—first-order in the deterministic case and second-order in the stochastic one—known as the Hamilton-Jacobi-Bellman (HJB) equation. Exploring these two frameworks reveals their foundational roles in understanding and solving stochastic optimal control problems, despite their independent evolution in the literature.
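To make the objects named above concrete, here is a sketch of the deterministic case in one common sign convention (the notation is generic, not necessarily the book's, and sign conventions for the Hamiltonian vary across the literature):

```latex
% Control problem (generic notation):
%   \dot x(t) = b(t, x(t), u(t)), \quad x(0) = x_0,
%   minimize  J(u) = \int_0^T f(t, x(t), u(t))\,dt + h(x(T)).

% Hamiltonian:
H(t, x, u, p) = \langle p,\, b(t, x, u) \rangle - f(t, x, u)

% Adjoint equation (an ODE here; an SDE in the stochastic case):
\dot p(t) = -H_x\big(t, \bar x(t), \bar u(t), p(t)\big),
\qquad p(T) = -h_x\big(\bar x(T)\big)

% Maximum condition along the optimal pair (\bar x, \bar u):
H\big(t, \bar x(t), \bar u(t), p(t)\big)
  = \max_{u}\, H\big(t, \bar x(t), u, p(t)\big)

% HJB equation for the value function V (first-order here; in the
% stochastic case a second-order trace term in V_{xx} appears):
V_t(t, x) + \inf_{u}\Big\{ \langle V_x(t, x),\, b(t, x, u) \rangle
  + f(t, x, u) \Big\} = 0,
\qquad V(T, x) = h(x)
```

The state equation, the adjoint equation, and the maximum condition together are the (extended) Hamiltonian system mentioned in the paragraph, while the last display is the HJB equation.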
Stochastic Controls, Jiongmin Yong
- Year of publication: 1999
- Binding: hardcover