Foundations Of Non Stationary Dynamic Programming With Discrete Time Parameter

Foundations of Non-stationary Dynamic Programming with Discrete Time Parameter

Author: K. Hinderer
Language: en
Publisher: Springer Science & Business Media
Release Date: 2012-12-06
The present work is an extended version of a manuscript for a course which the author taught at the University of Hamburg during the summer of 1969. The main purpose has been to give a rigorous foundation of stochastic dynamic programming in a manner which makes the theory easily applicable to many different practical problems. We mention the following features which should serve this purpose.
a) The theory is built up for non-stationary models, making it possible to treat, e.g., dynamic programming under risk, dynamic programming under uncertainty, Markovian models, stationary models, and models with finite horizon from a unified point of view.
b) We use the notion of optimality (p-optimality) which seems most appropriate for practical purposes.
c) Since we restrict ourselves to the foundations, we did not include practical problems and ways to their numerical solution, but we give (cf. Section 8) a number of problems which show the diversity of structures accessible to non-stationary dynamic programming.
The main sources were the papers of Blackwell (65), Strauch (66) and Maitra (68) on stationary models with general state and action spaces, and the papers of Dynkin (65), Hinderer (67) and Sirjaev (67) on non-stationary models. A number of results should be new, whereas most theorems constitute extensions (usually from stationary to non-stationary models) of, or analogues to, known results.
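To give a flavor of the finite-horizon, non-stationary setting the book formalizes, here is a minimal illustrative sketch (not taken from the book) of backward induction in which the reward and the transition law are allowed to depend on the stage n. The state space, action sets, and numerical data below are invented purely for the example.

```python
# Illustrative backward induction for a finite-horizon, non-stationary
# dynamic program: rewards r_n(s, a) and transition probabilities
# p_n(s' | s, a) may change with the stage n.  All data here are made up.

def backward_induction(states, actions, horizon, reward, transition):
    """Return value functions V_n and an optimal policy pi_n for each stage."""
    V = {horizon: {s: 0.0 for s in states}}      # terminal values
    policy = {}
    for n in reversed(range(horizon)):           # stages N-1, ..., 0
        V[n], policy[n] = {}, {}
        for s in states:
            best_a, best_val = None, float("-inf")
            for a in actions(n, s):
                q = reward(n, s, a) + sum(
                    transition(n, s, a, s2) * V[n + 1][s2] for s2 in states
                )
                if q > best_val:
                    best_a, best_val = a, q
            V[n][s], policy[n][s] = best_val, best_a
    return V, policy


# Tiny two-state example in which the cost of switching grows with the
# stage, so the optimal action may differ from stage to stage.
states = ["low", "high"]
actions = lambda n, s: ["stay", "switch"]
reward = lambda n, s, a: (1.0 if s == "high" else 0.0) - (0.1 * n if a == "switch" else 0.0)
transition = lambda n, s, a, s2: (
    1.0 if (a == "switch" and s2 != s) or (a == "stay" and s2 == s) else 0.0
)

V, policy = backward_induction(states, actions, horizon=3,
                               reward=reward, transition=transition)
print(V[0], policy[0])
```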
Control and System Theory of Discrete-Time Stochastic Systems

Author: Jan H. van Schuppen
Language: en
Publisher: Springer Nature
Release Date: 2021-08-02
This book helps students, researchers, and practicing engineers to understand the theoretical framework of control and system theory for discrete-time stochastic systems so that they can then apply its principles to their own stochastic control systems and to the solution of control, filtering, and realization problems for such systems. Applications of the theory in the book include the control of ships, shock absorbers, traffic and communications networks, and power systems with fluctuating power flows. The focus of the book is a stochastic control system defined for a spectrum of probability distributions including Bernoulli, finite, Poisson, beta, gamma, and Gaussian distributions. The concepts of observability and controllability of a stochastic control system are defined and characterized. Each output process considered is, subject to conditions, represented by a stochastic system called a stochastic realization. The existence of a control law is related to stochastic controllability, while the existence of a filter system is related to stochastic observability. Stochastic control with partial observations is based on the existence of a stochastic realization of the filtration of the observed process.
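As a concrete, if simplified, illustration of the kind of discrete-time Gaussian stochastic system and accompanying filter system the book studies, the sketch below simulates a scalar linear system with process and observation noise and runs a standard Kalman filter on the noisy observations. The model, its parameters, and the variable names are invented for this sketch and are not taken from the book.

```python
# Illustrative discrete-time Gaussian stochastic system with a Kalman
# filter acting as the "filter system" for the partially observed state.
# Scalar model x_{t+1} = a*x_t + w_t, y_t = c*x_t + v_t; all parameter
# values below are assumptions made for the example.

import random

a, c = 0.9, 1.0          # state and observation coefficients
q, r = 0.04, 0.25        # process and observation noise variances

def simulate_and_filter(steps=50, seed=0):
    rng = random.Random(seed)
    x = 0.0                       # true (hidden) state
    x_hat, p = 0.0, 1.0           # filter mean and error variance
    for t in range(steps):
        # system evolution and noisy observation
        x = a * x + rng.gauss(0.0, q ** 0.5)
        y = c * x + rng.gauss(0.0, r ** 0.5)
        # prediction step
        x_pred = a * x_hat
        p_pred = a * a * p + q
        # measurement update
        k = p_pred * c / (c * c * p_pred + r)   # Kalman gain
        x_hat = x_pred + k * (y - c * x_pred)
        p = (1.0 - k * c) * p_pred
    return x, x_hat, p

x_true, x_est, var = simulate_and_filter()
print(f"true state {x_true:.3f}, filtered estimate {x_est:.3f}, variance {var:.3f}")
```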