TITLE: Partially Observable Markov Decision Processes with General State and Action Spaces
SPEAKER: Professor Eugene Feinberg, Stony Brook University
ABSTRACT:
Partially Observable Markov Decision Processes (POMDPs) describe the control of stochastic systems whose current states are unknown and about which information is available only via indirect observations. They have a broad spectrum of applications in fields including operations research, electrical engineering, and computer science. In principle, a POMDP can be reduced to a Markov Decision Process (MDP) whose states are belief probabilities, that is, probability distributions on the set of all possible states. Such auxiliary MDPs are called Completely Observable MDPs (COMDPs).
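A minimal sketch of the belief update behind this reduction may help (the notation is ours and is written for finite state and observation sets; the talk treats general state and action spaces). If z is the current belief, a the chosen action, and y the resulting observation, the next belief z' is given by Bayes's formula:

\[
  z'(x') \;=\; \frac{Q(y \mid a, x') \sum_{x} P(x' \mid x, a)\, z(x)}
                    {\sum_{x''} Q(y \mid a, x'') \sum_{x} P(x'' \mid x, a)\, z(x)},
\]

where P is the state transition kernel and Q is the observation kernel. The COMDP takes the beliefs z as its states, and this update, weighted by the probability of each observation y, defines its transition probabilities.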
If an optimal policy is found for the COMDP, it is easy to construct an optimal policy for the POMDP. However, the existence of optimal policies for COMDPs (and therefore for POMDPs) and their characterizations via optimality equations have been studied in the literature mainly for problems with finite state sets. Except for a few cases, infinite-state problems have been considered only for particular applications. For expected total cost criteria, we describe natural sufficient conditions for the existence of optimal policies for infinite-state POMDPs and derive optimality equations. The results presented in this talk are based on recent progress in the analysis of infinite-state MDPs with weakly continuous transition probabilities, which was motivated by applications of MDPs to inventory control, and on new results on continuity properties of Bayes’s formula.
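As a rough illustration of the optimality equations in question (a sketch in our own notation, stated for the discounted case with finite sums; the talk covers general spaces and total-cost criteria): writing \varphi(z, a, y) for the belief update above, c(x, a) for the one-step cost, and \hat q(y \mid z, a) for the probability of observing y under belief z and action a, the COMDP value function v should satisfy

\[
  v(z) \;=\; \inf_{a \in A} \Big\{ \sum_{x} c(x, a)\, z(x)
        \;+\; \beta \sum_{y} \hat q(y \mid z, a)\, v\big(\varphi(z, a, y)\big) \Big\},
\]

with \beta \in (0, 1] the discount factor (\beta = 1 for expected total costs). Sufficient conditions of the kind described in the talk guarantee that the infimum is attained by some action, which yields an optimal policy.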
The talk is based on joint work with P.O. Kasyanov and M.Z. Zgurovsky.
Bio:
Eugene A. Feinberg received his Ph.D. in Probability and Statistics from Vilnius University, Lithuania, in 1979. Between 1976 and 1988 he held research and faculty positions in the Department of Applied Mathematics at Moscow University of Transportation. After holding a one-year visiting faculty position at Yale University in 1988–89, he joined Stony Brook University, where he is currently Distinguished Professor in the Department of Applied Mathematics and Statistics.
His research interests include stochastic models of Operations Research, Markov Decision Processes, and industrial applications of Operations Research and Statistics. Since 1999, he has been working on electric energy applications. He has published more than 130 papers and edited the Handbook of Markov Decision Processes. His research is partially supported by the National Science Foundation, the Department of Energy, the New York State Office of Science, Technology and Academic Research (NYSTAR), the New York State Energy Research and Development Authority (NYSERDA), and industry. He is a member of several editorial boards, including those of Mathematics of Operations Research, Operations Research Letters, and Applied Mathematics Letters. He has been awarded an Honorary Doctorate from the Institute of Applied System Analysis, National Technical University of Ukraine. Dr. Feinberg is a Fellow of INFORMS (The Institute for Operations Research and the Management Sciences), a recipient of the 2012 IEEE Charles Hirsh Award “For developing and implementing on Long Island electric load forecasting methods and smart grid technologies,” and a recipient of a 2012 IBM Faculty Award.