Markov Decision Processes, Fully and Partially Observed
This episode introduces Markov decision processes (MDPs), a model for sequential decision-making in which the next state of a system depends probabilistically on its current state and the action taken. It also discusses the difficulties that arise when the state of the system can only be partially observed, and how partially observable Markov decision processes (POMDPs) address them.
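As a concrete illustration of the model described above, here is a minimal sketch of an MDP solved by value iteration. The two-state "healthy"/"sick" example, the action names, and the transition probabilities are all hypothetical choices for this sketch, not taken from the episode:

```python
# Toy MDP (hypothetical example): P[state][action] is a list of
# (probability, next_state, reward) tuples describing what can happen
# when that action is taken in that state.
P = {
    "healthy": {
        "exercise": [(0.9, "healthy", 2.0), (0.1, "sick", 0.0)],
        "rest":     [(0.7, "healthy", 1.0), (0.3, "sick", 0.0)],
    },
    "sick": {
        "exercise": [(0.4, "healthy", 0.0), (0.6, "sick", -1.0)],
        "rest":     [(0.6, "healthy", 0.0), (0.4, "sick", -1.0)],
    },
}

def value_iteration(P, gamma=0.9, tol=1e-8):
    """Repeatedly apply the Bellman optimality update until values converge."""
    V = {s: 0.0 for s in P}
    while True:
        delta = 0.0
        for s in P:
            # Value of the best action: expected reward plus discounted
            # value of wherever the transition probabilities send us.
            best = max(
                sum(p * (r + gamma * V[s2]) for p, s2, r in P[s][a])
                for a in P[s]
            )
            delta = max(delta, abs(best - V[s]))
            V[s] = best
        if delta < tol:
            return V

V = value_iteration(P)
print(V)
```

The key assumption making this tractable is the Markov property: the transition distribution depends only on the current state and action, never on the history. A POMDP drops the assumption that the agent knows which state it is in, so the agent must instead maintain a probability distribution (a belief) over states.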