# Markov Chain

In a **Markov Chain** we have a state space and a transition operator. The Markov Chain is a tuple consisting of these two things:

$$M = \{S, T\}$$

Here $S$ is the state space (the set of states the system can be in) and $T$ is the transition operator, which gives the probability of the next state given the current one: $p(s_{t+1} \mid s_t)$. The Markov property means this probability depends only on the current state, not on any earlier history.

![](Screen%20Shot%202021-12-07%20at%207.16.36%20AM.png)

---
Date: 20211207
Links to: [Markov-Decision-Process](Markov-Decision-Process.md), [Probability MOC](Probability%20MOC.md)
Tags:
References:
* [From UC Berkeley RL Course](https://youtu.be/jds0Wh9jTvE?list=PL_iWQOsE6TfURIIhCrlt-wj9ByIVpbfGc&t=420)
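The tuple $M = \{S, T\}$ can be sketched in code. Below is a minimal illustration using a made-up three-state chain (the states and probabilities are hypothetical, not from the lecture): $S$ is a list of states, $T$ is a row-stochastic matrix, and sampling a trajectory just repeatedly draws $s_{t+1} \sim p(s_{t+1} \mid s_t)$.

```python
import random

# Hypothetical state space S (three made-up weather states).
S = ["sunny", "cloudy", "rainy"]

# Transition operator T: T[i][j] = p(next = S[j] | current = S[i]).
# Each row sums to 1 (a row-stochastic matrix).
T = [
    [0.7, 0.2, 0.1],
    [0.3, 0.4, 0.3],
    [0.2, 0.5, 0.3],
]

def rollout(start_idx, steps, seed=0):
    """Simulate a trajectory: repeatedly sample s_{t+1} from p(s_{t+1} | s_t)."""
    rng = random.Random(seed)
    traj = [start_idx]
    for _ in range(steps):
        # The next state depends only on the current state (Markov property).
        traj.append(rng.choices(range(len(S)), weights=T[traj[-1]])[0])
    return [S[i] for i in traj]
```

For example, `rollout(0, 10)` produces a length-11 trajectory of state names starting from `"sunny"`.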