# Markov Chain
In a **Markov Chain** we have a **state space** $S$ (the set of possible states) and a **transition operator** $T$, which gives the probability of moving to state $s'$ given the current state $s$: $T(s' \mid s) = p(s_{t+1} = s' \mid s_t = s)$. The defining **Markov property** is that this probability depends only on the current state, not on the earlier history. The Markov Chain is the tuple consisting of these two things:
$M = \{ S, T\}$
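
A minimal sketch of this definition: the transition operator $T$ becomes a row-stochastic matrix (each row is a distribution over next states), and running the chain is just repeated sampling from the current state's row. The three-state "weather" chain and its probabilities below are illustrative assumptions, not from the note.

```python
import random

# Hypothetical 3-state chain (states and probabilities are illustrative).
# T[i][j] = p(s_{t+1} = j | s_t = i), so each row sums to 1.
STATES = ["sunny", "cloudy", "rainy"]
T = [
    [0.7, 0.2, 0.1],  # from sunny
    [0.3, 0.4, 0.3],  # from cloudy
    [0.2, 0.4, 0.4],  # from rainy
]

def step(state_idx, rng):
    """Sample the next state index from the row T[state_idx]."""
    r = rng.random()
    cumulative = 0.0
    for j, p in enumerate(T[state_idx]):
        cumulative += p
        if r < cumulative:
            return j
    return len(T[state_idx]) - 1  # guard against floating-point drift

def simulate(n_steps, start_idx=0, seed=0):
    """Roll the chain forward n_steps; the next state depends only on
    the current state (the Markov property)."""
    rng = random.Random(seed)
    idx = start_idx
    trajectory = [STATES[idx]]
    for _ in range(n_steps):
        idx = step(idx, rng)
        trajectory.append(STATES[idx])
    return trajectory

print(simulate(5))
```

Note that `simulate` never looks at the trajectory so far when sampling the next state — that locality is exactly what makes $\{S, T\}$ a complete description of the process.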

---
Date: 20211207
Links to: [Markov-Decision-Process](Markov-Decision-Process.md) [Probability MOC](Probability%20MOC.md)
Tags:
References:
* [From UC Berkeley RL Course](https://youtu.be/jds0Wh9jTvE?list=PL_iWQOsE6TfURIIhCrlt-wj9ByIVpbfGc&t=420)