# Maximum Likelihood
Maximum likelihood is actually an *incredibly simple* idea:
> Given a set of $n$ observations and a model you believe generated them (for example, a normal distribution), find the parameter values that **maximize** the **likelihood** of observing those $n$ data points.

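To make the normal-distribution example concrete: if the $n$ observations $x_1, \dots, x_n$ are assumed to come from $\mathcal{N}(\mu, \sigma^2)$, where $\mu$ and $\sigma$ are the model's mean and standard deviation, the likelihood of the parameters is

$$\mathcal{L}(\mu, \sigma \mid x_1, \dots, x_n) = \prod_{i=1}^{n} \frac{1}{\sigma\sqrt{2\pi}} \exp\!\left(-\frac{(x_i - \mu)^2}{2\sigma^2}\right)$$

Maximum likelihood asks for the $(\mu, \sigma)$ that make this product as large as possible. In practice we maximize the log-likelihood instead: the log turns the product into a sum and has the same maximizer.
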
Looking at the visual above, we see that by simply sliding our distribution along the data (trying all possible values of our parameter) we can find the parameter value that maximizes the likelihood. In practice, rather than trying every value, we work with the likelihood function on the right and use the [Gradient](Gradient.md) (derivative): setting it to zero, or following it uphill, leads us to the maximizing parameter. See the [Cross Entropy and MLE walkthrough](https://www.nathanieldake.com/Mathematics/05-Information_Theory-01-Cross-Entropy-and-MLE-walkthrough.html) for a full derivation.
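
As a minimal sketch of this idea (assuming NumPy and SciPy are available; the observations below are made-up numbers purely for illustration), we can hand the negative log-likelihood of a normal model to a numerical optimizer and check that it recovers the familiar closed-form answers, the sample mean and standard deviation:

```python
import numpy as np
from scipy.optimize import minimize

# Hypothetical observations -- any 1-D sample works here.
x = np.array([4.2, 5.1, 3.8, 5.9, 4.7, 5.3, 4.9, 5.5])

def neg_log_likelihood(params):
    """Negative log-likelihood of x under a Normal(mu, sigma) model."""
    mu, sigma = params
    if sigma <= 0:  # keep the optimizer in the valid region
        return np.inf
    n = len(x)
    return (n / 2) * np.log(2 * np.pi * sigma**2) + np.sum((x - mu) ** 2) / (2 * sigma**2)

# "Slide the distribution around": let the optimizer try parameter values
# until the negative log-likelihood is minimized (i.e. likelihood maximized).
result = minimize(neg_log_likelihood, x0=np.array([1.0, 1.0]), method="Nelder-Mead")
mu_hat, sigma_hat = result.x

# The closed-form MLEs for a normal model are the sample mean and the
# (1/n, i.e. biased) sample standard deviation -- np.std uses 1/n by default.
print(mu_hat, np.mean(x))    # should agree closely
print(sigma_hat, np.std(x))  # should agree closely
```

The same pattern works for any model whose likelihood we can write down: define the negative log-likelihood, then minimize it, either numerically as above or analytically by setting the gradient to zero.
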
---
Date: 20210521
Links to:
References:
* [Bayesian inference, fantastic visual](https://www.nathanieldake.com/Machine_Learning/08-Bayesian_Machine_Learning-01-Bayesian-Inference.html#Visual-Representation)
* [Cross Entropy and MLE](https://www.nathanieldake.com/Mathematics/05-Information_Theory-01-Cross-Entropy-and-MLE-walkthrough.html)