# Gradient Descent vs Gradient Boosting
They're two different algorithms, but they are closely connected:
**Gradient descent** is an algorithm for finding a set of parameters that minimizes a loss function. Given a loss function $f(x, \theta)$, where $x$ is an $n$-dimensional input vector and $\theta$ is a set of parameters, gradient descent computes the gradient of $f$ with respect to $\theta$ and then "descends" it by nudging the parameters in the direction opposite to the gradient. This process is repeated for different points in the space of inputs (i.e., different $x$).
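Concretely, each step applies the update $\theta \leftarrow \theta - \eta\, \nabla_\theta f(x, \theta)$, where $\eta$ is the learning rate. Below is a minimal Python sketch of this loop; the function names (`gradient_descent`, `grad_f`), the learning rate, and the quadratic example loss are illustrative assumptions, not anything prescribed by the text above.

```python
import numpy as np

def gradient_descent(grad_f, theta0, lr=0.1, n_steps=100):
    """Minimize a loss by repeatedly stepping opposite to its gradient.

    grad_f  -- function returning the gradient of the loss w.r.t. theta
    theta0  -- initial parameter vector
    lr      -- learning rate (step size eta)
    """
    theta = np.asarray(theta0, dtype=float)
    for _ in range(n_steps):
        theta -= lr * grad_f(theta)  # theta <- theta - eta * grad f(theta)
    return theta

# Illustrative loss f(theta) = ||theta - 3||^2, whose gradient is 2 * (theta - 3).
theta_star = gradient_descent(lambda t: 2 * (t - 3.0), theta0=[0.0, 0.0])
print(theta_star)  # approaches [3.0, 3.0], the minimizer of the example loss
```

In practice the gradient is usually evaluated on one training example or a mini-batch at a time, which is what "repeating the process for different $x$" amounts to.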