Gradient Descent
What is gradient descent?
Gradient descent is an iterative optimization algorithm that can be used to find a minimum of a function \(f(\theta)\). At each iteration, it takes a step in the direction of the negative gradient \(-\nabla f(\theta)\). The same idea can be used to find a maximum: in that case, each step is taken in the direction of the positive gradient instead.
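The iteration described above is usually written as the following update rule, where \(\eta > 0\) denotes the step size (learning rate) and \(t\) the iteration index:

\[
\theta_{t+1} = \theta_t - \eta \, \nabla f(\theta_t)
\]

Choosing \(\eta\) involves a trade-off: too small and convergence is slow, too large and the iterates may overshoot the minimum or diverge.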
How does gradient descent work?
The gradient descent algorithm is as follows:
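As a minimal sketch of the loop (the function and parameter names here are illustrative, not part of the original text), gradient descent can be implemented as:

```python
def gradient_descent(grad_f, theta0, lr=0.1, num_iters=100):
    """Minimize a function by repeatedly stepping against its gradient.

    grad_f:    callable returning the gradient of f at a given theta
    theta0:    initial parameter value
    lr:        step size (learning rate); an assumed default
    num_iters: number of iterations; an assumed default
    """
    theta = theta0
    for _ in range(num_iters):
        # Move opposite to the gradient to decrease f.
        theta = theta - lr * grad_f(theta)
    return theta


# Example: f(theta) = (theta - 3)^2 has gradient 2 * (theta - 3),
# so gradient descent should converge toward theta = 3.
theta_min = gradient_descent(lambda t: 2 * (t - 3), theta0=0.0)
```

To find a maximum instead, flip the sign of the update (i.e. add `lr * grad_f(theta)`).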
Binder
If you want to experiment with the gradient descent algorithm, you can use the provided Jupyter Notebook. You can run the notebook directly in your browser via Binder: simply click the following button to open it: