10-Iterative optimisation Flashcards
What is iterative optimisation?
Iterative optimisation is an alternative to exact optimisation. It is used when no closed-form solution exists that could be used for exact optimisation.
Instead we have to iteratively improve our estimate until we arrive at a satisfactory solution.
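A minimal sketch of the idea, using fixed-point iteration to solve cos(x) = x, an equation with no closed-form solution; the starting guess and tolerance are illustrative assumptions, not part of the original flashcards.

```python
import math

# Solve cos(x) = x by repeatedly feeding the current estimate back in
# (fixed-point iteration) until the estimate stops changing noticeably.
x = 1.0                          # initial guess (assumed)
for _ in range(100):
    new_x = math.cos(x)          # improved estimate
    if abs(new_x - x) < 1e-10:   # stop once the change is negligible
        break
    x = new_x

print(x)  # approximately 0.739085
```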
What is gradient descent?
Gradient descent is a method for iterative optimisation. It works by computing the partial derivative of the objective with respect to each variable (the gradient) and stepping in the direction of the negative gradient, scaled by a step size (learning rate).
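A minimal sketch of gradient descent on a simple quadratic; the objective f(x, y) = x^2 + 2y^2, the step size, and the iteration count are illustrative assumptions.

```python
import numpy as np

def grad_f(point):
    """Gradient of f(x, y) = x^2 + 2y^2: the vector of partial derivatives."""
    x, y = point
    return np.array([2 * x, 4 * y])

point = np.array([3.0, -2.0])   # initial guess
step_size = 0.1                 # learning rate

for _ in range(100):
    # Step against the gradient, i.e. downhill on f.
    point = point - step_size * grad_f(point)

print(point)  # approaches the minimiser (0, 0)
```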
What is the issue with gradient descent?
Gradient descent can get stuck at a local minimum if the function is non-convex.
Why is gradient descent significant?
Gradient descent enables efficient optimisation when direct methods are infeasible. It allows models such as neural networks to be tuned. It is also useful for large datasets, since the data can be processed in chunks (mini-batches), as in the sketch below.
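A minimal sketch of mini-batch gradient descent for linear regression, illustrating how data can be processed in chunks; the synthetic data, batch size, and learning rate are illustrative assumptions.

```python
import numpy as np

rng = np.random.default_rng(0)
X = rng.normal(size=(1000, 3))               # 1000 examples, 3 features
true_w = np.array([1.5, -2.0, 0.5])
y = X @ true_w + 0.1 * rng.normal(size=1000)

w = np.zeros(3)          # parameters to learn
lr, batch_size = 0.05, 32

for epoch in range(20):
    order = rng.permutation(len(X))          # shuffle each pass over the data
    for start in range(0, len(X), batch_size):
        idx = order[start:start + batch_size]
        Xb, yb = X[idx], y[idx]
        # Gradient of the mean squared error computed on this chunk only.
        grad = 2 * Xb.T @ (Xb @ w - yb) / len(idx)
        w -= lr * grad

print(w)  # should end up close to true_w
```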