Convex Optimization Flashcards
What is a convex function?
A function f(x) is convex if for any two points x₁ and x₂ and any t ∈ [0, 1], the inequality f(t·x₁ + (1−t)·x₂) ≤ t·f(x₁) + (1−t)·f(x₂) holds. Example: f(x) = x².
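To see the definition concretely, here is a minimal Python sketch that samples the inequality for f(x) = x² (the sample points and tolerance are arbitrary choices for illustration):

```python
# Numerically check the convexity inequality for f(x) = x**2.
def f(x):
    return x ** 2

x1, x2 = -1.5, 3.0
for t in [0.0, 0.25, 0.5, 0.75, 1.0]:
    lhs = f(t * x1 + (1 - t) * x2)      # f at the convex combination
    rhs = t * f(x1) + (1 - t) * f(x2)   # chord between (x1, f(x1)) and (x2, f(x2))
    assert lhs <= rhs + 1e-12           # convexity: function lies below the chord
print("inequality holds at all sampled t")
```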
What defines a convex set?
A set S is convex if the line segment joining any two points in S lies entirely within S.
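For example, a halfspace {x : aᵀx ≤ b} is a convex set. A small sketch checking that convex combinations of two member points stay inside (the vector a, bound b, and points are made up for illustration):

```python
import numpy as np

a, b = np.array([1.0, 2.0]), 4.0        # halfspace S = {x : a @ x <= b}

def in_S(x):
    return a @ x <= b + 1e-12

x1, x2 = np.array([0.0, 0.0]), np.array([2.0, 1.0])
assert in_S(x1) and in_S(x2)            # both endpoints lie in S
for t in np.linspace(0.0, 1.0, 11):
    assert in_S(t * x1 + (1 - t) * x2)  # every point on the segment stays in S
```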
What is a convex optimization problem?
A convex optimization problem seeks to minimize a convex function over a convex set.
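A minimal sketch of such a problem in Python, assuming SciPy is available (the quadratic objective and starting point are illustrative):

```python
import numpy as np
from scipy.optimize import minimize

# Convex objective f(x) = (x0 - 1)^2 + (x1 + 2)^2 over the convex set R^2.
def f(x):
    return (x[0] - 1.0) ** 2 + (x[1] + 2.0) ** 2

res = minimize(f, x0=np.zeros(2))  # any minimizer found is the global one
print(res.x)                       # approximately [1, -2]
```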
How does the gradient descent algorithm work in convex optimization?
Gradient descent iteratively updates the parameters in the direction opposite to the gradient of the function at the current point. It converges to a local minimum, which in convex problems is also a global minimum.
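A minimal gradient descent sketch with a fixed step size (the step size, tolerance, and test function are illustrative choices):

```python
import numpy as np

def gradient_descent(grad, x0, step=0.1, tol=1e-8, max_iter=1000):
    """Repeat x <- x - step * grad(x) until the gradient is nearly zero."""
    x = np.asarray(x0, dtype=float)
    for _ in range(max_iter):
        g = grad(x)
        if np.linalg.norm(g) < tol:
            break
        x = x - step * g
    return x

# Example: f(x) = x^T x has gradient 2x and its global minimum at the origin.
print(gradient_descent(lambda x: 2 * x, x0=[3.0, -4.0]))  # close to [0, 0]
```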
Describe Newton's Method in the context of convex optimization.
Newton's Method uses both the first and second derivatives (the gradient and the Hessian). By accounting for the curvature of the function, it typically reaches an accurate approximation of the minimum in far fewer iterations than first-order methods.
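A one-dimensional sketch of the Newton iteration x ← x − f′(x)/f″(x) (the strictly convex test function is an illustrative choice):

```python
import math

def newton_minimize(df, d2f, x0, tol=1e-10, max_iter=50):
    """Newton's iteration for a 1-D minimum: x <- x - f'(x) / f''(x)."""
    x = x0
    for _ in range(max_iter):
        step = df(x) / d2f(x)
        x -= step
        if abs(step) < tol:
            break
    return x

# Example: f(x) = x^2 + e^x is strictly convex; its minimizer solves 2x + e^x = 0.
x_star = newton_minimize(lambda x: 2 * x + math.exp(x),
                         lambda x: 2 + math.exp(x), x0=0.0)
print(x_star)  # about -0.3517
```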
What is the purpose of line search in optimization algorithms?
Line search selects the step size along a given search direction, either exactly (minimizing the function along that ray) or approximately (shrinking the step until a sufficient-decrease condition holds), so that each iteration reliably reduces the objective function and the algorithm converges.
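A common approximate scheme is backtracking line search with the Armijo sufficient-decrease condition; a minimal sketch (the shrink factor beta and constant c are conventional but illustrative values):

```python
import numpy as np

def backtracking(f, grad_fx, x, d, alpha=1.0, beta=0.5, c=1e-4):
    """Shrink alpha until f(x + alpha*d) <= f(x) + c * alpha * (grad_fx @ d).
    Assumes d is a descent direction (grad_fx @ d < 0), so the loop terminates."""
    fx = f(x)
    while f(x + alpha * d) > fx + c * alpha * (grad_fx @ d):
        alpha *= beta
    return alpha

# Example: f(x) = x^T x, taking the steepest-descent direction d = -grad f(x).
f = lambda x: x @ x
x = np.array([2.0, -1.0])
g = 2 * x
alpha = backtracking(f, g, x, -g)
print(alpha, f(x + alpha * (-g)))  # alpha = 0.5 lands exactly on the minimum here
```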
What defines a good descent direction in optimization methods?
A good descent direction is one along which the function value decreases locally: a direction d at a point x with ∇f(x)·d < 0. Methods like gradient descent use the negative gradient, d = −∇f(x), as the descent direction.
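A tiny sketch of the defining test (the gradient vector below is made up for illustration):

```python
import numpy as np

def is_descent_direction(grad_fx, d):
    # d is a descent direction when the directional derivative grad_fx @ d
    # is negative, i.e. a small step along d decreases f.
    return grad_fx @ d < 0

g = np.array([3.0, -1.0])            # gradient of f at some point x
print(is_descent_direction(g, -g))   # True: the negative gradient always works
print(is_descent_direction(g, g))    # False: the gradient itself points uphill
```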
What are common challenges faced by the gradient descent method?
Challenges include slow convergence on flat or ill-conditioned regions, stalling near saddle points (a nonconvex phenomenon; in convex problems every stationary point is a global minimum), and the need for careful step-size selection to avoid overshooting.
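The overshooting risk is easy to demonstrate on f(x) = x² (gradient 2x): the update x ← x − step·2x contracts when step < 1 and diverges when step > 1. A minimal sketch (step sizes chosen for illustration):

```python
def run(step, x=1.0, iters=10):
    # Fixed-step gradient descent on f(x) = x**2, whose gradient is 2*x.
    for _ in range(iters):
        x = x - step * 2 * x
    return x

print(run(0.1))  # multiplies x by 0.8 each step: converges toward 0
print(run(1.1))  # multiplies x by -1.2 each step: overshoots and diverges
```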