2.3. Optimization Methods Flashcards
What are optimization models in machine learning?
Mathematical formulations that aim to find the best solution from a set of feasible solutions.
What are constraints in optimization models?
Conditions that the solution must satisfy, limiting the feasible region.
What is linear programming?
An optimization technique where the objective function and constraints are linear.
What is the objective function in optimization?
A function that needs to be maximized or minimized based on the problem requirements.
What is the Simplex method?
A popular algorithm for solving linear programming problems.
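A linear program like the ones these cards describe can be sketched with SciPy's `linprog` (note its default solver is HiGHS, a modern simplex/interior-point implementation rather than the textbook simplex tableau); the problem data here is invented for illustration:

```python
from scipy.optimize import linprog

# Maximize 3x + 5y subject to x + y <= 4, x + 3y <= 6, x, y >= 0.
# linprog minimizes, so we negate the objective coefficients.
c = [-3, -5]
A_ub = [[1, 1], [1, 3]]
b_ub = [4, 6]
res = linprog(c, A_ub=A_ub, b_ub=b_ub, bounds=[(0, None), (0, None)])
print(res.x, -res.fun)  # optimum at x = 3, y = 1, objective value 14
```

The optimum lies at a vertex of the feasible region (here, where the two constraints intersect), which is exactly the property the simplex method exploits.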
What is non-linear programming?
Optimization where the objective function or constraints are non-linear.
What is convex optimization?
A subset of optimization where the objective function is convex, ensuring a global minimum.
What is gradient descent?
An iterative optimization algorithm used to minimize a function by moving in the direction of the steepest descent.
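A minimal sketch of that iteration, using a simple quadratic whose gradient is known in closed form:

```python
def gradient_descent(grad, x0, lr=0.1, steps=100):
    """Repeatedly step opposite the gradient (the direction of steepest descent)."""
    x = x0
    for _ in range(steps):
        x = x - lr * grad(x)
    return x

# Minimize f(x) = (x - 3)^2; its gradient is 2(x - 3), so the minimum is at x = 3.
x_min = gradient_descent(lambda x: 2 * (x - 3), x0=0.0)
```

Each step shrinks the distance to the minimum by a constant factor (here 1 - 2·lr), so the iterates converge geometrically to x = 3.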
What is the role of Lagrange multipliers in optimization?
They are used to find the local maxima and minima of a function subject to equality constraints.
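A worked instance, solved symbolically with SymPy: minimize x² + y² subject to x + y = 1 by setting all partial derivatives of the Lagrangian to zero (the problem itself is a standard textbook example, not from the cards):

```python
import sympy as sp

x, y, lam = sp.symbols('x y lam')
# Lagrangian L = f(x, y) - lam * g(x, y) with constraint g = x + y - 1 = 0
L = x**2 + y**2 - lam * (x + y - 1)
# Stationarity: dL/dx = dL/dy = dL/dlam = 0
sols = sp.solve([sp.diff(L, v) for v in (x, y, lam)], (x, y, lam))
# constrained minimum at x = y = 1/2, with multiplier lam = 1
```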
What is the difference between global and local optimization?
Global optimization seeks the best solution across the entire feasible region, while local optimization finds the best solution within a limited neighborhood.
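The distinction can be seen by running a local method from different starting points on a function with two minima; which minimum it finds depends entirely on where it starts:

```python
def grad_descent_1d(grad, x, lr=0.05, steps=200):
    # Plain gradient descent: a local method with no view of the whole region.
    for _ in range(steps):
        x -= lr * grad(x)
    return x

# f(x) = x^4 - 2x^2 has two local minima, at x = -1 and x = +1.
grad = lambda x: 4 * x**3 - 4 * x
left = grad_descent_1d(grad, x=-0.5)   # settles near x = -1
right = grad_descent_1d(grad, x=0.5)   # settles near x = +1
```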
What is integer programming?
An optimization technique where some or all decision variables are constrained to be integers.
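A tiny illustration of integer-constrained decisions: a 0/1 knapsack, where each variable must be exactly 0 or 1. Real solvers use branch-and-bound; brute force over all integer assignments is shown here only because the instance is small (the item data is made up):

```python
from itertools import product

values = [60, 100, 120]
weights = [10, 20, 30]
capacity = 50

# Each choice is a tuple of 0/1 decision variables, one per item.
best_value, best_choice = max(
    (sum(v * x for v, x in zip(values, choice)), choice)
    for choice in product([0, 1], repeat=len(values))
    if sum(w * x for w, x in zip(weights, choice)) <= capacity
)
# best: take items 2 and 3 for value 220 at weight 50
```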
What is the primary goal of optimization methods in machine learning?
To optimize a given cost function during the learning stage.
Name one type of optimization method discussed in the document.
Support Vector Machines (SVM).
What is a key characteristic of Artificial Neural Networks (ANN) in the context of optimization?
They are biologically inspired and involve optimizing weights and biases.
What distinguishes optimization methods from other predictive modeling methods?
Optimization methods aim to optimize the entire learning function, not just a part of it.
What are linear methods in optimization?
Techniques that involve linear least squares and regression analysis.
What is the purpose of a cost function in optimization?
To measure how well a model performs, guiding the optimization process.
What is the role of residuals in linear methods?
Residuals represent the difference between observed and predicted values, used to assess model accuracy.
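The linear least-squares fit and its residuals can be sketched with NumPy (the data points are invented for illustration):

```python
import numpy as np

# Fit y ≈ a*x + b by linear least squares.
x = np.array([0.0, 1.0, 2.0, 3.0])
y = np.array([1.1, 2.9, 5.2, 6.8])
A = np.column_stack([x, np.ones_like(x)])   # design matrix [x, 1]
(a, b), *_ = np.linalg.lstsq(A, y, rcond=None)

# Residuals: observed minus predicted values.
residuals = y - (a * x + b)
```

Least squares chooses a and b to minimize the sum of squared residuals, so the residuals sum to (nearly) zero and measure how far each observation lies from the fitted line.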
How do Support Vector Machines (SVM) optimize their function?
By finding the hyperplane that maximizes the margin between different classes.
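The margin being maximized can be made concrete with a hand-picked (not learned) hyperplane and two sample points; actual SVM training would search over w and b:

```python
import numpy as np

# A separating hyperplane w·x + b = 0, chosen by hand for illustration.
w = np.array([1.0, 1.0])
b = -1.0
X = np.array([[0.0, 0.0], [2.0, 2.0]])  # one point from each class
y = np.array([-1.0, 1.0])               # class labels

functional = y * (X @ w + b)            # > 0 when correctly classified
geometric = functional / np.linalg.norm(w)  # signed distance to the hyperplane
# An SVM chooses w and b to maximize the smallest geometric margin.
```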
What is a common application of optimization methods in finance?
Portfolio optimization to maximize returns while minimizing risk.
What is the significance of the learning function in optimization?
It represents the relationship between input features and the target variable, which needs to be optimized.
What is the difference between supervised and unsupervised optimization methods?
Supervised methods use labeled data, while unsupervised methods work with unlabeled data.
What is a common challenge in optimization?
Avoiding local minima to ensure the global optimum is found.
Why is it important to evaluate the performance of optimization methods?
To ensure that the model generalizes well to unseen data and meets the desired objectives.
What is the role of hyperparameters in optimization methods?
Hyperparameters are settings that govern the optimization process and can significantly affect model performance.
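One such hyperparameter is the learning rate of gradient descent; a small change in its value can turn convergence into divergence (the quadratic and rates here are illustrative):

```python
def run(lr, steps=50, x=0.0):
    # Gradient descent on f(x) = (x - 3)^2; lr is the hyperparameter.
    for _ in range(steps):
        x -= lr * 2 * (x - 3)
    return x

good = run(lr=0.1)   # converges close to the minimum at x = 3
bad = run(lr=1.1)    # diverges: every step overshoots and the error grows
```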
What is the relationship between optimization and machine learning?
Optimization is a fundamental aspect of training machine learning models, as it directly impacts their accuracy and efficiency.