MODULE 2 S3.2.3 Flashcards
Gradient Boosting Machine (Supplementary)
A powerful ensemble technique in machine learning in which the models are built in series, one after another.
Boosting
Combines the predictions of multiple weak learners to create a single, more accurate strong learner.
Boosting
In each successive model in boosting, the _____________ are adjusted based on the errors of the previous model.
weights
Boosting Algorithms in ML
AdaBoost (Adaptive Boosting)
Gradient Boosting
Stochastic Gradient Boosting
LPBoost (Linear Programming Boosting)
TotalBoost (Total Boosting)
Boosting algorithm that typically builds one-level decision trees, also known as “decision stumps”.
AdaBoost
A decision tree that consists of a single decision node and two leaf nodes, i.e., the shallowest possible tree.
Decision stump
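As a concrete illustration, here is a minimal sketch of AdaBoost built from decision stumps, assuming scikit-learn is available; the dataset and parameter values are illustrative, and recent scikit-learn versions use the estimator keyword (older ones use base_estimator):

```python
from sklearn.datasets import make_classification
from sklearn.ensemble import AdaBoostClassifier
from sklearn.tree import DecisionTreeClassifier

X, y = make_classification(n_samples=500, n_features=10, random_state=0)

# Each weak learner is a decision stump: one decision node, two leaves (max_depth=1).
stump = DecisionTreeClassifier(max_depth=1)
model = AdaBoostClassifier(estimator=stump, n_estimators=50, random_state=0)
model.fit(X, y)
print(model.score(X, y))
```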
A boosting algorithm that works by fitting new models to the residual errors of prior models.
Gradient Boosting
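To make the residual-fitting idea concrete, here is a minimal hand-rolled sketch for squared-error regression, assuming NumPy and scikit-learn; the learning rate, tree depth, and synthetic data are illustrative:

```python
import numpy as np
from sklearn.tree import DecisionTreeRegressor

rng = np.random.default_rng(0)
X = rng.uniform(-3, 3, size=(200, 1))
y = np.sin(X[:, 0]) + rng.normal(scale=0.1, size=200)

learning_rate = 0.1
prediction = np.full_like(y, y.mean())  # start from a constant model
trees = []
for _ in range(100):
    residuals = y - prediction          # errors of the ensemble so far
    tree = DecisionTreeRegressor(max_depth=2).fit(X, residuals)  # fit the residuals
    prediction += learning_rate * tree.predict(X)
    trees.append(tree)

print(np.mean((y - prediction) ** 2))   # training MSE shrinks as trees are added
```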
Similar to Gradient Boosting, ____________ fits each new model to a random subset of the training data and, in many implementations, a random subset of the features.
Stochastic Gradient Boosting
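A minimal sketch of this, assuming scikit-learn's GradientBoostingClassifier: setting subsample below 1.0 trains each tree on a random fraction of the rows, and max_features below 1.0 randomizes the columns considered at each split (all values illustrative):

```python
from sklearn.datasets import make_classification
from sklearn.ensemble import GradientBoostingClassifier

X, y = make_classification(n_samples=1000, n_features=20, random_state=0)
model = GradientBoostingClassifier(
    n_estimators=100,
    subsample=0.5,      # each tree sees a random 50% of the training rows
    max_features=0.5,   # each split considers a random 50% of the features
    random_state=0,
)
model.fit(X, y)
```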
A boosting algorithm that maximizes the margin between training samples by solving a linear program.
LPBoost
A boosting method that combines ideas from AdaBoost and LPBoost.
It works by minimizing a mixture of the exponential and linear-programming losses, and it can increase accuracy for certain types of problems.
TotalBoost
Full form of:
AdaBoost: __________________
LPBoost: _________________
TotalBoost: ________________
Adaptive Boosting
Linear Programming Boosting
Total Boosting
How does Boosting work? (see the sketch after this list)
- Initialize weights
- Train a weak learner
- Calculate the error
- Update weights
- Repeat
- Combine weak learners
- Forecast
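The following hand-rolled sketch walks through these steps in AdaBoost form; it assumes NumPy and scikit-learn, labels in {-1, +1}, and illustrative parameters, and is meant as a study aid rather than a definitive implementation:

```python
import numpy as np
from sklearn.tree import DecisionTreeClassifier

def adaboost_sketch(X, y, n_rounds=20):
    """Boosting loop sketch; y is assumed to use labels in {-1, +1}."""
    n = len(y)
    w = np.full(n, 1.0 / n)                  # 1. initialize weights uniformly
    learners, alphas = [], []
    for _ in range(n_rounds):
        stump = DecisionTreeClassifier(max_depth=1)
        stump.fit(X, y, sample_weight=w)     # 2. train a weak learner on weighted data
        pred = stump.predict(X)
        err = np.clip(np.sum(w * (pred != y)), 1e-10, 1 - 1e-10)  # 3. error calculation
        alpha = 0.5 * np.log((1 - err) / err)
        w *= np.exp(-alpha * y * pred)       # 4. update weights: up-weight mistakes
        w /= w.sum()
        learners.append(stump)
        alphas.append(alpha)                 # 5. repeat for the next round

    def forecast(X_new):                     # 6. combine weak learners, 7. forecast
        scores = sum(a * m.predict(X_new) for a, m in zip(alphas, learners))
        return np.sign(scores)
    return forecast
```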
An ensemble machine learning technique that combines a collection of weak models into a single, more accurate and efficient predictive model.
Gradient Boosting
In GBM, the goal is to __________ samples that were incorrectly classified in previous iterations, allowing the model to ___________ from its mistakes and improve its performance iteratively.
prioritize
learn
T/F Gradient boosting algorithms work iteratively by adding new models sequentially, with each new addition aiming to resolve the errors made by the previous ones.
True
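One way to see this empirically, assuming scikit-learn: staged_predict yields the ensemble's prediction after each added tree, so the training error can be tracked as models are added sequentially (dataset and parameters illustrative):

```python
import numpy as np
from sklearn.datasets import make_regression
from sklearn.ensemble import GradientBoostingRegressor

X, y = make_regression(n_samples=500, n_features=10, noise=5.0, random_state=0)
model = GradientBoostingRegressor(n_estimators=100, random_state=0).fit(X, y)

# Training MSE after each successive tree; it decreases as errors are corrected.
errors = [np.mean((y - pred) ** 2) for pred in model.staged_predict(X)]
print(errors[0], errors[-1])
```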