MODULE 2 S3.2.3 Flashcards

Gradient Boosting Machine (Supplementary)

1
Q

A powerful ensemble technique in machine learning in which models are built in series.

A

Boosting

2
Q

Combines the predictions of multiple weak learners to create a single, more accurate strong learner.

A

Boosting

3
Q

In boosting, the _____________ are adjusted in each successive model based on the errors of the previous model.

A

weights

4
Q

Boosting Algorithms in ML

A

AdaBoost (Adaptive Boosting)
Gradient Boosting
Stochastic Gradient Boosting
LPBoost (Linear Programming Boosting)
TotalBoost (Total Boosting)

5
Q

Boosting algorithm that builds shallow decision trees (also known as “decision stumps”).

A

AdaBoost

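To make the stump idea concrete, here is a minimal scikit-learn sketch (synthetic dataset, purely illustrative; in scikit-learn 1.2+ the keyword is estimator, while older versions call it base_estimator):

    # AdaBoost built on decision stumps: each weak learner is a
    # depth-1 decision tree (one decision node, two leaves).
    from sklearn.datasets import make_classification
    from sklearn.ensemble import AdaBoostClassifier
    from sklearn.tree import DecisionTreeClassifier

    X, y = make_classification(n_samples=500, random_state=0)

    stump = DecisionTreeClassifier(max_depth=1)  # a decision stump
    model = AdaBoostClassifier(estimator=stump, n_estimators=50, random_state=0)
    model.fit(X, y)
    print(model.score(X, y))
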
6
Q

A decision tree that consists of a single decision node and two leaf nodes (i.e., a very shallow tree).

A

Decision stump

7
Q

A boosting algorithm that works by fitting new models to the residual errors of prior models.

A

Gradient Boosting

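A minimal from-scratch sketch of this residual-fitting loop for regression with squared error, assuming NumPy and scikit-learn are available; names such as n_rounds and learning_rate are illustrative:

    import numpy as np
    from sklearn.tree import DecisionTreeRegressor

    rng = np.random.default_rng(0)
    X = rng.uniform(-3, 3, size=(200, 1))
    y = np.sin(X[:, 0]) + rng.normal(scale=0.1, size=200)

    n_rounds, learning_rate = 100, 0.1
    prediction = np.full_like(y, y.mean())   # start from a constant model
    trees = []

    for _ in range(n_rounds):
        residuals = y - prediction           # errors of the ensemble so far
        tree = DecisionTreeRegressor(max_depth=2)
        tree.fit(X, residuals)               # fit the new model to the residuals
        prediction += learning_rate * tree.predict(X)
        trees.append(tree)

    print(np.mean((y - prediction) ** 2))    # training error shrinks each round
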
8
Q

Similar to Gradient Boosting, ____________ fits each new model on random subsets of the training data and random subsets of the features.

A

Stochastic Gradient Boosting

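As a sketch, scikit-learn's GradientBoostingRegressor exposes both kinds of randomness: subsample draws a random subset of the rows for each tree, and max_features a random subset of the features at each split (the dataset and settings below are illustrative):

    from sklearn.datasets import make_regression
    from sklearn.ensemble import GradientBoostingRegressor

    X, y = make_regression(n_samples=500, n_features=20, random_state=0)

    model = GradientBoostingRegressor(
        n_estimators=200,
        subsample=0.5,     # each tree sees a random half of the rows
        max_features=0.5,  # each split considers a random half of the features
        random_state=0,
    )
    model.fit(X, y)
    print(model.score(X, y))
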
9
Q

A boosting algorithm that maximizes the margin between classes by formulating the boosting problem as a linear program.

A

LPBoost

10
Q

A boosting method related to AdaBoost and LPBoost.
It works by minimizing a mixture of the exponential and linear-programming losses, and it can increase accuracy for certain types of problems.

A

TotalBoost

11
Q

Full form of:
AdaBoost : __________________
LPBoost : _________________
TotalBoost : ________________

A

Adaptive Boosting
Linear Programming Boosting
Total Boosting

12
Q

How does Boosting work?

A
  1. Initialize weights
  2. Train a weak learner
  3. Calculate the error
  4. Update the weights
  5. Repeat
  6. Combine the weak learners
  7. Forecast (see the sketch below)
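
A from-scratch sketch of these steps for AdaBoost-style binary boosting (labels in {-1, +1}); it assumes NumPy and scikit-learn, and all names are illustrative:

    import numpy as np
    from sklearn.datasets import make_classification
    from sklearn.tree import DecisionTreeClassifier

    X, y = make_classification(n_samples=300, random_state=0)
    y = 2 * y - 1                          # relabel {0, 1} as {-1, +1}

    n = len(y)
    w = np.full(n, 1 / n)                  # 1. initialize weights uniformly
    learners, alphas = [], []

    for _ in range(25):                    # 5. repeat
        stump = DecisionTreeClassifier(max_depth=1)
        stump.fit(X, y, sample_weight=w)   # 2. train a weak learner
        pred = stump.predict(X)
        err = np.sum(w * (pred != y))      # 3. weighted error (w sums to 1)
        alpha = 0.5 * np.log((1 - err) / (err + 1e-12))
        w *= np.exp(-alpha * y * pred)     # 4. up-weight the mistakes
        w /= w.sum()
        learners.append(stump)
        alphas.append(alpha)

    # 6. combine weak learners; 7. forecast with the sign of the weighted vote
    F = sum(a * l.predict(X) for a, l in zip(alphas, learners))
    print((np.sign(F) == y).mean())
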
13
Q

An ensemble machine learning technique that combines a collection of weak models into a single, more accurate and efficient predictive model.

A

Gradient Boosting

14
Q

In GBM, the goal is to __________ samples that were incorrectly categorized in previous iterations, allowing the model to ___________ from its mistakes and improve its performance iteratively.

A

prioritize
learn

15
Q

T/F Gradient boosting algorithms work iteratively by adding new models sequentially, with each new addition aiming to resolve the errors made by the previous ones.

A

True

16
Q

The final prediction of the aggregate represents the __________ of the individual predictions of all the models.

A

sum

17
Q

Gradient boosting combines the ________________ and _________________.

A

gradient descent algorithm
boosting method

18
Q

______________ and ______________________ are key components of the gradient boosting process.

A

pseudo-residuals
decision trees fit to the residuals
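
A short numeric check of the gradient-descent view, assuming NumPy: for the squared-error loss L = 0.5 * (y - F)**2, the pseudo-residual (the negative gradient of the loss with respect to the current prediction F) is exactly the ordinary residual y - F that each new tree is fit to.

    import numpy as np

    y = np.array([3.0, -1.0, 2.5])   # targets
    F = np.array([2.0, 0.5, 2.0])    # current ensemble predictions

    residual = y - F                 # what the next tree is trained on
    pseudo_residual = -(F - y)       # -dL/dF for L = 0.5 * (y - F) ** 2

    print(np.allclose(residual, pseudo_residual))   # prints: True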