Ensemble Learning Flashcards

1
Q

Ensemble Learning

A

Aggregating the predictions of a group of predictors will often give better predictions than the best individual predictor alone.
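A quick way to see why aggregation helps: if several independent classifiers are each right 70% of the time, a majority vote among them is right far more often. A minimal simulation sketch (the function name, trial count, and probabilities are illustrative assumptions, not from the cards):

```python
import random

random.seed(0)

def simulate(n_trials=10_000, n_clf=11, p_correct=0.7):
    """Compare one classifier against a majority vote of n_clf
    independent classifiers, each correct with probability p_correct."""
    single_hits = 0
    ensemble_hits = 0
    for _ in range(n_trials):
        votes = [random.random() < p_correct for _ in range(n_clf)]
        single_hits += votes[0]                    # one predictor alone
        ensemble_hits += sum(votes) > n_clf // 2   # majority is right
    return single_hits / n_trials, ensemble_hits / n_trials

single_acc, ensemble_acc = simulate()
```

With these settings the ensemble accuracy lands around 0.92 versus roughly 0.70 for a single predictor; the gain relies on the classifiers' errors being (at least partly) independent.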

2
Q

2 Ways to Construct Ensemble Methods

A

By manipulating the training set - examples: bagging, boosting, random forests
By manipulating the input features - example: random forests
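Both manipulations can be sketched in a few lines — drawing rows with replacement (training-set side) and keeping a random subset of columns (feature side, as random forests do). The function names and the toy data are illustrative assumptions:

```python
import random

def bootstrap_rows(X, y, rng):
    """Manipulate the training set: draw len(X) rows with replacement,
    as bagging does for each base predictor."""
    idx = [rng.randrange(len(X)) for _ in range(len(X))]
    return [X[i] for i in idx], [y[i] for i in idx]

def random_feature_subset(X, n_keep, rng):
    """Manipulate the input features: keep a random subset of columns,
    as random forests do when choosing candidate split features."""
    cols = sorted(rng.sample(range(len(X[0])), n_keep))
    return [[row[c] for c in cols] for row in X], cols

rng = random.Random(42)
X = [[0, 1, 2, 3], [4, 5, 6, 7], [8, 9, 10, 11]]
y = ["a", "b", "c"]
Xb, yb = bootstrap_rows(X, y, rng)        # same size, rows may repeat
Xf, cols = random_feature_subset(X, 2, rng)  # every row keeps 2 columns
```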

3
Q

4 Approaches to Manipulating the Training Set

A

Bagging - sampling the training set with replacement
Pasting - sampling the training set without replacement
Random Forests - bagging of decision trees, with random feature selection added
Boosting - an iterative procedure that adaptively changes the distribution of the training data by focusing more on previously misclassified records
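The boosting idea of "focusing more on previously misclassified records" is concrete in AdaBoost's reweighting step: records the weak learner got wrong gain weight, the rest lose it. A one-step sketch (function name and example weights are illustrative assumptions):

```python
import math

def adaboost_reweight(weights, correct, error):
    """One boosting step: compute the learner weight alpha from its
    weighted error rate, up-weight misclassified records, down-weight
    correctly classified ones, then renormalise to sum to 1."""
    alpha = 0.5 * math.log((1 - error) / error)
    scaled = [w * math.exp(-alpha if ok else alpha)
              for w, ok in zip(weights, correct)]
    total = sum(scaled)
    return [w / total for w in scaled]

# Four records with uniform weights; one is misclassified (error = 0.25).
w = [0.25] * 4
w2 = adaboost_reweight(w, [True, True, True, False], error=0.25)
```

After the update the misclassified record carries weight 0.5 while each correct one drops to 1/6, so the next weak learner concentrates on the hard case.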

4
Q

3 Terms of the Bias-Variance Decomposition

A

Variance - how far the predicted values spread around their average; sensitivity to the particular training set
Noise - irreducible error inherent in the data itself
Bias - the distance between the average predicted value and the actual value
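The three terms above combine additively in the standard decomposition of expected squared error — a sketch using $\hat{f}$ for the learned model, $f$ for the true function, and $\sigma^2$ for the noise variance:

```latex
\mathbb{E}\big[(y - \hat{f}(x))^2\big]
  = \underbrace{\big(\mathbb{E}[\hat{f}(x)] - f(x)\big)^2}_{\text{Bias}^2}
  + \underbrace{\mathbb{E}\big[(\hat{f}(x) - \mathbb{E}[\hat{f}(x)])^2\big]}_{\text{Variance}}
  + \underbrace{\sigma^2}_{\text{Noise}}
```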

5
Q

Bias-Variance Trade-off

A

Overfitting - Low Bias & High Variance
Optimum - Low Bias & Low Variance
Underfitting - High Bias & Low Variance
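The three regimes show up directly when fitting polynomials of increasing degree to noisy data: too low a degree underfits (high bias), the right degree is near optimal, and a very high degree chases the noise (high variance). A sketch with made-up data — the degrees, sample sizes, and noise level are illustrative assumptions:

```python
import numpy as np

rng = np.random.default_rng(0)
x = rng.uniform(-1, 1, 30)
y = x ** 2 + rng.normal(0, 0.1, 30)            # noisy quadratic
x_test = rng.uniform(-1, 1, 30)
y_test = x_test ** 2 + rng.normal(0, 0.1, 30)

def errors(degree):
    """Fit a polynomial of the given degree; return (train MSE, test MSE)."""
    coef = np.polyfit(x, y, degree)
    train = float(np.mean((np.polyval(coef, x) - y) ** 2))
    test = float(np.mean((np.polyval(coef, x_test) - y_test) ** 2))
    return train, test

underfit = errors(0)    # high bias: a constant cannot track a quadratic
optimum = errors(2)     # matches the true model: both errors low
overfit = errors(15)    # train error keeps shrinking, variance grows
```

Training error decreases monotonically with degree, but only the test error reveals where the bias-variance trade-off actually lands.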

6
Q

2 Types of Voting Classifiers

A

Hard Voting Classifier - predicts the class that receives the most votes (majority vote)
Soft Voting Classifier - predicts the class with the highest class probability, averaged over all the individual classifiers
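The two rules can disagree: a single very confident classifier can win under soft voting even when the majority of classifiers votes the other way. A minimal sketch (function names and the example probabilities are illustrative assumptions):

```python
from collections import Counter

def hard_vote(labels):
    """Hard voting: return the label predicted by the most classifiers."""
    return Counter(labels).most_common(1)[0][0]

def soft_vote(probas):
    """Soft voting: average class probabilities across classifiers and
    return the index of the highest average."""
    n_classes = len(probas[0])
    avg = [sum(p[c] for p in probas) / len(probas) for c in range(n_classes)]
    return max(range(n_classes), key=avg.__getitem__)

# Three classifiers, two classes. Two lean mildly toward class 1, one is
# very confident in class 0.
probas = [[0.9, 0.1], [0.4, 0.6], [0.4, 0.6]]
hard = hard_vote([max(range(2), key=p.__getitem__) for p in probas])  # 1
soft = soft_vote(probas)  # 0: averaged probs are [0.567, 0.433]
```

Soft voting often outperforms hard voting because it weights confident votes more heavily, but it requires every classifier to expose class probabilities.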
