Feature Selection Flashcards

1
Q

What is feature selection? Why do we need it?

A

Feature selection is the process of selecting the subset of relevant features for the model to train on. We need it to remove irrelevant (or redundant) features, which add noise and lead the model to under-perform.
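
As a rough illustration, here is a minimal sketch assuming scikit-learn; the synthetic dataset and the choice of univariate selection with SelectKBest are illustration choices of mine, not part of the card:

```python
# Minimal sketch (scikit-learn assumed, illustrative only): univariate selection
# keeps the k features that score highest against the target.
from sklearn.datasets import make_classification
from sklearn.feature_selection import SelectKBest, f_classif

# Synthetic data: 20 features, only 5 of which are informative.
X, y = make_classification(n_samples=500, n_features=20, n_informative=5,
                           n_redundant=0, random_state=0)

selector = SelectKBest(score_func=f_classif, k=5)
X_selected = selector.fit_transform(X, y)

print(X.shape, "->", X_selected.shape)    # (500, 20) -> (500, 5)
print("kept feature indices:", selector.get_support(indices=True))
```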

2
Q

Is feature selection important for linear models?

A

Yes, it is. Selecting the most important features and removing irrelevant ones can improve a linear model's predictive performance, help avoid overfitting and underfitting, and give a better bias-variance trade-off.
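
One way to see the effect is the hedged sketch below; it assumes scikit-learn and uses recursive feature elimination (RFE) and synthetic data as my own illustration choices, not something named in the card:

```python
# Illustrative sketch (assumptions: scikit-learn, synthetic data, RFE as the
# selection method). With many noise features a plain linear model overfits;
# the same model restricted to a selected subset usually cross-validates better.
from sklearn.datasets import make_regression
from sklearn.feature_selection import RFE
from sklearn.linear_model import LinearRegression
from sklearn.model_selection import cross_val_score

# 100 features, but only 10 carry signal; the rest are pure noise.
X, y = make_regression(n_samples=200, n_features=100, n_informative=10,
                       noise=10.0, random_state=0)

plain = LinearRegression()
selected = RFE(LinearRegression(), n_features_to_select=10)

print("all features     : R^2 =", cross_val_score(plain, X, y, cv=5).mean())
print("selected features: R^2 =", cross_val_score(selected, X, y, cv=5).mean())
```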

3
Q

Which feature selection techniques do you know?

A

Some commonly used techniques include (see the sketch after the list):

Principal Component Analysis (strictly a feature-extraction / dimensionality-reduction method, since it builds new features rather than keeping a subset of the originals)
Neighborhood Component Analysis
ReliefF Algorithm
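
Here is a small sketch of the first two, assuming scikit-learn; ReliefF is not part of scikit-learn, so it is omitted rather than guessing at a third-party API:

```python
# Hedged sketch (scikit-learn assumed): PCA (unsupervised) and NCA (supervised)
# projections of the iris data, reducing four features to two.
from sklearn.datasets import load_iris
from sklearn.decomposition import PCA
from sklearn.neighbors import NeighborhoodComponentsAnalysis

X, y = load_iris(return_X_y=True)

# PCA: project onto the directions of maximum variance (ignores the labels).
X_pca = PCA(n_components=2).fit_transform(X)

# NCA: learn a projection that helps nearest-neighbor classification (uses labels).
X_nca = NeighborhoodComponentsAnalysis(n_components=2,
                                       random_state=0).fit_transform(X, y)

print(X.shape, X_pca.shape, X_nca.shape)   # (150, 4) (150, 2) (150, 2)
```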

4
Q

Can we use L1 regularization for feature selection?

A

Yes. The nature of the L1 penalty leads to sparse feature coefficients: many of them are driven exactly to zero. Feature selection can then be done by keeping only the features with non-zero coefficients.
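
For example, a minimal sketch assuming scikit-learn (the synthetic data and the alpha value are arbitrary illustration choices):

```python
# Minimal sketch (scikit-learn assumed): L1 regularization (lasso) zeroes out
# most coefficients; we keep only the features whose coefficient is non-zero.
import numpy as np
from sklearn.datasets import make_regression
from sklearn.linear_model import Lasso

X, y = make_regression(n_samples=300, n_features=30, n_informative=5,
                       noise=5.0, random_state=0)

lasso = Lasso(alpha=1.0).fit(X, y)

mask = lasso.coef_ != 0          # True for the features the lasso kept
X_selected = X[:, mask]

print("kept feature indices:", np.flatnonzero(mask))
print(X.shape, "->", X_selected.shape)
```

scikit-learn's SelectFromModel wraps the same idea if you prefer a reusable transformer.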

5
Q

Can we use L2 regularization for feature selection?

A

No. L2 regularization does not drive weights to zero; it only makes them very small, so no feature is actually dropped. L2 regularization is, however, useful for dealing with multicollinearity, since it stabilizes the coefficient estimates.
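
To make the contrast concrete, a hedged sketch assuming scikit-learn, with arbitrary synthetic data and penalty strengths:

```python
# Hedged sketch (scikit-learn assumed): ridge (L2) shrinks coefficients but
# leaves them non-zero, while lasso (L1) sets many of them exactly to zero.
import numpy as np
from sklearn.datasets import make_regression
from sklearn.linear_model import Lasso, Ridge

X, y = make_regression(n_samples=300, n_features=30, n_informative=5,
                       noise=5.0, random_state=0)

ridge = Ridge(alpha=10.0).fit(X, y)
lasso = Lasso(alpha=1.0).fit(X, y)

print("ridge non-zero coefficients:", int(np.sum(ridge.coef_ != 0)), "of 30")
print("lasso non-zero coefficients:", int(np.sum(lasso.coef_ != 0)), "of 30")
```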
