Feature selection Flashcards
What is feature selection? Why do we need it? 👶
Feature selection is the process of choosing the subset of input features most relevant for training a model. We need it to remove irrelevant or redundant features, which add noise, increase training time, and can cause the model to under-perform.
Is feature selection important for linear models? ⭐️
Yes, it is. Selecting the most important features and removing irrelevant ones can improve predictive performance and reduce overfitting. With fewer features, a linear model is also easier to interpret, and dropping correlated or noisy features makes its coefficients more stable.
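As a minimal sketch of the idea (assuming scikit-learn, with synthetic data where only 3 of 10 features carry signal), a simple univariate filter can be applied before fitting a linear model:

```python
# Filter-based feature selection before a linear model (scikit-learn sketch).
from sklearn.datasets import make_regression
from sklearn.feature_selection import SelectKBest, f_regression
from sklearn.linear_model import LinearRegression

# 100 samples, 10 features, only 3 of which are informative.
X, y = make_regression(n_samples=100, n_features=10, n_informative=3,
                       random_state=0)

# Keep the 3 features with the strongest univariate F-score.
selector = SelectKBest(score_func=f_regression, k=3)
X_selected = selector.fit_transform(X, y)

model = LinearRegression().fit(X_selected, y)
print(X_selected.shape)  # (100, 3)
```

The choice of `k=3` is for illustration; in practice it would be tuned, e.g. by cross-validation.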
Which feature selection techniques do you know? ⭐️
Here are some feature selection techniques:
Principal Component Analysis (strictly feature extraction: it builds new component features rather than selecting original ones)
Neighborhood Component Analysis
ReliefF Algorithm
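Of the techniques above, PCA is the one with a standard scikit-learn implementation, so a minimal sketch of it (on synthetic data with a deliberately redundant column) looks like this. Note it produces new component features, not a subset of the original columns:

```python
# PCA dimensionality reduction (scikit-learn sketch).
import numpy as np
from sklearn.decomposition import PCA

rng = np.random.default_rng(0)
X = rng.normal(size=(200, 5))
X[:, 1] = 2.0 * X[:, 0]  # make one column fully redundant

# Project onto the 3 directions of highest variance.
pca = PCA(n_components=3)
X_reduced = pca.fit_transform(X)
print(X_reduced.shape)  # (200, 3)
```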
Can we use L1 regularization for feature selection? ⭐️
Yes. The nature of L1 regularization leads to sparse feature coefficients: many are driven exactly to zero. Feature selection can then be done by keeping only the features with non-zero coefficients.
Can we use L2 regularization for feature selection? ⭐️
Not directly. L2 regularization shrinks coefficients toward zero but rarely makes them exactly zero, so it does not produce the sparse solutions needed to discard features.
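A minimal sketch of L1-based selection (assuming scikit-learn's Lasso, on synthetic data where only 3 of 10 features are informative):

```python
# L1 (Lasso) regularization for feature selection (scikit-learn sketch).
import numpy as np
from sklearn.datasets import make_regression
from sklearn.linear_model import Lasso

X, y = make_regression(n_samples=200, n_features=10, n_informative=3,
                       noise=0.1, random_state=0)

# The L1 penalty drives coefficients of uninformative features to exactly 0.
lasso = Lasso(alpha=1.0).fit(X, y)
selected = np.flatnonzero(lasso.coef_)
print(selected)  # indices of features with non-zero coefficients
```

The penalty strength `alpha` controls how aggressively coefficients are zeroed; it is an illustrative value here and would normally be tuned.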