Classification Flashcards
1
Q
Kernel Trick
A
The kernel computes inner products in a high-dimensional feature space without explicitly mapping the data, so a decision boundary that is linear in that space can be non-linear in the input space.
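A minimal numpy sketch (the degree-2 polynomial kernel and the 2-D vectors are just illustrative choices):
import numpy as np

# The degree-2 polynomial kernel (x.z)^2 equals the dot product in an
# explicitly expanded feature space, computed without building that space.
def phi(v):
    # explicit degree-2 feature map for a 2-D vector
    return np.array([v[0]**2, np.sqrt(2) * v[0] * v[1], v[1]**2])

x = np.array([1.0, 2.0])
z = np.array([3.0, 0.5])
print(np.dot(x, z) ** 2)          # kernel value: 16.0
print(np.dot(phi(x), phi(z)))     # same value via the explicit mapping: 16.0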
2
Q
K-Fold Cross Validation
A
- Create K folds of the data
- For each fold k, use fold k as the test set and the remaining folds as the training set.
- The error estimate is the average of the K separate estimates.
This is good for small datasets; a minimal sketch follows.
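A sketch of the loop, assuming scikit-learn and a synthetic dataset (LogisticRegression is just a placeholder classifier):
import numpy as np
from sklearn.datasets import make_classification
from sklearn.linear_model import LogisticRegression

X, y = make_classification(n_samples=100, random_state=0)
K = 5
# Shuffle the sample indices and split them into K folds
folds = np.array_split(np.random.default_rng(0).permutation(len(X)), K)

errors = []
for k in range(K):
    test_idx = folds[k]
    train_idx = np.concatenate([folds[j] for j in range(K) if j != k])
    model = LogisticRegression(max_iter=1000).fit(X[train_idx], y[train_idx])
    errors.append(1 - model.score(X[test_idx], y[test_idx]))

print(np.mean(errors))  # cross-validated error = average over the K folds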
3
Q
Leave-one-out cross validation
A
Like K-Fold cross validation with K equal to the dataset size: each sample is used as the test set exactly once.
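scikit-learn exposes this split directly as LeaveOneOut (assumed here only for illustration):
import numpy as np
from sklearn.model_selection import LeaveOneOut

X = np.arange(8).reshape(4, 2)               # 4 samples -> 4 splits
for train_idx, test_idx in LeaveOneOut().split(X):
    print(train_idx, test_idx)               # each test set holds one sample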
4
Q
SVM
A
- Maximum-margin classifier; relatively robust to the choice of kernel function (see the sketch below)
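A minimal sketch with an RBF kernel on a toy non-linear dataset (make_moons and the hyperparameters are illustrative assumptions):
from sklearn.svm import SVC
from sklearn.datasets import make_moons
from sklearn.model_selection import cross_val_score

# RBF-kernel SVM: the margin is linear in the kernel's feature space,
# but the resulting boundary is curved in the 2-D input space.
X, y = make_moons(n_samples=200, noise=0.2, random_state=0)
clf = SVC(kernel="rbf", C=1.0, gamma="scale")
print(cross_val_score(clf, X, y, cv=5).mean())  # mean 5-fold accuracy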
5
Q
Random Forests
A
- Ensemble of decision trees, each trained on a bootstrap sample of the data
- Each split considers a random subset of the features
- Predictions are combined by majority vote
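A minimal scikit-learn sketch (dataset and hyperparameters are illustrative):
from sklearn.ensemble import RandomForestClassifier
from sklearn.datasets import make_classification
from sklearn.model_selection import cross_val_score

# Bagged decision trees; max_features="sqrt" gives each split a random
# subset of the features.
X, y = make_classification(n_samples=200, random_state=0)
rf = RandomForestClassifier(n_estimators=100, max_features="sqrt", random_state=0)
print(cross_val_score(rf, X, y, cv=5).mean())  # mean 5-fold accuracy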