Multiple-choice: Classification Flashcards

1
Q

Q1 (Classification). What is the maximum number of support vectors for a hard-margin SVM on a two-class dataset of N points? (One choice)
1) 2
2) N
3) N-2
4) N+2
5) N/2
6) 2N

A

Correct item: 2 (N). Explanation: In the worst case, every point lies on the margin, so all N points become support vectors.
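A minimal sketch of that worst case (assuming scikit-learn; the interleaved 1-D data, the RBF kernel, and the very large C used to emulate a hard margin are illustrative choices, not part of the question):

```python
# Sketch: with interleaved classes and a near-hard-margin SVM (very large C),
# every training point can end up constraining the margin, so the number of
# support vectors reaches N.
import numpy as np
from sklearn.svm import SVC

X = np.arange(10, dtype=float).reshape(-1, 1)   # N = 10 points on a line
y = np.tile([0, 1], 5)                          # alternating class labels

clf = SVC(kernel="rbf", C=1e6, gamma=1.0)       # large C ~ hard margin
clf.fit(X, y)

print("N =", len(X), "support vectors =", clf.n_support_.sum())
# Typically prints: N = 10 support vectors = 10 (the worst case: all points)
```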

2
Q

Q2 (Classification). What are the consequences of increasing K in K-nearest neighbors? (Three correct)
1) More overfitting
2) Smoother decision boundary
3) More underfitting
4) Classes with few data points may disappear
5) Classes with many data points may disappear
6) Faster training

A

Correct items: 2, 3, 4. Explanation: A higher K yields a smoother decision boundary, increases the risk of underfitting, and lets small classes be outvoted everywhere in majority voting.
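A minimal sketch of consequences 2-4 (assuming scikit-learn; the imbalanced 1-D toy data is an illustrative choice): once K exceeds roughly twice the minority-class size, that class can no longer win a vote anywhere and disappears from the predictions.

```python
import numpy as np
from sklearn.neighbors import KNeighborsClassifier

# 20 spread-out majority points (class 0), 3 clustered minority points (class 1)
X = np.concatenate([np.linspace(0, 19, 20), [30.0, 30.5, 31.0]]).reshape(-1, 1)
y = np.array([0] * 20 + [1] * 3)

grid = np.linspace(-1, 32, 200).reshape(-1, 1)
for k in (1, 3, 9):
    knn = KNeighborsClassifier(n_neighbors=k).fit(X, y)
    preds = knn.predict(grid)
    print(f"K={k}: minority class predicted somewhere? {bool((preds == 1).any())}")
# Small K: the minority class still claims the region around its points.
# Large K: it is outvoted everywhere (smoother, underfit boundary) and vanishes.
```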

3
Q

Q3 (Classification). Regarding approximate solutions for Bayesian Logistic Regression, which statement is correct? (One choice)
1) There is no relation between the MAP solution and the Laplace approximation
2) The Laplace approximation requires sampling
3) The Laplace approximation is exact if the posterior is Gaussian

A

Correct item: 3. Explanation: The Laplace approximation fits a Gaussian at the posterior mode (the MAP solution), so it is exact when the posterior is itself Gaussian.
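A minimal sketch of the MAP/Laplace connection (NumPy/SciPy assumed; the toy data and the prior precision alpha are illustrative): the Laplace approximation is a Gaussian centred at the MAP weights, with covariance equal to the inverse Hessian of the negative log-posterior at that mode.

```python
import numpy as np
from scipy.optimize import minimize

rng = np.random.default_rng(0)
X = rng.normal(size=(50, 2))                                  # toy inputs
y = (X @ np.array([1.5, -2.0]) > 0).astype(float)             # toy labels
alpha = 1.0                                                   # Gaussian prior precision

def neg_log_posterior(w):
    y_pm = 2 * y - 1                                          # labels in {-1, +1}
    # logistic loss written stably, plus the Gaussian prior term
    return np.sum(np.logaddexp(0.0, -y_pm * (X @ w))) + 0.5 * alpha * w @ w

w_map = minimize(neg_log_posterior, np.zeros(2)).x            # posterior mode (MAP)

p = 1.0 / (1.0 + np.exp(-(X @ w_map)))                        # fitted probabilities
H = X.T @ (X * (p * (1 - p))[:, None]) + alpha * np.eye(2)    # Hessian at the mode

print("Laplace mean (MAP):", w_map)
print("Laplace covariance (inverse Hessian):\n", np.linalg.inv(H))
```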

4
Q

Q4 (Classification). Which statement about performance metrics is correct? (One choice)
1) Specificity and sensitivity are independent
2) If TP rises, TN always rises
3) We want high TP and TN
4) We want high TP and FP

A

Correct item: 3. Explanation: A good classifier maximizes both true positives and true negatives; sensitivity and specificity are not independent, and raising TP does not guarantee raising TN.
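A minimal sketch (plain NumPy; the example labels are made up) of the counts behind these metrics:

```python
import numpy as np

y_true = np.array([1, 1, 1, 0, 0, 0, 0, 1])
y_pred = np.array([1, 1, 0, 0, 0, 1, 0, 1])

tp = np.sum((y_pred == 1) & (y_true == 1))   # true positives
tn = np.sum((y_pred == 0) & (y_true == 0))   # true negatives
fp = np.sum((y_pred == 1) & (y_true == 0))   # false positives
fn = np.sum((y_pred == 0) & (y_true == 1))   # false negatives

sensitivity = tp / (tp + fn)                 # true-positive rate (recall)
specificity = tn / (tn + fp)                 # true-negative rate

print(f"TP={tp} TN={tn} FP={fp} FN={fn}")
print(f"sensitivity={sensitivity:.2f} specificity={specificity:.2f}")
# We want TP and TN high together; sensitivity and specificity trade off
# against each other as the decision threshold moves, so they are not independent.
```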

5
Q

Q5 (Classification). Which statement about the Bayes classifier is true? (One choice)
1) Its training is faster than k-NN's
2) It has the same decision boundary as an SVM
3) Naive Bayes has a faster training phase than k-NN
4) The naive (conditional independence) assumption makes Naive Bayes training more efficient

A

Correct item: 4. Explanation: The conditional independence assumption lets Naive Bayes estimate each feature's parameters separately per class, making training simpler and more efficient.
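A minimal sketch (plain NumPy; Bernoulli features and Laplace smoothing are illustrative choices) of why the naive assumption makes training cheap: fitting reduces to per-class, per-feature counting, with no iterative optimisation.

```python
import numpy as np

X = np.array([[1, 0, 1],
              [1, 1, 0],
              [0, 0, 1],
              [0, 1, 1]])                    # binary features
y = np.array([1, 1, 0, 0])

classes = np.unique(y)
priors = {c: np.mean(y == c) for c in classes}
# One Bernoulli parameter per (class, feature), estimated by counting
# (with Laplace smoothing) -- this is the whole "training phase".
theta = {c: (X[y == c].sum(axis=0) + 1) / ((y == c).sum() + 2) for c in classes}

def predict(x):
    # log P(c) + sum_j log P(x_j | c): the sum over features is valid only
    # because they are assumed conditionally independent given the class.
    scores = {c: np.log(priors[c])
                 + np.sum(x * np.log(theta[c]) + (1 - x) * np.log(1 - theta[c]))
              for c in classes}
    return max(scores, key=scores.get)

print(predict(np.array([1, 1, 1])))          # -> 1
```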
