Similarity-Based Learning Flashcards

1
Q

True or False: k-NN is robust to outliers.

A

True.
Because the prediction is a vote over a neighborhood of instances rather than a fit to every individual point, a single outlier has limited influence (especially when k > 1).

2
Q

True or False: k-NN is always robust to missing values.

A

False.
k-NN is only somewhat robust to missing values, and only when there are not too many of them spread across the various descriptive features.

3
Q

True or False: The sensitivity of k-NN to noise is dependent on the value of k.

A

True.
When k is small, k-NN is very sensitive to noise: a single noisy or mislabeled neighbor can change the prediction.
Increasing k makes the algorithm less sensitive to noise, but if k grows too large the neighborhood also smooths over genuine local structure, which can reduce accuracy.
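For illustration (not part of the original card), here is a minimal sketch assuming scikit-learn and NumPy are available; the synthetic dataset, the 15% label-flip rate, and the two k values are arbitrary choices made for this example:

# Inject label noise into the training set, then compare test accuracy
# for a small and a large k. In this setup the larger k is typically
# less affected by the flipped labels.
import numpy as np
from sklearn.datasets import make_classification
from sklearn.model_selection import train_test_split
from sklearn.neighbors import KNeighborsClassifier

X, y = make_classification(n_samples=600, n_features=5, random_state=0)
X_train, X_test, y_train, y_test = train_test_split(X, y, random_state=0)

# Flip 15% of the training labels to simulate class noise.
rng = np.random.default_rng(0)
flip = rng.random(len(y_train)) < 0.15
y_noisy = np.where(flip, 1 - y_train, y_train)

for k in (1, 15):
    acc = KNeighborsClassifier(n_neighbors=k).fit(X_train, y_noisy).score(X_test, y_test)
    print(f"k={k:2d}  test accuracy = {acc:.3f}")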

4
Q

True or False: The sensitivity of k-NN to imbalanced data is independent of k.

A

False.
When k is small, k-NN is less sensitive to class imbalance.
When k is large, it is more sensitive: as k increases, the majority class increasingly outvotes the minority class within the neighborhood.
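A small sketch of this effect, assuming scikit-learn; the one-dimensional toy data and the query point are arbitrary illustrations, not from the card:

# One minority instance (class 1) sits right next to the query, while six
# majority instances (class 0) lie a little further away. With k=1 the local
# minority point decides the prediction; with k=7 the majority outvotes it.
import numpy as np
from sklearn.neighbors import KNeighborsClassifier

X = np.array([[0.0], [1.0], [1.2], [1.4], [1.6], [1.8], [2.0]])
y = np.array([1, 0, 0, 0, 0, 0, 0])

for k in (1, 7):
    pred = KNeighborsClassifier(n_neighbors=k).fit(X, y).predict([[0.1]])
    print(f"k={k}: predicted class {pred[0]}")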

5
Q

What is inductive bias in the context of k-NN? How can it be mitigated?

A

The inductive bias of k-NN is the assumption that instances that are close to each other in the feature space belong to the same class.

To mitigate cases where this assumption misleads the model (e.g., under class imbalance): undersample the majority class using Tomek links, or oversample the minority class using SMOTE. This resolves the imbalance.

Another method is Shepard’s method (inverse distance weighting): each neighbor’s vote is penalized (down-weighted) according to its distance from the query instance.
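A minimal plain-NumPy sketch of such an inverse-distance-weighted vote; the function name predict_idw, the eps constant, and the toy data are illustrative choices, not part of the card:

import numpy as np

def predict_idw(X_train, y_train, query, k=5, eps=1e-9):
    # Classify `query` by an inverse-distance-weighted vote of its k nearest neighbors.
    dists = np.linalg.norm(X_train - query, axis=1)   # Euclidean distance to every training instance
    nearest = np.argsort(dists)[:k]                   # indices of the k closest instances
    weights = 1.0 / (dists[nearest] + eps)            # closer neighbors get larger weights
    votes = {}
    for label, w in zip(y_train[nearest], weights):
        votes[label] = votes.get(label, 0.0) + w
    return max(votes, key=votes.get)                  # class with the largest weighted vote

# The two distant class-1 neighbors are outvoted by the single nearby class-0 neighbor.
X = np.array([[0.0, 0.0], [5.0, 5.0], [6.0, 6.0]])
y = np.array([0, 1, 1])
print(predict_idw(X, y, np.array([0.5, 0.5]), k=3))   # -> 0

scikit-learn's KNeighborsClassifier(weights='distance') applies the same weighting idea out of the box.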

6
Q

True or False: There is no training involved in k-NN.

A

True.
k-NN is a lazy learner: there is no explicit training phase. The algorithm simply stores the training instances and defers computation until a prediction is required for a query instance.

7
Q

Complete: The k in k-d trees represents ____ but the k in the k-NN algorithm represents ____

A

The k in k-d trees represents the number of descriptive features used to represent each instance (i.e., the dimensionality of the feature space).

The k in the k-NN algorithm represents the number of training instances most similar to the query instance that are used to make the prediction.
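A short sketch of the distinction, assuming scikit-learn's KDTree; the random data and the query point are arbitrary:

import numpy as np
from sklearn.neighbors import KDTree

# The k of the k-d tree is the number of descriptive features (3 here);
# the k passed to query() is the number of nearest training instances returned.
X_train = np.random.default_rng(0).random((100, 3))
tree = KDTree(X_train)

query = np.array([[0.5, 0.5, 0.5]])
dist, ind = tree.query(query, k=5)   # distances and indices of the 5 nearest training instances
print(ind)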

8
Q

True or False: If the k-NN algorithm is applied on a dataset with instances that only have continuous descriptive features and one nominal target feature, the resulting model is a regression model.

A

False.
The k-NN algorithm applied to a dataset with continuous descriptive features and a nominal (categorical) target feature results in a classification model, not a regression model. In this case, the algorithm predicts the category (or class) of the target feature based on the majority class among the k-nearest neighbors.
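A minimal sketch assuming scikit-learn, contrasting the two cases; the toy data are arbitrary illustrations:

import numpy as np
from sklearn.neighbors import KNeighborsClassifier, KNeighborsRegressor

X = np.array([[1.0], [2.0], [3.0], [10.0], [11.0], [12.0]])
y_nominal = np.array(["low", "low", "low", "high", "high", "high"])   # nominal target -> classification
y_continuous = np.array([1.1, 2.0, 2.9, 10.2, 11.0, 11.8])            # continuous target -> regression

clf = KNeighborsClassifier(n_neighbors=3).fit(X, y_nominal)
reg = KNeighborsRegressor(n_neighbors=3).fit(X, y_continuous)
print(clf.predict([[2.5]]))   # majority class among the 3 nearest neighbors -> "low"
print(reg.predict([[2.5]]))   # mean of the 3 nearest target values -> 2.0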
