k-Nearest Neighbors Flashcards

1
Q

What if k is 4 and the two classes are tied with 2 votes each? How do we decide which class the point goes to?

A

Reduce k until there is a unique winner, which means throwing out the farthest neighbor and voting again.
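This tie-breaking rule can be sketched as a recursive vote; a minimal sketch, assuming the neighbor labels are passed in order from nearest to farthest:

```python
from collections import Counter

def majority_vote(labels):
    """labels must be ordered from nearest to farthest.
    On a tie, drop the farthest label and re-vote."""
    vote_counts = Counter(labels)
    winner, winner_count = vote_counts.most_common(1)[0]
    num_winners = len([count for count in vote_counts.values()
                       if count == winner_count])
    if num_winners == 1:
        return winner                      # unique winner
    return majority_vote(labels[:-1])      # throw out the farthest, try again

# k = 4 with a 2-2 tie: the farthest neighbor ('b') is dropped, 'a' wins
print(majority_vote(['a', 'b', 'a', 'b']))  # -> 'a'
```

Because k only shrinks, the recursion always terminates: with a single remaining label the winner is trivially unique.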

2
Q

How to evaluate if knn is the best option for the data?

A

Plot it, coloring each point by its label. If nearby points tend to share the same label (e.g., nearby places tend to favor the same language), k-nearest neighbors seems like a reasonable choice for a predictive model.
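A numeric complement to eyeballing the plot is to check how often a point's nearest neighbor shares its label; a hedged sketch with made-up example data (the points, labels, and helper name are illustrative, not from the original):

```python
import math

def nearest_neighbor_agreement(points, labels):
    """Fraction of points whose nearest neighbor carries the same label.
    High agreement suggests kNN is a reasonable model for the data."""
    agree = 0
    for i, p in enumerate(points):
        # index of the closest other point
        nearest = min((j for j in range(len(points)) if j != i),
                      key=lambda j: math.dist(p, points[j]))
        agree += labels[nearest] == labels[i]
    return agree / len(points)

# two well-separated clusters: every nearest neighbor shares its label
points = [(0, 0), (0, 1), (1, 0), (10, 10), (10, 11), (11, 10)]
labels = ['python', 'python', 'python', 'java', 'java', 'java']
print(nearest_neighbor_agreement(points, labels))  # -> 1.0
```

An agreement near 1.0 matches the "nearby places like the same language" intuition; values near chance level suggest kNN will not help.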

3
Q

Explain the curse of dimensionality!

A

Since kNN is based on distance computation, it runs into trouble in high dimensions. The more dimensions you have, the closer your points have to be in every single dimension; otherwise they are not really closer than average. The ratio between the nearest-neighbor distance and the average distance nearly evaporates, so two points being "close" doesn't mean much anymore. A possible remedy is to apply dimensionality reduction beforehand.
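The evaporating ratio can be demonstrated with a small simulation; a sketch assuming uniformly random points in the unit cube (the function names and parameters are illustrative):

```python
import math
import random

def random_point(dim):
    """A uniformly random point in the unit cube [0, 1]^dim."""
    return [random.random() for _ in range(dim)]

def min_vs_mean_distance(dim, num_points=100, seed=0):
    """Ratio of the smallest pairwise distance to the mean pairwise
    distance among random points. As dim grows, this ratio approaches 1:
    no pair is meaningfully closer than average."""
    random.seed(seed)
    points = [random_point(dim) for _ in range(num_points)]
    distances = [math.dist(p, q)
                 for i, p in enumerate(points)
                 for q in points[i + 1:]]
    return min(distances) / (sum(distances) / len(distances))

for dim in [1, 10, 100]:
    print(dim, round(min_vs_mean_distance(dim), 3))
```

In one dimension the minimum distance is a tiny fraction of the average; in a hundred dimensions the two are of comparable size, which is exactly why "nearest" neighbors stop being informative.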
