kNN Classification Flashcards

1
Q

What is DL?

A

DL is ML with the learning process handled by a neural network (NN).
- Layered architecture of computational units.
- High-capacity models possible, called ANNs or DNNs.
- Restricted to DNNs (more than one hidden layer).

2
Q

What are S+C cells?

A

Simple → Complex → Hypercomplex cells

3
Q

What is Marr’s theory on visual cortex?

A
  • S+C cells compute a primal sketch
  • The brain computes a 2.5D sketch using texture
  • A 3D model is computed
4
Q

What are texture-based descriptors?

A

Methods that can identify and use texture patterns for tasks like image recognition or classification.
- SIFT, SURF, HoG

Texture - visual patterns

5
Q

What is Representation learning?

A

Learn both the representation and how to produce the required output.

Refers to ML techniques focused on learning and extracting meaningful representations or features from raw data.

6
Q

What are well known datasets?

A
  • MNIST → small and fast
  • PASCAL → classification, detection and more
  • ImageNet → classification + localization, detection
  • COCO → detection, segmentation

Both YOLO and SSD are able to do real-time detection.

7
Q

What types of ML tasks are there?

A
  • Classification
  • Regression
  • Compression
  • Clustering
8
Q

What advances does DL have?

(Maybe redundant?)

A
  • Sequence (e.g. speech) processing
  • Machine translation
  • Image segmentation
  • Image captioning
9
Q

What does Bayes rule say?

A

p(C_k|x) = p(x|C_k) p(C_k) / p(x)
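The rule can be sanity-checked numerically. The counts below are made up for illustration (they are not from the cards): 100 samples total, 30 in class C_1, a neighborhood of x containing 20 samples, 12 of which are in C_1.

```python
# Illustrative check of Bayes' rule with made-up counts.
p_C = 30 / 100          # prior p(C_1): fraction of all samples in C_1
p_x_given_C = 12 / 30   # likelihood p(x|C_1): fraction of C_1 near x
p_x = 20 / 100          # evidence p(x): fraction of all samples near x

posterior = p_x_given_C * p_C / p_x   # Bayes' rule
print(posterior)  # 0.6 — the same as 12/20, the fraction of neighbors in C_1
```

Note that the posterior equals 12/20, i.e. the share of the neighborhood belonging to C_1; this is exactly the cancellation kNN exploits.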

10
Q

What is AI?

A
  • The concept of “human-like” machines.
  • Many, many subdisciplines
11
Q

What is ML?

A
  • Focus on data
  • Instead of being designed, the machine learns a model from the data.
12
Q

What do texture based (local) descriptors do?

A

Go from image gradients to keypoint descriptor

13
Q

How does testing work in the kNN classifier?

A

For each data point (image), find the k nearest training examples.
Decide on label by majority vote.
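The two steps above can be sketched in a few lines. This is a minimal illustration (toy data and function names are assumptions, not from the cards); "training" is just keeping `X_train` and `y_train` around.

```python
import numpy as np
from collections import Counter

def knn_predict(X_train, y_train, x, k=3):
    """Classify x by majority vote among its k nearest training points."""
    # Euclidean distance from x to every stored training example
    dists = np.linalg.norm(X_train - x, axis=1)
    nearest = np.argsort(dists)[:k]             # indices of the k closest points
    votes = Counter(y_train[i] for i in nearest)
    return votes.most_common(1)[0][0]           # majority label

# Tiny toy set: two classes on a line
X_train = np.array([[0.0], [0.1], [0.2], [1.0], [1.1], [1.2]])
y_train = np.array([0, 0, 0, 1, 1, 1])
print(knn_predict(X_train, y_train, np.array([0.15])))  # 0
print(knn_predict(X_train, y_train, np.array([1.05])))  # 1
```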

14
Q

How does training work in the kNN classifier?

A

Simply store the images and labels.

15
Q

What parameters are needed for kNN?

A

Data point x → given class C → result
k nearest data points → hyperparameter
Total number of training data points N → countable

16
Q

For data point x, how do you decide on its class C?

A

For each class C_k, calculate:
p(C_k|x)
The winning class is the one with the highest posterior probability.
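In kNN the posterior for each class reduces to the fraction of the k neighbors carrying that class's label, so the decision is a simple argmax. A minimal sketch (the helper name and labels are assumptions for illustration):

```python
from collections import Counter

def knn_posteriors(neighbor_labels):
    """p(C_k|x) ~ K_k / K: fraction of the k neighbors belonging to class k."""
    k = len(neighbor_labels)
    counts = Counter(neighbor_labels)
    return {c: n / k for c, n in counts.items()}

# Suppose the k = 5 nearest neighbors of x carry these labels:
post = knn_posteriors(["cat", "cat", "dog", "cat", "dog"])
print(post)                       # {'cat': 0.6, 'dog': 0.4}
winner = max(post, key=post.get)  # class with the highest posterior
print(winner)                     # cat
```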

17
Q

How do you calculate the probability of x conditioned on class k?

A

p(x|C_k)= K_k / N_k

K_k → Number of the k neighbors in that class, N_k → Total number of entities in that class

18
Q

How do you calculate the unconditioned probability of x?

A

p(x)= K / N

K → hyperparameter, N → Total number of entities

19
Q

What is class prior?

A

Unconditional probability of C_k

Probability of encountering instances of class C_k in the absence of specific information about the features.

20
Q

How do you calculate the unconditional probability of C_k?

A

p(C_k)= N_k / N

N_k → Total number of entities in that class
N → Total number of entities

21
Q

What is the final solution of kNN?

A

p(C_k|x) = p(x|C_k) p(C_k) / p(x) = K_k / K
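Substituting the three estimates from the previous cards into Bayes' rule shows where the cancellation comes from:

```latex
p(C_k \mid x)
  = \frac{p(x \mid C_k)\, p(C_k)}{p(x)}
  = \frac{\dfrac{K_k}{N_k} \cdot \dfrac{N_k}{N}}{\dfrac{K}{N}}
  = \frac{K_k}{K}
```

The class sizes N_k and the total N cancel, leaving only the fraction of the k neighbors that belong to class C_k.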

22
Q

What are some rules you should know?

A
  • Don't make k too large (the largest class always wins)
  • k should be an odd number
  • Flip a coin if tied
  • Low values like 1 or 2 can be noisy (sensitive to outliers)
  • Use the elbow method to compare the accuracies for different values of k
  • Use Euclidean distance
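The elbow method from the list above amounts to scanning odd values of k and comparing held-out accuracy. A self-contained sketch on synthetic data (the two-blob dataset and the 1-in-5 validation split are assumptions for illustration):

```python
import numpy as np

rng = np.random.default_rng(0)

# Synthetic 2-class data: two well-separated Gaussian blobs
X = np.vstack([rng.normal(0, 1, (50, 2)), rng.normal(3, 1, (50, 2))])
y = np.array([0] * 50 + [1] * 50)

# Hold out every 5th point as a validation set
val = np.arange(len(X)) % 5 == 0
X_tr, y_tr, X_val, y_val = X[~val], y[~val], X[val], y[val]

def accuracy_for_k(k):
    """Validation accuracy of kNN (Euclidean distance, majority vote)."""
    correct = 0
    for xv, yv in zip(X_val, y_val):
        dists = np.linalg.norm(X_tr - xv, axis=1)
        nearest = y_tr[np.argsort(dists)[:k]]  # labels of the k closest points
        pred = np.bincount(nearest).argmax()   # majority vote
        correct += pred == yv
    return correct / len(X_val)

# "Elbow": compare accuracy across odd values of k, pick where it levels off
for k in (1, 3, 5, 7, 9):
    print(k, accuracy_for_k(k))
```

Only odd k values are scanned, matching the tie-avoidance rule above.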
23
Q

Example of image tasks?

A
  • Image classification
  • Single-object localization
  • Object detection
24
Q

What is the Canny edge detector?

A
  • Very early computational approach to vision.
  • Finds strong intensity discontinuities in grayscale images
25
Q

What does the Harris corner detector do?

A

Uses local pixel statistics to find corners instead of edges.

26
Q

What does KNN do?

A

Makes decision boundaries in feature space.