Classification 1 Flashcards

1
Q

Perceptron

A

Error-driven learning approach similar to logistic regression
Decision rule: predict positive if w^T x > 0
Update rule (only when the prediction is incorrect):
- true label positive: w <- w + x
- true label negative: w <- w - x
- one update per training instance
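The update rule above can be sketched as follows (a minimal illustration; the function name and toy data are not from the cards):

```python
import numpy as np

def perceptron_train(X, y, epochs=10):
    """Train a perceptron; labels y are in {+1, -1}."""
    w = np.zeros(X.shape[1])
    for _ in range(epochs):
        for x_i, y_i in zip(X, y):
            if y_i * (w @ x_i) <= 0:   # incorrect (or on the boundary)
                w += y_i * x_i         # +x for positive labels, -x for negative
    return w

# toy linearly separable data (illustrative)
X = np.array([[2.0, 1.0], [1.0, 3.0], [-2.0, -1.0], [-1.0, -2.0]])
y = np.array([1, 1, -1, -1])
w = perceptron_train(X, y)
preds = np.where(X @ w > 0, 1, -1)     # decision rule: w^T x > 0
```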

2
Q

Loss

A

Classification can be viewed as minimizing a loss function over the training data.
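As one concrete example (not from the card itself), the log loss used by logistic regression penalizes confident wrong predictions heavily and confident correct ones lightly:

```python
import math

def log_loss(score, y):
    """Log loss for one example; y in {+1, -1}, score = w^T x."""
    return math.log(1.0 + math.exp(-y * score))

low = log_loss(5.0, 1)    # confident, correct -> small loss
high = log_loss(-5.0, 1)  # confident, wrong -> large loss
```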

3
Q

Online Optimization

A

Gradient computed on a single randomly sampled example per update (stochastic gradient descent)
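A minimal sketch of this, using logistic regression as the model (the data, learning rate, and function names are illustrative assumptions):

```python
import numpy as np

rng = np.random.default_rng(0)

def sgd_step(w, x, y, lr=0.1):
    """One update from the log-loss gradient on a single example; y in {0, 1}."""
    p = 1.0 / (1.0 + np.exp(-(w @ x)))  # predicted P(y=1|x)
    return w - lr * (p - y) * x         # gradient step on this one example

X = np.array([[1.0, 2.0], [1.0, -2.0]])  # toy data: bias feature + one signal
y = np.array([1, 0])
w = np.zeros(2)
for _ in range(200):
    i = rng.integers(len(X))            # randomly sample one example
    w = sgd_step(w, X[i], y[i])
probs = 1.0 / (1.0 + np.exp(-(X @ w)))
```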

4
Q

Batch Optimization

A

Gradient computed over the entire dataset per update. Inefficient for large training sets
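For contrast with the online case, a sketch of a full-batch gradient step (toy data and hyperparameters are illustrative):

```python
import numpy as np

def batch_gradient(w, X, y):
    """Mean log-loss gradient over ALL examples; y in {0, 1}."""
    p = 1.0 / (1.0 + np.exp(-(X @ w)))
    return X.T @ (p - y) / len(y)

X = np.array([[1.0, 2.0], [1.0, -2.0], [1.0, 3.0], [1.0, -3.0]])
y = np.array([1, 0, 1, 0])
w = np.zeros(2)
for _ in range(1000):
    w -= 0.5 * batch_gradient(w, X, y)  # one update = one full pass over the data
preds = (X @ w > 0).astype(int)
```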

5
Q

Mini-batch Optimization

A

Gradients computed from a small random subset (mini-batch) of examples at each step. The most commonly used approach in practice
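A sketch of the mini-batch variant; note that `batch_size = 1` recovers online optimization and `batch_size = len(X)` recovers batch optimization (data and settings are illustrative):

```python
import numpy as np

rng = np.random.default_rng(0)

def minibatch_gradient(w, X, y):
    """Mean log-loss gradient over the given subset; y in {0, 1}."""
    p = 1.0 / (1.0 + np.exp(-(X @ w)))
    return X.T @ (p - y) / len(y)

X = np.array([[1.0, 2.0], [1.0, -2.0], [1.0, 3.0], [1.0, -3.0]])
y = np.array([1, 0, 1, 0])
w, batch_size = np.zeros(2), 2
for _ in range(1000):
    idx = rng.choice(len(X), size=batch_size, replace=False)  # small random subset
    w -= 0.5 * minibatch_gradient(w, X[idx], y[idx])
preds = (X @ w > 0).astype(int)
```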

6
Q

Regularization

A

Keeping weights small to prevent overfitting, e.g. by applying an L2-norm penalty to all weights
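A sketch of the L2 case: the penalty lambda * ||w||^2 adds 2 * lambda * w to the gradient, which shrinks the weights toward zero (data and hyperparameters are illustrative):

```python
import numpy as np

def l2_regularized_gradient(w, X, y, lam):
    """Log-loss gradient plus the L2 penalty term; y in {0, 1}."""
    p = 1.0 / (1.0 + np.exp(-(X @ w)))
    return X.T @ (p - y) / len(y) + 2.0 * lam * w  # data term + penalty term

X = np.array([[1.0, 2.0], [1.0, -2.0]])
y = np.array([1, 0])

def fit(lam, steps=500, lr=0.5):
    w = np.zeros(2)
    for _ in range(steps):
        w -= lr * l2_regularized_gradient(w, X, y, lam)
    return w

w_plain = fit(lam=0.0)
w_reg = fit(lam=0.1)  # regularized weights have smaller norm
```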

7
Q

Evaluation

A

Precision, recall, and accuracy are computed from the counts of true/false positives and true/false negatives

8
Q

Precision Equation

A

precision = tp / (tp+fp)

9
Q

Recall Equation

A

recall = tp / (tp+fn)

10
Q

Accuracy Equation

A

accuracy = (tp+tn) / (tp+fp+tn+fn)
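The three equations together, on illustrative counts (the numbers are made up for the sketch):

```python
# hypothetical confusion-matrix counts
tp, fp, tn, fn = 8, 2, 85, 5

precision = tp / (tp + fp)                  # of predicted positives, how many are right
recall = tp / (tp + fn)                     # of actual positives, how many were found
accuracy = (tp + tn) / (tp + fp + tn + fn)  # fraction of all predictions that are right
```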

11
Q

Generative Models

A
  • probabilistic models of the joint distribution P(x, y)
  • classify by predicting argmax_y P(y|x), derived from P(x, y) via Bayes' rule
12
Q

Discriminative Models

A
  • model P(y|x) directly; classify by computing argmax_y P(y|x)
13
Q

Naive Bayes

A
  • generative model; assumes features are conditionally independent given the label
  • classifies by predicting argmax_y P(y|x) = argmax_y P(y) * prod_j P(x_j|y)
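A minimal Naive Bayes sketch for binary features (the data, smoothing, and function names are illustrative assumptions, not from the cards):

```python
import numpy as np

# toy binary-feature data
X = np.array([[1, 1, 0], [1, 0, 0], [0, 1, 1], [0, 0, 1]])
y = np.array([1, 1, 0, 0])

def nb_fit(X, y, alpha=1.0):
    """Estimate log P(y) and Bernoulli P(x_j=1|y) with add-one smoothing."""
    classes = np.unique(y)
    log_prior = np.log(np.array([(y == c).mean() for c in classes]))
    theta = np.array([(X[y == c].sum(axis=0) + alpha) /
                      ((y == c).sum() + 2 * alpha) for c in classes])
    return classes, log_prior, theta

def nb_predict(x, classes, log_prior, theta):
    # argmax_y log P(y) + sum_j log P(x_j|y), i.e. argmax_y P(y|x)
    log_lik = (x * np.log(theta) + (1 - x) * np.log(1 - theta)).sum(axis=1)
    return classes[np.argmax(log_prior + log_lik)]

classes, log_prior, theta = nb_fit(X, y)
pred = nb_predict(np.array([1, 1, 0]), classes, log_prior, theta)
```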