Linear classification Flashcards

1
Q

Data set

A

x: a vector of d attributes
x = [x1, x2, …, xd]
target function: y = f(x)
data set: {(x1, y1), …, (xN, yN)}, N examples
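A minimal sketch (NumPy assumed, values invented for illustration) of how such a data set is commonly stored:

    import numpy as np

    # N = 4 examples, each with d = 3 attributes (made-up numbers)
    X = np.array([[ 0.2,  1.5, -0.3],
                  [ 1.1,  0.4,  0.9],
                  [-0.7,  2.0,  0.1],
                  [ 0.5, -1.2,  0.8]])   # shape (N, d): row n is x_n
    y = np.array([+1, -1, +1, -1])       # labels y_n = f(x_n), here binary (+1/-1)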

2
Q

Hypothesis set

A

Perceptron:
h(x) = sign(sum(i=1,d) wi*xi + b)
linear combination of the attributes, plus a bias b

  • equivalently: h(x) = sign(sum(i=0,d) wi*xi)
    where w0 = b and x0 = 1
  • unknowns: w = [w0, w1, …, wd]'
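A minimal sketch of this hypothesis in NumPy (illustrative weights and input; the bias is folded in as w0 with x0 = 1):

    import numpy as np

    def perceptron_h(w, x):
        # h(x) = sign(w . x), with x already augmented by x0 = 1
        return 1 if np.dot(w, x) >= 0 else -1   # tie at 0 sent to +1 by convention

    w = np.array([-0.5, 1.0, 2.0])   # [w0 (= bias b), w1, w2]
    x = np.array([ 1.0, 0.3, 0.4])   # [x0 = 1, x1, x2]
    print(perceptron_h(w, x))        # -> 1, since w.x = 0.6 >= 0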
3
Q

Learning algorithm

A

Perceptron Learning Algorithm (PLA)

Main assumption: there exists a separating hyperplane (the data must be linearly separable)

Idea:

  • pick a misclassified point (x(i), y(i))
  • update w so that this point moves toward the correct side of the boundary
  • update rule: w(i+1) = w(i) + y(i)*x(i)
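A compact sketch of this loop, assuming NumPy, inputs already augmented with x0 = 1, and linearly separable data (otherwise it only stops at max_iters):

    import numpy as np

    def pla(X, y, max_iters=1000):
        # X: (N, d+1) augmented inputs, y: labels in {-1, +1}
        w = np.zeros(X.shape[1])
        for _ in range(max_iters):
            preds = np.where(X @ w >= 0, 1, -1)
            wrong = np.flatnonzero(preds != y)   # misclassified points
            if wrong.size == 0:                  # Ein(h) = 0: done
                return w
            i = wrong[0]                         # pick one misclassified point
            w = w + y[i] * X[i]                  # update: w <- w + y_i * x_i
        return w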
4
Q

Convergence of the PLA

A

If the data are linearly separable, the PLA converges to a perfect classification in a finite number of iterations
=> Ein(h) = 0

5
Q

Generalization of PLA

A

Eout(h) = P[h(x) != f(x)]
Eout(h) <= Ein(h) + O( sqrt( (d/N)*ln(N) ) ), with high probability
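A small worked check, assuming for illustration d = 3 and taking the O(·) term as the bare square root, to see how the error bar shrinks with N:

    import math

    d = 3
    for N in (100, 1000, 10000):
        bar = math.sqrt(d / N * math.log(N))   # sqrt((d/N) * ln N)
        print(N, round(bar, 3))                # 100 -> 0.372, 1000 -> 0.144, 10000 -> 0.053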

6
Q

Confusion matrix

A

[ a  b
  c  d ]
(rows: predicted class, columns: actual class)

a: true positives (TP)
b: false positives (FP)
c: false negatives (FN)
d: true negatives (TN)
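A minimal sketch of counting these four entries from true and predicted +/-1 labels (NumPy assumed, toy labels invented for illustration):

    import numpy as np

    y_true = np.array([+1, +1, -1, -1, +1, -1])
    y_pred = np.array([+1, -1, -1, +1, +1, -1])

    a = np.sum((y_pred == +1) & (y_true == +1))   # true positives
    b = np.sum((y_pred == +1) & (y_true == -1))   # false positives
    c = np.sum((y_pred == -1) & (y_true == +1))   # false negatives
    d = np.sum((y_pred == -1) & (y_true == -1))   # true negatives
    print(np.array([[a, b], [c, d]]))             # [[2 1] [1 2]]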

7
Q

Precision

A

Precision, or positive predictive value:

PPV = a/(a+b) = TP/(TP + FP)

8
Q

Recall

A

Recall, or true positive rate:

TPR = a/(a+c) = TP/(TP + FN)

9
Q

F1-score

A

F1 = 2*PPV*TPR / (PPV + TPR), the harmonic mean of precision and recall
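Carrying on the toy counts from the confusion-matrix sketch above (a = 2, b = 1, c = 1), the three indexes work out as:

    a, b, c = 2, 1, 1                    # TP, FP, FN from the sketch above
    ppv = a / (a + b)                    # precision = 2/3
    tpr = a / (a + c)                    # recall    = 2/3
    f1  = 2 * ppv * tpr / (ppv + tpr)    # F1        = 2/3
    print(round(ppv, 3), round(tpr, 3), round(f1, 3))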

10
Q

Performance indexes

A
  • Precision, or positive predictive value (PPV)
  • Recall, or true positive rate (TPR)
  • F1-score