Week 7 Flashcards

1
Q

What is the form of each h in H in the linear model for classifying data?

A

h(x) = sign(w^T x)
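
A minimal sketch of this hypothesis in NumPy (the example weight vector and the x0 = 1 bias convention are illustrative assumptions, not from the card):

```python
import numpy as np

# Hypothetical weight vector; w[0] is the bias weight for the constant x0 = 1.
w = np.array([-0.5, 2.0, 1.0])

def h(x):
    """Linear-model classifier: h(x) = sign(w^T x), with x0 = 1 prepended."""
    x = np.concatenate(([1.0], x))  # add the constant coordinate x0 = 1
    return np.sign(w @ x)           # note: np.sign returns 0 exactly on the boundary

print(h(np.array([0.3, 0.1])))  # -> 1.0
```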

2
Q

Give the formula for the out-of-sample error using the VC generalization bound and the bound on the growth function in terms of the VC dimension:

A

E.out(g) <=

E.in(g) + O( sqrt( (d/N) * ln N ) )

where O(x) means the quantity's absolute value is at most a multiple of x, d = VC dimension and N = sample size.
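
For reference, a sketch of where that O(·) comes from, using the standard textbook statements (the 8/N and 4 m_H(2N)/delta constants are the usual ones, stated from memory rather than from the card); the bound holds with probability at least 1 - delta:

```latex
% VC generalization bound:
E_{\text{out}}(g) \;\le\; E_{\text{in}}(g) + \sqrt{\frac{8}{N}\,\ln\frac{4\,m_{\mathcal{H}}(2N)}{\delta}}
% Growth-function bound in terms of the VC dimension d:
m_{\mathcal{H}}(N) \;\le\; N^{d} + 1
% Substituting the second into the first gives the card's form:
E_{\text{out}}(g) \;\le\; E_{\text{in}}(g) + O\!\left(\sqrt{\frac{d}{N}\,\ln N}\right)
```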

3
Q

What is forward propagation?

A

Algorithmic computation of h(x).

4
Q

What is backward propagation?

A

Algorithmic computation of the partial derivatives ∂e(w) / ∂w(l).i,j of the per-example error with respect to the weights.
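
A minimal sketch of that computation for a single example, using the standard backpropagation recursion; the tanh activation, the squared error e(w) = (h(xn) - yn)^2 from the later cards, and the tiny 2-1-1 architecture are assumptions for illustration:

```python
import numpy as np

theta = np.tanh                      # soft threshold; theta'(s) = 1 - theta(s)^2

# Hypothetical 2-1-1 network: each W[l] has shape (d(l-1)+1) x d(l).
W = [None,
     np.array([[0.1], [0.2], [-0.3]]),   # W(1): 3 x 1 (bias + 2 inputs -> 1 node)
     np.array([[0.4], [0.5]])]           # W(2): 2 x 1 (bias + 1 node -> 1 output)

x_n, y_n = np.array([1.0, 0.5, -1.0]), 1.0   # input with bias x0 = 1 prepended

# Forward pass: store s(l) and x(l) for every layer.
x, s = [x_n], [None]
for l in (1, 2):
    s.append(W[l].T @ x[l - 1])
    x.append(np.concatenate(([1.0], theta(s[l]))))

h = x[2][1]                          # network output h(x_n)

# Backward pass: delta(l) = d e / d s(l), starting from the output layer.
delta = [None, None, None]
delta[2] = 2 * (h - y_n) * (1 - theta(s[2]) ** 2)
delta[1] = (1 - theta(s[1]) ** 2) * (W[2][1:] @ delta[2])   # drop the bias row

# Gradient: d e / d W(l) = x(l-1) delta(l)^T  (an outer product).
grad = {l: np.outer(x[l - 1], delta[l]) for l in (1, 2)}
print(grad[1].shape, grad[2].shape)   # (3, 1) (2, 1) -- match W(1), W(2)
```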

5
Q

How is the hypothesis h(x) computed in forward propagation?

A

For every layer l, add a bias input 1, so x(l) = [1, theta(s(l))].
Feed the outputs of the previous layer l-1 in as the input vector of the current layer l.
Calculate s(l).j and s(l) = W(l)^T * x(l-1), apply theta, and repeat until the output layer L, where h(x) = x(L).
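
A minimal sketch of these steps in NumPy; the tanh choice of theta and the random weights are assumptions, and the shapes follow the (d(l-1)+1) x d(l) convention from the later cards:

```python
import numpy as np

theta = np.tanh                        # assumed activation function

def forward(x_in, W):
    """Forward propagation: returns h(x) = x(L), given the weight
    matrices W (one per layer l = 1..L, in order).

    Each W[l] has shape (d(l-1)+1) x d(l); x(l) gets a bias 1 prepended.
    """
    x = np.concatenate(([1.0], x_in))           # x(0) = [1, input]
    for Wl in W:
        s = Wl.T @ x                            # s(l) = W(l)^T x(l-1)
        x = np.concatenate(([1.0], theta(s)))   # x(l) = [1, theta(s(l))]
    return x[1:]                                # output layer: drop the bias

rng = np.random.default_rng(0)
W = [rng.normal(size=(3, 2)), rng.normal(size=(3, 1))]  # 2 -> 2 -> 1 network
print(forward(np.array([0.5, -1.0]), W))
```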

6
Q

What does a neural network do differently from an MLP?

A

It applies a non-linear function theta to the weighted input of each node before passing the result on to the next layer. Each neuron/node is connected to all the neurons in the next layer.
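
A tiny illustration of that per-node non-linearity; tanh is an assumed choice of theta, shown next to the hard threshold sign from the linear-model card for contrast:

```python
import numpy as np

s = np.array([-2.0, -0.1, 0.0, 0.1, 2.0])   # weighted inputs to a node
print(np.tanh(s))   # soft non-linear output passed on to the next layer
print(np.sign(s))   # the hard threshold a perceptron would use instead
```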

7
Q

What are l and L in a NN?

A

l is a layer index; L is the output (last) layer.

8
Q

What is w(l).i,j in NN?

A

The weight into layer l, going from node i in layer l-1 to node j in layer l.

9
Q

How do you calculate the output x(l).j of a neuron in layer l?

A

x(l).j = theta(s(l).j) = theta( sum over i from 0 to d(l-1) of w(l).i,j * x(l-1).i ),
summing over all nodes i in the previous layer l-1, including the bias x(l-1).0 = 1.

10
Q

What is the dimension of the matrix of weights W(l) for a layer l?

A

(d(l-1) + 1) x d(l)
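
A quick shape check of that convention; the +1 row is for the bias input x(l-1).0 = 1, and the sizes here are hypothetical:

```python
import numpy as np

d_prev, d_l = 4, 3                   # d(l-1) nodes feeding d(l) nodes
W_l = np.zeros((d_prev + 1, d_l))    # +1 row for the bias input x(l-1).0 = 1
x_prev = np.ones(d_prev + 1)         # x(l-1) with the bias prepended
print((W_l.T @ x_prev).shape)        # -> (3,): the d(l) signals s(l)
```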

11
Q

What is the i,j-th entry in W(l)?

A

w(l).i,j,
going from node i in the previous layer l-1 to node j in layer l.

12
Q

s(l).j =

A

the sum over i from 0 to d(l-1) of w(l).i,j * x(l-1).i, i.e. over all nodes in the previous layer l-1.

13
Q

s(l) =

A

W(l)^T * x(l-1)

14
Q

What is the formula for E.in(h) in a NN?

A

E.in(h) = (1/N) * sum over n from 1 to N of (h(xn; w) - yn)^2
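
The same average as a one-liner in NumPy (the arrays are hypothetical):

```python
import numpy as np

h_vals = np.array([0.9, -0.2, 0.4])   # h(xn; w) for n = 1..N (hypothetical)
y      = np.array([1.0, -1.0, 0.0])   # targets yn
E_in = np.mean((h_vals - y) ** 2)     # (1/N) * sum_n (h(xn; w) - yn)^2
print(E_in)
```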

15
Q

What is the error on one example (xn, yn) in NN?

A

e(h(xn), yn) = (h(xn) - yn)^2, also written e(w), since h is determined by the weights w.
