Week 7 Flashcards
What is the form of each h in H in the linear model for classifying data?
h(x) = sign(w^T x)
Give the formula for the out-of-sample error using the VC generalization bound and the bound on the growth function in terms of the VC dimension:
Eout(g) <= Ein(g) + O( sqrt( (dvc / N) * ln N ) )
with O(x) meaning "bounded in absolute value by a multiple of x", dvc = the VC dimension and N = the sample size.
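As a quick sanity check on how the bound term behaves, a minimal sketch (the function name and the numbers are illustrative, not from the cards):

```python
import math

def vc_bound_term(d_vc, N):
    """Rough size of the generalization term O(sqrt((dvc / N) * ln N)).

    d_vc: VC dimension of the hypothesis set (assumed known).
    N: sample size.
    """
    return math.sqrt((d_vc / N) * math.log(N))

# The term shrinks as N grows relative to d_vc, so Eout tracks Ein better:
small_sample = vc_bound_term(3, 100)    # larger gap
large_sample = vc_bound_term(3, 10000)  # smaller gap
```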
What is forward propagation?
Algorithmic computation of h(x).
What is backward propagation?
Algorithmic computation of the partial derivatives ∂ e(w) / ∂ w(l).i,j of the error with respect to the weights.
How is the hypothesis h(x) computed in forward propagation?
For every layer, add a constant input 1, so x(l) = [1, theta(s(l))].
The outputs of the previous layer l-1 become the input vector of the current layer l.
Calculate s(l).j and x(l) layer by layer until the output layer L, where h(x) = x(L).
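The steps above can be sketched in plain Python (the network sizes, the weight values, and tanh as theta are assumptions for illustration):

```python
import math

def forward(x, weights):
    """Forward propagation for a feed-forward NN (sketch).

    weights[l] is the matrix W(l) as a list of rows:
    weights[l][i][j] = w(l).i,j, with row i = 0 the bias weights,
    so each matrix has d(l-1)+1 rows and d(l) columns.
    theta is taken to be tanh in every layer (an assumption here).
    """
    x_l = list(x)                                   # x(0)
    for W in weights:
        x_l = [1.0] + x_l                           # prepend the constant input 1
        s_l = [sum(W[i][j] * x_l[i] for i in range(len(W)))
               for j in range(len(W[0]))]           # s(l) = W(l)^T x(l-1)
        x_l = [math.tanh(s) for s in s_l]           # x(l) = theta(s(l))
    return x_l                                      # h(x) = x(L)

# Tiny hypothetical 2-2-1 network:
W1 = [[0.1, -0.2],   # row i = 0 holds the bias weights
      [0.3,  0.4],
      [-0.5, 0.6]]
W2 = [[0.2], [0.7], [-0.3]]
h = forward([1.0, -1.0], [W1, W2])
```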
What does the neural network do differently from the MLP?
It applies a smooth non-linear function theta to the signal of each node before passing the result on, instead of a hard threshold. Each neuron/node is connected to all the neurons in the next layer.
What is l, L in NN?
l is a layer. L is the output layer.
What is w(l).i,j in NN?
The weights for layer l, from input i to output j.
How do you calculate the output x(l).j of a neuron in layer l?
x(l).j = theta(s(l).j) = theta( sum over i from 0 to d(l-1) of w(l).i,j * x(l-1).i ),
i.e. summing over all nodes i of the previous layer l-1 (including the bias node i = 0).
What is the dimension of the matrix of weights W(l) for a layer l?
(d(l-1) + 1) x d(l)
What is the i,j-th entry in W(l)?
w(l).i,j
going from node i in previous layer l-1 to node j in layer l
s(l).j =
sum over i from 0 to d(l-1) of w(l).i,j * x(l-1).i
s(l) =
W(l).T * x(l-1)
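A short sketch of this matrix form (the layer sizes and weight values are made up for illustration):

```python
# A hypothetical layer l with d(l-1) = 2 inputs and d(l) = 2 outputs:
W = [[0.1, -0.1],   # row i = 0: bias weights
     [0.2,  0.4],
     [-0.3, 0.5]]   # shape: (d(l-1) + 1) x d(l) = 3 x 2
x_prev = [1.0, 0.5, -0.5]  # x(l-1) with the constant 1 prepended

# s(l) = W(l)^T x(l-1), i.e. s(l).j = sum over i of w(l).i,j * x(l-1).i
s = [sum(W[i][j] * x_prev[i] for i in range(len(W)))
     for j in range(len(W[0]))]  # one signal per node j of layer l
```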
What is E.in(h) formula in NN?
E.in(h) = (1/N) * sum over n from 1 to N of (h(xn; w) - yn)^2
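A minimal sketch of this in-sample error, assuming a toy hypothesis h(x; w) = w * x (all names and data here are illustrative):

```python
def E_in(h, w, data):
    """In-sample squared error: (1/N) * sum over n of (h(x_n; w) - y_n)^2.

    h: hypothesis function taking (x, w); data: list of (x_n, y_n) pairs.
    """
    N = len(data)
    return sum((h(x_n, w) - y_n) ** 2 for x_n, y_n in data) / N

# Toy hypothesis h(x; w) = w * x on three labeled points:
data = [(1.0, 1.0), (2.0, 2.0), (3.0, 2.0)]
err = E_in(lambda x, w: w * x, 1.0, data)  # errors 0, 0, 1 -> mean 1/3
```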
What is the error on one example (xn, yn) in NN?
e(h(xn), yn) = (h(xn; w) - yn)^2, written e(w) since h depends on the weights w.
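The per-example error e(w) and its gradients from backward propagation can be sketched for a tiny 2-2-1 tanh network (the weights, names, and network size are illustrative assumptions, not the course's code):

```python
import math

def backprop_single(x, y, W1, W2):
    """Backward propagation on one example (x, y) for a 2-layer tanh net.

    W1 is (d0+1) x d1 and W2 is (d1+1) x 1, as lists of rows, with
    row i = 0 holding the bias weights. Returns the gradients
    ∂ e(w) / ∂ w(l).i,j with the same shapes as W1 and W2.
    """
    # Forward pass, keeping the intermediate vectors
    x0 = [1.0] + list(x)
    s1 = [sum(W1[i][j] * x0[i] for i in range(len(W1)))
          for j in range(len(W1[0]))]
    x1 = [1.0] + [math.tanh(s) for s in s1]
    s2 = sum(W2[i][0] * x1[i] for i in range(len(W2)))
    h = math.tanh(s2)

    # Backward pass: delta at the output, then propagated back one layer
    delta2 = 2.0 * (h - y) * (1.0 - h * h)              # d e / d s(2)
    delta1 = [(1.0 - x1[j + 1] ** 2) * W2[j + 1][0] * delta2
              for j in range(len(s1))]                  # d e / d s(1).j

    # d e / d w(l).i,j = x(l-1).i * delta(l).j
    grad_W2 = [[x1[i] * delta2] for i in range(len(W2))]
    grad_W1 = [[x0[i] * delta1[j] for j in range(len(W1[0]))]
               for i in range(len(W1))]
    return grad_W1, grad_W2

# Hypothetical 2-2-1 weights and one example (x, y):
W1 = [[0.1, -0.2], [0.3, 0.4], [-0.5, 0.6]]
W2 = [[0.2], [0.7], [-0.3]]
g1, g2 = backprop_single([1.0, -1.0], 1.0, W1, W2)
```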