Homework 2 Flashcards
Nesterov's accelerated gradient scheme is often more efficient than the simple momentum scheme because the weighting factor of the momentum term increases as a function of iteration number.
False
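As a hedged sketch (the gradient function grad, learning rate lr, and momentum factor beta below are illustrative, not from the homework): the two update rules differ in where the gradient is evaluated, while beta stays fixed rather than growing with the iteration number.

```python
import numpy as np

def momentum_step(w, v, grad, lr=0.01, beta=0.9):
    # Simple momentum: gradient evaluated at the current weights.
    v = beta * v - lr * grad(w)
    return w + v, v

def nesterov_step(w, v, grad, lr=0.01, beta=0.9):
    # Nesterov: gradient evaluated at the look-ahead point w + beta*v.
    # The momentum factor beta is the same fixed constant as above.
    v = beta * v - lr * grad(w + beta * v)
    return w + v, v
```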
Weight decay helps against overfitting.
True
Pruning helps against overfitting.
True
L_1 regularisation reduces small weights less than L_2 regularisation.
False
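A minimal sketch of why the answer is False (the learning rate lr and regularisation strength lam are illustrative values): the L2 decay step is proportional to the weight, so small weights barely shrink, whereas the L1 step has a fixed size lr*lam and pushes small weights all the way to zero.

```python
import numpy as np

def l2_decay(w, lr=0.1, lam=0.01):
    # L2 penalty: shrinkage proportional to the weight itself.
    return w - lr * lam * w

def l1_decay(w, lr=0.1, lam=0.01):
    # L1 penalty: fixed-size shrinkage lr*lam*sign(w).
    return w - lr * lam * np.sign(w)

w = np.array([0.001, 1.0])
print(l2_decay(w))  # [0.000999, 0.999]  tiny change to the small weight
print(l1_decay(w))  # [0.0,      0.999]  small weight driven to zero
```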
For the parity problem with N inputs, one can construct a perceptron that solves the problem with fewer than 2^N hidden neurons.
True
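One standard construction, sketched below (the helper parity_net is illustrative), uses only N hidden units: hidden unit k fires when at least k of the inputs are on, and the output unit adds the hidden units with alternating signs.

```python
import numpy as np

def parity_net(x):
    # Hidden unit k fires when at least k of the N binary inputs are 1.
    # The output weights alternate +1, -1, +1, ..., so the sum equals
    # 1 when an odd number of inputs are on and 0 otherwise.
    N = len(x)
    s = np.sum(x)
    hidden = np.array([1.0 if s >= k else 0.0 for k in range(1, N + 1)])
    out_weights = np.array([(-1.0) ** (k + 1) for k in range(1, N + 1)])
    return int(hidden @ out_weights)

print(parity_net([1, 0, 1]))  # 0
print(parity_net([1, 1, 1]))  # 1
```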
Using a stochastic path through weight space in backpropagation assures that the energy either decreases or stays constant.
False
The number of N-dimensional Boolean functions is 2^N.
False
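For reference: an N-input Boolean function assigns one of two output values to each of the 2^N possible input patterns, so there are 2^(2^N) such functions; for N = 2 this gives 2^4 = 16, not 2^2 = 4.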
How the weights are initialised for backpropagation does not matter (provided that they are not all zero) because one usually iterates for many epochs so that the initial conditions are forgotten.
False
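A hedged illustration of one reason the answer is False (the tiny tanh network, random data, and learning rate below are made up for the demonstration): if every weight starts at the same nonzero value, the hidden units receive identical gradients and never differentiate, no matter how many epochs are run.

```python
import numpy as np

rng = np.random.default_rng(0)
x = rng.normal(size=(4, 3))   # 4 patterns, 3 inputs
t = rng.normal(size=(4, 1))   # targets
W1 = np.full((3, 2), 0.5)     # every weight initialised to the same value
W2 = np.full((2, 1), 0.5)
for _ in range(100):
    h = np.tanh(x @ W1)               # hidden activations
    y = h @ W2                        # network output
    dy = (y - t) / len(x)             # gradient of 0.5 * mean squared error
    dh = (dy @ W2.T) * (1 - h ** 2)   # backpropagated hidden gradient
    W2 -= 0.1 * (h.T @ dy)
    W1 -= 0.1 * (x.T @ dh)
print(np.allclose(W1[:, 0], W1[:, 1]))  # True: the two hidden units stay identical
```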
When solving a t=+1/−1 problem in two dimensions using a decision boundary in the form of a convex polygon, the resulting output problem may sometimes not be linearly separable.
False
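A hedged sketch of why the answer is False (the edge list below describes an arbitrary example triangle): each hidden unit tests one edge half-plane, and the output is the AND of all hidden units, which a single threshold unit realises with weights 1 and threshold (number of edges - 0.5), so the output problem is always linearly separable.

```python
import numpy as np

edges = [  # (a, b, c) meaning a*x + b*y + c >= 0 is the "inside" side
    (1.0, 0.0, 0.0),    # x >= 0
    (0.0, 1.0, 0.0),    # y >= 0
    (-1.0, -1.0, 1.0),  # x + y <= 1
]

def inside(p):
    # One hidden unit per edge; the output unit is a linearly separable AND.
    hidden = np.array([1.0 if a * p[0] + b * p[1] + c >= 0 else 0.0
                       for a, b, c in edges])
    return hidden.sum() >= len(edges) - 0.5

print(inside((0.2, 0.2)), inside((0.9, 0.9)))  # True False
```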