Neural networks Flashcards
what is the principle of backpropagation
The idea is to compute the error at each output and propagate that error signal backward through the hidden layer(s), allowing each weight to be adjusted according to its contribution to the error
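A minimal sketch of this backward propagation, assuming a tiny made-up network (1 input, 1 sigmoid hidden unit, 1 linear output) trained on a single target with a squared-error criterion; the weights, input, and learning rate are illustrative values, not from the cards:

```python
import math

def sigmoid(z):
    return 1.0 / (1.0 + math.exp(-z))

x, target = 0.5, 1.0
w1, w2 = 0.8, -0.4   # assumed initial weights
lr = 0.1             # assumed learning rate

for _ in range(200):
    # forward pass: input -> hidden -> output
    h = sigmoid(w1 * x)
    y = w2 * h
    # error computed at the output
    err = y - target
    # backward pass: chain rule gives each weight's share of the error
    grad_w2 = err * h
    grad_w1 = err * w2 * h * (1 - h) * x
    # adjust the weights accordingly (gradient descent step)
    w2 -= lr * grad_w2
    w1 -= lr * grad_w1

residual = abs(w2 * sigmoid(w1 * x) - target)  # shrinks toward 0 as training proceeds
```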
what do we use to minimize the criterion for regression
Generally, we use gradient descent to minimize the criterion (gradient ascent is used when the criterion is to be maximized)
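A concrete sketch of gradient descent minimizing a one-parameter least-squares regression criterion; the data points and learning rate below are made up for illustration:

```python
# Criterion: J(w) = mean((w*x_i - y_i)^2), minimized by gradient descent.
xs = [1.0, 2.0, 3.0, 4.0]
ys = [2.1, 3.9, 6.2, 7.8]   # made-up data, roughly y = 2x
w, lr = 0.0, 0.02

for _ in range(500):
    # gradient of the mean squared error with respect to w
    grad = sum(2 * (w * xi - yi) * xi for xi, yi in zip(xs, ys)) / len(xs)
    w -= lr * grad          # descent step: move against the gradient
```

The closed-form optimum here is sum(x*y)/sum(x^2) = 1.99, which the iteration approaches.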
what are the main component of a neural network
input layer, hidden layer(s), output layer
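These three components can be sketched as a single forward pass; the layer sizes, weights, and input values here are assumptions chosen only to make the example runnable:

```python
import math

def sigmoid(z):
    return 1.0 / (1.0 + math.exp(-z))

def forward(x, w_hidden, w_output):
    """Forward pass: input layer -> hidden layer -> output layer."""
    # hidden layer: one sigmoid unit per weight row
    hidden = [sigmoid(sum(w * xi for w, xi in zip(row, x))) for row in w_hidden]
    # output layer: linear combination of the hidden activations
    return sum(w * h for w, h in zip(w_output, hidden))

# Assumed example: 2 inputs, 3 hidden units, 1 output (all weights made up).
x = [1.0, -0.5]
w_hidden = [[0.2, -0.1], [0.4, 0.3], [-0.6, 0.5]]
w_output = [0.7, -0.2, 0.5]
y = forward(x, w_hidden, w_output)
```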
what is the global purpose of NN
all the connection weights are adapted to optimize a supervised criterion, through an iterative optimization based on gradient descent/ascent
what is an error criterion
A non-linear function of the weights, with numerous local minima
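The consequence of those local minima can be shown on a hypothetical one-dimensional criterion: gradient descent started from different initial weights ends up in different minima. The function and starting points below are made up for illustration:

```python
# Hypothetical non-convex criterion f(w) = w^4 - 3*w^2 + w with two local minima.
def f(w):
    return w**4 - 3*w**2 + w

def grad_f(w):
    return 4*w**3 - 6*w + 1

def descend(w, lr=0.01, steps=1000):
    for _ in range(steps):
        w -= lr * grad_f(w)
    return w

# Different initializations converge to different minima.
left = descend(-2.0)    # reaches the minimum near w = -1.30 (the deeper one)
right = descend(2.0)    # gets stuck in the local minimum near w = 1.13
```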
what is the difference between global and local approach
the global approach corresponds to supervised problems: all the connections are adapted to optimize a supervised criterion, and the optimization is driven by that criterion
the local approach corresponds to unsupervised problems: only a reduced number of weights is adapted