Backpropagation and SVM Flashcards
What is backpropagation?
a neural network learning algorithm
What is a neural network?
set of connected input/output units in which each connection has a weight associated with it
What is another name for neural network learning?
connectionist learning
What are some advantages of neural networks?
high tolerance of noisy data and the ability to classify patterns on which they have not been trained; can be used when the relationship between attributes and classes is not known; well suited for continuous-valued inputs and outputs
What is a multilayer feedforward neural network?
consists of an input layer, one or more hidden layers, and an output layer
What does an output unit take as input?
the weighted sum of the outputs from the units in the previous layer
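A minimal sketch of the net input described above, assuming the common formulation in which a unit's net input is the weighted sum of the previous layer's outputs plus the unit's bias, squashed by a logistic (sigmoid) function; the function names are illustrative:

```python
import math

def net_input(outputs, weights, bias):
    """Weighted sum of previous-layer outputs plus the unit's bias."""
    return sum(w * o for w, o in zip(weights, outputs)) + bias

def sigmoid(net):
    """Logistic squashing function applied to the net input."""
    return 1.0 / (1.0 + math.exp(-net))
```

For example, a unit with incoming weights 0.2 and 0.4, bias 0.1, and previous-layer outputs 1.0 and 0.5 receives net input 0.2 + 0.2 + 0.1 = 0.5.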
Are inputs to a multilayer feedforward neural network normalized?
Yes, to help speed up the learning phase
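A minimal sketch of one common way to normalize inputs, min-max normalization; the target range [0.0, 1.0] is an illustrative assumption, as some formulations use [-1.0, 1.0] instead:

```python
def min_max_normalize(values, new_min=0.0, new_max=1.0):
    """Rescale values linearly into [new_min, new_max]."""
    old_min, old_max = min(values), max(values)
    span = old_max - old_min
    return [(v - old_min) / span * (new_max - new_min) + new_min
            for v in values]
```

For example, the attribute values [10, 20, 30] map to [0.0, 0.5, 1.0].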
How are output units decided based on classification labels?
One output unit per class
How does backpropagation work?
learns by iteratively processing a data set of training tuples, comparing the network's prediction for each tuple with the actual target value. The weights are modified to minimize the mean squared error. The modifications are made in the backwards direction, through each hidden layer, hence the name
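A minimal sketch of one backpropagation step for a tiny 2-2-1 network with sigmoid units, assuming squared error and plain gradient descent; the layer sizes, function names, and learning rate are illustrative choices, not from the source:

```python
import numpy as np

def sigmoid(x):
    return 1.0 / (1.0 + np.exp(-x))

def forward(x, w1, b1, w2, b2):
    h = sigmoid(w1 @ x + b1)   # hidden-layer outputs
    y = sigmoid(w2 @ h + b2)   # network prediction
    return h, y

def backprop_step(x, target, w1, b1, w2, b2, lr=0.5):
    h, y = forward(x, w1, b1, w2, b2)
    # Error terms, computed in the backwards direction.
    err_out = (y - target) * y * (1 - y)       # output layer
    err_hid = (w2.T @ err_out) * h * (1 - h)   # hidden layer
    # Gradient-descent updates to weights and biases.
    w2 -= lr * np.outer(err_out, h)
    b2 -= lr * err_out
    w1 -= lr * np.outer(err_hid, x)
    b1 -= lr * err_hid
    return w1, b1, w2, b2
```

Repeating this step over all training tuples, epoch after epoch, is what drives the squared error down.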
What happens to the weights in backpropagation?
in the majority of cases they converge, and the learning process stops
What is case updating?
updating weights and biases after the presentation of each tuple
What is epoch updating?
weight and bias increments are accumulated in variables, so that the weights and biases are updated after all the tuples in the training set have been presented
What is an epoch?
One iteration through the training set
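A minimal sketch contrasting case updating with epoch updating over one epoch; `gradient` is a hypothetical per-tuple gradient function supplied by the caller, and the learning rate is an illustrative choice:

```python
def train_case_updating(weights, tuples, gradient, lr=0.1):
    """Case updating: adjust the weights after each training tuple."""
    for t in tuples:
        weights = [w - lr * g
                   for w, g in zip(weights, gradient(weights, t))]
    return weights

def train_epoch_updating(weights, tuples, gradient, lr=0.1):
    """Epoch updating: accumulate increments, update once per epoch."""
    acc = [0.0] * len(weights)
    for t in tuples:
        acc = [a + g for a, g in zip(acc, gradient(weights, t))]
    return [w - lr * a for w, a in zip(weights, acc)]
```

With case updating, later tuples in the epoch see the weights already adjusted by earlier tuples, so the two schemes generally produce different results.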
What are the three main terminating conditions in a NN?
all weight changes in the previous epoch were so small as to fall below some specified threshold; the percentage of tuples misclassified in the previous epoch is below some specified threshold; a prespecified number of epochs has expired
What are Support Vector Machines (SVMs)?
a method for the classification of both linear and nonlinear data
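A minimal sketch of a linear SVM trained by subgradient descent on the hinge loss; note this is an illustrative stand-in for the classic formulation, which instead solves a quadratic optimization to find the maximum-margin hyperplane, and the hyperparameters below are assumptions:

```python
def train_linear_svm(points, labels, lr=0.1, lam=0.01, epochs=100):
    """labels must be +1 or -1; returns (weights, bias)."""
    dim = len(points[0])
    w, b = [0.0] * dim, 0.0
    for _ in range(epochs):
        for x, y in zip(points, labels):
            margin = y * (sum(wi * xi for wi, xi in zip(w, x)) + b)
            if margin < 1:
                # Point inside the margin: hinge-loss subgradient step.
                w = [wi - lr * (lam * wi - y * xi)
                     for wi, xi in zip(w, x)]
                b += lr * y
            else:
                # Correctly classified outside the margin:
                # only the regularization term shrinks w.
                w = [wi - lr * lam * wi for wi in w]
    return w, b

def predict(w, b, x):
    """Classify by which side of the hyperplane x falls on."""
    return 1 if sum(wi * xi for wi, xi in zip(w, x)) + b >= 0 else -1
```

On linearly separable data this converges to a separating hyperplane; handling nonlinear data requires the kernel trick, which this sketch omits.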