C2 Flashcards
perceptron training for classification task
try to find suitable values for the weights in such a way that the training examples are correctly classified
case of two classes: try to find a hyper-plane that separates the examples into these two classes (linearly separable if there exists a hyperplane that separates them)
perceptron learning algorithm
initialize weights w randomly
while (there are misclassified training examples):
select a misclassified example (x,d)
w_new = w_old + η·d·x (η = learning rate)
d = target label of the example (the update is proportional to the difference between the target value and the predicted value)
if x is misclassified and d = +1, w·x should become bigger
if x is misclassified and d = −1, w·x should become smaller
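The loop above can be sketched in pure Python; the toy dataset, the zero initialization (the notes say random; zeros are used here only for determinism), and the epoch limit are illustrative assumptions:

```python
def predict(w, x):
    # sign of the dot product w·x (bias folded in as x[0] = 1)
    return 1 if sum(wi * xi for wi, xi in zip(w, x)) >= 0 else -1

def train_perceptron(data, eta=1.0, max_epochs=1000):
    # data: list of (x, d) pairs with x = (1, x1, x2, ...) and target d in {+1, -1}
    w = [0.0] * len(data[0][0])  # zero init for determinism (notes use random init)
    for _ in range(max_epochs):
        misclassified = [(x, d) for x, d in data if predict(w, x) != d]
        if not misclassified:
            return w  # all training examples correctly classified
        x, d = misclassified[0]  # select a misclassified example
        w = [wi + eta * d * xi for wi, xi in zip(w, x)]  # w_new = w_old + eta*d*x
    return w

# linearly separable toy set (assumed): class +1 roughly when x1 + x2 is large
data = [((1, 0, 0), -1), ((1, 0, 1), -1), ((1, 1, 0), -1),
        ((1, 2, 2), 1), ((1, 2, 1), 1)]
w = train_perceptron(data)
```

Because the toy set is linearly separable, the perceptron convergence theorem guarantees the loop terminates with all examples classified correctly.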
Cover’s theorem
in a high-dimensional space, if the number of points is small relative to the dimensionality and you color the points randomly with two colors, the data set will almost always be linearly separable
- if the number of points in a d-dimensional space is smaller than 2*d, they are almost always linearly separable
- if the number of points in a d-dimensional space is bigger than 2*d, they are almost always NOT linearly separable
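The threshold at 2d can be checked with Cover's counting function, C(n, d) = 2·Σ_{k=0}^{d−1} C(n−1, k), which counts the linearly separable two-colorings of n points in general position in d dimensions (this counts hyperplanes through the origin; the affine case replaces d with d+1, which does not change the asymptotic 2d threshold). A small sketch, with d = 25 chosen arbitrarily:

```python
from math import comb

def separable_fraction(n, d):
    # Fraction of the 2^n two-colorings of n points in general position
    # in R^d that are linearly separable (Cover's counting function):
    #   C(n, d) = 2 * sum_{k=0}^{d-1} binom(n-1, k),  fraction = C(n, d) / 2^n
    c = 2 * sum(comb(n - 1, k) for k in range(d))
    return c / 2 ** n

d = 25
print(separable_fraction(d, d))      # n = d  -> 1.0 (always separable)
print(separable_fraction(2 * d, d))  # n = 2d -> exactly 0.5 (the transition point)
print(separable_fraction(4 * d, d))  # n = 4d -> near 0 (almost never separable)
```

The sharp drop from ~1 below 2d to ~0 above 2d is exactly the behavior the two bullets describe.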
support vector machines
the decision boundary should be as far away from the data of both classes as possible (maximize the margin m)
performs very well on high-dimensional data
computationally very expensive
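The margin being maximized can be computed for any candidate hyperplane w·x + b = 0 as the distance of the closest training point to it; a minimal sketch (the 2-D toy data and the two candidate hyperplanes are assumptions, not SVM output):

```python
import math

def geometric_margin(w, b, data):
    # margin = min_i d_i * (w·x_i + b) / ||w||, i.e. the distance of the
    # closest point to the hyperplane w·x + b = 0 (positive iff all points
    # are on the correct side); the SVM maximizes this quantity over (w, b)
    norm = math.sqrt(sum(wi * wi for wi in w))
    return min(d * (sum(wi * xi for wi, xi in zip(w, x)) + b) / norm
               for x, d in data)

# toy data (assumed): class +1 above the line x2 = 1, class -1 below it
data = [((0.0, 2.0), 1), ((1.0, 3.0), 1), ((0.0, 0.0), -1), ((1.0, 0.5), -1)]
# two separating hyperplanes; the SVM would prefer the one with larger margin
print(geometric_margin((0.0, 1.0), -1.0, data))   # x2 = 1    -> margin 0.5
print(geometric_margin((0.0, 1.0), -1.25, data))  # x2 = 1.25 -> margin 0.75
```

Both hyperplanes separate the data, but the second sits farther from both classes, which is why the SVM objective prefers it.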
perceptron
a neural network without hidden layers
> > momentum, Nesterov accelerated gradient??