SVM and Neural Networks Flashcards

1
Q

What are the key attributes of an SVM?

A
- Sparseness (the solution depends only on a subset of the training points, the support vectors)
- Convexity (training solves a convex optimization problem, so there is a single global optimum)

2
Q

Briefly, how does SVM work?

A

We want to separate the data linearly.
- Define the margin (the distance from the separating line to the closest point on each side)
- Maximize the margin, which gives the largest possible separation between the classes
- The support vectors are the data points pushing against the margin

See the code sketch below.
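
A minimal sketch, assuming scikit-learn is available (the toy points and labels below are made up for illustration): fit a linear SVM and inspect the support vectors, i.e. the points that push against the maximum margin.

```python
import numpy as np
from sklearn.svm import SVC

# Two small, linearly separable clusters in 2D (hypothetical toy data).
X = np.array([[1.0, 1.0], [1.5, 2.0], [2.0, 1.5],   # class 0
              [5.0, 5.0], [5.5, 6.0], [6.0, 5.5]])  # class 1
y = np.array([0, 0, 0, 1, 1, 1])

clf = SVC(kernel="linear")
clf.fit(X, y)

# Only the points closest to the decision boundary end up as support vectors.
print("support vectors:\n", clf.support_vectors_)
print("w:", clf.coef_, "b:", clf.intercept_)
```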

3
Q

What can you do about noisy datasets in SVM?

A

Introduce a slack variable in the margin constraints. This makes the margin soft instead of hard. If the margin is hard, no noise or outliers are tolerated, and the learned boundary can misrepresent the data.
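
A minimal sketch, assuming scikit-learn: in its SVC implementation the parameter C penalizes the slack variables, so a small C gives a soft margin that tolerates noisy points, while a large C approximates a hard margin. The random clusters below are illustrative only.

```python
import numpy as np
from sklearn.svm import SVC

rng = np.random.default_rng(0)
# Two overlapping (noisy) clusters: not perfectly separable.
X = np.vstack([rng.normal(0.0, 1.0, (50, 2)), rng.normal(3.0, 1.0, (50, 2))])
y = np.array([0] * 50 + [1] * 50)

soft = SVC(kernel="linear", C=0.1).fit(X, y)     # soft margin: slack is cheap
hard = SVC(kernel="linear", C=1000.0).fit(X, y)  # near-hard margin: slack is expensive

# The softer margin typically keeps more points inside the margin as support vectors.
print("support vectors (C=0.1):  ", len(soft.support_vectors_))
print("support vectors (C=1000): ", len(hard.support_vectors_))
```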

4
Q

What can you do about non-linear datasets?

A

You can always map the data to a higher-dimensional space in which it becomes linearly separable.
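
A minimal sketch, assuming scikit-learn: the concentric-circles data is not linearly separable in 2D, but an RBF kernel performs the higher-dimensional mapping implicitly (the kernel trick), after which a linear separator exists.

```python
from sklearn.datasets import make_circles
from sklearn.svm import SVC

# Concentric circles: no straight line can separate the two classes in 2D.
X, y = make_circles(n_samples=200, factor=0.3, noise=0.05, random_state=0)

linear = SVC(kernel="linear").fit(X, y)
rbf = SVC(kernel="rbf").fit(X, y)

print("linear kernel training accuracy:", linear.score(X, y))  # poor
print("RBF kernel training accuracy:   ", rbf.score(X, y))     # near perfect
```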

5
Q

What is the motivation behind neural networks?

A

Neural networks are inspired by human neurons and how they operate. Humans can solve complex problems quickly even though individual neurons have a slow transmission time, which suggests the brain relies on massively parallel processing.

6
Q

What is a perceptron?

A

The building block of a neural network, modeled on a neuron. Given an n-dimensional input vector, it acts as a binary classifier.
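
A minimal sketch of a single perceptron as a binary classifier in plain NumPy, trained on the logical AND function; the helper names (predict, train) and hyperparameters are illustrative, not part of the card.

```python
import numpy as np

def predict(w, b, x):
    # Step activation: fire (1) if the weighted sum exceeds the threshold, else 0.
    return 1 if np.dot(w, x) + b > 0 else 0

def train(X, y, lr=0.1, epochs=20):
    w, b = np.zeros(X.shape[1]), 0.0
    for _ in range(epochs):
        for xi, yi in zip(X, y):
            error = yi - predict(w, b, xi)  # 0 if correct, otherwise +1 or -1
            w += lr * error * xi            # perceptron learning rule
            b += lr * error
    return w, b

# AND is linearly separable, so a single perceptron can learn it.
X = np.array([[0, 0], [0, 1], [1, 0], [1, 1]])
y = np.array([0, 0, 0, 1])
w, b = train(X, y)
print([predict(w, b, x) for x in X])  # expected: [0, 0, 0, 1]
```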

7
Q

What is an MLP (Multi-Layer Perceptron)?

A

Several connected layers of perceptrons. MLPs function as universal approximators and allow modeling of non-linear discriminant functions.
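
A minimal sketch, assuming scikit-learn: a single perceptron cannot separate the two-moons data, but a small MLP learns the non-linear boundary. The layer sizes and iteration count are illustrative choices.

```python
from sklearn.datasets import make_moons
from sklearn.neural_network import MLPClassifier

X, y = make_moons(n_samples=300, noise=0.1, random_state=0)

mlp = MLPClassifier(hidden_layer_sizes=(16, 16), activation="relu",
                    max_iter=2000, random_state=0)
mlp.fit(X, y)

print("training accuracy:", mlp.score(X, y))  # close to 1.0 on this easy dataset
```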

8
Q

What is underfitting/overfitting?

A

Underfitting is when the network has not been trained enough (or is too simple), so it fails to recognize patterns it should.
Overfitting is when the network has been overtrained (or is too complex), so it only recognizes examples it has already trained on and generalizes poorly to new data.
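
A minimal sketch, assuming scikit-learn: the usual way to spot both problems is to compare training and test accuracy. An under-trained, too-small MLP scores poorly on both sets, while an oversized MLP trained to convergence on a small noisy sample tends to score much higher on the training set than on the test set. Model sizes and iteration counts below are illustrative.

```python
from sklearn.datasets import make_moons
from sklearn.model_selection import train_test_split
from sklearn.neural_network import MLPClassifier

# Small, noisy dataset split into training and test halves.
X, y = make_moons(n_samples=200, noise=0.3, random_state=0)
X_tr, X_te, y_tr, y_te = train_test_split(X, y, test_size=0.5, random_state=0)

underfit = MLPClassifier(hidden_layer_sizes=(1,), max_iter=5, random_state=0)
overfit = MLPClassifier(hidden_layer_sizes=(200, 200), alpha=0.0,
                        max_iter=5000, random_state=0)

for name, model in [("underfit", underfit), ("overfit", overfit)]:
    model.fit(X_tr, y_tr)
    print(name, "train:", round(model.score(X_tr, y_tr), 2),
          "test:", round(model.score(X_te, y_te), 2))
```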

9
Q

What is back-propagation?

A

The basic training method of a simple neural network.
Three phases:
- Forward propagation (passes the inputs through the network to compute the outputs)
- Backward propagation (propagates the output error backwards through the layers)
- Weight updating (adjusts the weights based on the propagated error)

The convergence speed depends on three things:
- Complexity of the problem
- Complexity of the approximating function
- Learning rate

See the code sketch below.
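
A minimal sketch of the three phases in plain NumPy: a tiny 2-4-1 sigmoid network trained on XOR with full-batch gradient descent. The layer sizes, learning rate, and epoch count are illustrative choices, not from the card.

```python
import numpy as np

def sigmoid(z):
    return 1.0 / (1.0 + np.exp(-z))

rng = np.random.default_rng(0)
X = np.array([[0, 0], [0, 1], [1, 0], [1, 1]], dtype=float)
y = np.array([[0], [1], [1], [0]], dtype=float)

W1, b1 = rng.normal(size=(2, 4)), np.zeros(4)  # input -> hidden weights
W2, b2 = rng.normal(size=(4, 1)), np.zeros(1)  # hidden -> output weights
lr = 0.5

for _ in range(10000):
    # 1) Forward propagation: push the inputs through the network.
    h = sigmoid(X @ W1 + b1)
    out = sigmoid(h @ W2 + b2)

    # 2) Backward propagation: push the error back through the layers.
    d_out = (out - y) * out * (1 - out)  # gradient of squared error at the output
    d_h = (d_out @ W2.T) * h * (1 - h)   # gradient at the hidden layer

    # 3) Weight updating: step the weights against the gradient.
    W2 -= lr * h.T @ d_out
    b2 -= lr * d_out.sum(axis=0)
    W1 -= lr * X.T @ d_h
    b1 -= lr * d_h.sum(axis=0)

out = sigmoid(sigmoid(X @ W1 + b1) @ W2 + b2)
print(out.round().ravel())  # typically [0. 1. 1. 0.] after training
```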