Support Vector Machines (SVM) Flashcards

1
Q

What is the main goal of Support Vector Machines (SVMs)?

A

To find an optimal separating hyperplane between two classes, i.e., the one with the largest margin.

2
Q

What method do SVMs use for separation?

A

Linear separation by means of hyperplanes, extended to nonlinear boundaries through kernel functions.

3
Q

What is a hyperplane in the context of SVMs?

A

A flat, ( (p-1) )-dimensional affine subspace that divides a ( p )-dimensional space into two regions (half-spaces).

4
Q

How is a hyperplane formally defined?

A

(\beta_0 + \beta_1x_1 + \beta_2x_2 + \ldots + \beta_px_p = 0).
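
A minimal sketch of how the hyperplane equation is used in practice: the sign of the expression tells us on which side of the hyperplane a point lies. The coefficient values below are made up for illustration.

import numpy as np

# Hypothetical coefficients beta_0, beta_1, ..., beta_p for p = 2 features
beta0 = -1.0
beta = np.array([2.0, 0.5])

def side_of_hyperplane(x):
    # Evaluate beta_0 + beta_1*x_1 + ... + beta_p*x_p; the sign picks the region
    return np.sign(beta0 + beta @ x)

print(side_of_hyperplane(np.array([1.0, 0.0])))   #  1.0 -> one side
print(side_of_hyperplane(np.array([0.0, 0.0])))   # -1.0 -> the other side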

5
Q

What is the maximal margin classifier?

A

It chooses the separating hyperplane whose margin, the minimum perpendicular distance from the hyperplane to the closest training examples, is as large as possible.

6
Q

What are support vectors?

A

The training examples that lie closest to the hyperplane, exactly on the margin boundary (and hence equidistant from it); they alone determine the maximal margin hyperplane.

7
Q

What is the role of the ‘margin’ in SVMs?

A

It defines the distance between the hyperplane and the nearest data points from both classes.
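
A small scikit-learn sketch, on made-up separable toy data, that fits a linear SVM with a very large C (approximating the maximal margin classifier) and then reads off the support vectors and the margin width. The data and the value C=1e6 are illustrative assumptions.

import numpy as np
from sklearn.svm import SVC

# Tiny linearly separable toy set: two clusters labeled -1 and +1
X = np.array([[1.0, 1.0], [2.0, 1.5], [1.5, 2.0],
              [4.0, 4.0], [5.0, 4.5], [4.5, 5.0]])
y = np.array([-1, -1, -1, 1, 1, 1])

clf = SVC(kernel="linear", C=1e6).fit(X, y)   # very large C ~ hard margin

print(clf.support_vectors_)                    # points lying on the margin boundaries
w = clf.coef_[0]
print("margin width:", 2 / np.linalg.norm(w))  # distance between the two margin boundaries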

8
Q

Why do we prefer to label examples with ( y = 1 ) or ( y = -1 )?

A

Because it makes the algebra compact: with ( y \in \{-1, +1\} ), an example is correctly classified exactly when ( y_i(\beta_0 + \beta_1 x_{i1} + \ldots + \beta_p x_{ip}) > 0 ), so a single inequality covers both classes.
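
Written out (a standard identity, added here for context): with the ( \pm 1 ) coding, a separating hyperplane satisfies

\[
y_i \left( \beta_0 + \beta_1 x_{i1} + \beta_2 x_{i2} + \ldots + \beta_p x_{ip} \right) > 0
\quad \text{for every training example } i,
\]

and the maximal margin problem simply strengthens this to ( y_i(\beta_0 + \beta_1 x_{i1} + \ldots + \beta_p x_{ip}) \ge M ) for the margin ( M ).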

9
Q

What problem arises when a separating hyperplane does not exist?

A

No hyperplane can separate the classes perfectly, so the maximal margin classifier has no solution; a support vector classifier with a soft margin is used instead.

10
Q

What does a soft margin allow in SVMs?

A

It tolerates some examples lying inside the margin, or even on the wrong side of the hyperplane, so a limited amount of misclassification is allowed.

11
Q

What is the impact of a large ‘C’ hyperparameter in SVMs?

A

When ( C ) is the budget for margin violations, a larger ( C ) lets more examples violate the margin, giving a wider, more tolerant margin: higher bias but lower variance.
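
A short scikit-learn sketch comparing two settings of the regularization parameter. Caveat: sklearn's C is the inverse of the "budget for violations" used in the card above, so a small sklearn C corresponds to a large budget (a wide, tolerant margin). The toy data and parameter values are illustrative.

from sklearn.datasets import make_blobs
from sklearn.svm import SVC

# Overlapping toy classes, so some margin violations are unavoidable
X, y = make_blobs(n_samples=100, centers=2, cluster_std=2.5, random_state=0)

for c in (0.01, 100.0):   # small sklearn C <-> large violation budget
    clf = SVC(kernel="linear", C=c).fit(X, y)
    print(f"sklearn C={c}: {clf.support_.size} support vectors")

# The tolerant setting (sklearn C=0.01) should report more support vectors:
# a smoother, more stable boundary with higher bias and lower variance.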

12
Q

How do SVMs handle nonlinear boundaries?

A

By implicitly enlarging the feature space with kernel functions, so that a linear boundary in the enlarged space corresponds to a nonlinear boundary in the original space.

13
Q

What is the purpose of kernel functions in SVMs?

A

To map the data (implicitly, without ever computing the new coordinates) into a higher-dimensional space where a linear separation is possible.
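
A quick scikit-learn sketch on toy concentric-circle data, showing that a kernel makes a problem separable that no linear boundary can handle in the original two dimensions. Accuracies are measured on the training data and the exact numbers are only indicative.

from sklearn.datasets import make_circles
from sklearn.svm import SVC

# Concentric circles: not linearly separable in the original 2-D space
X, y = make_circles(n_samples=200, factor=0.3, noise=0.05, random_state=0)

linear = SVC(kernel="linear").fit(X, y)
rbf = SVC(kernel="rbf").fit(X, y)

print("linear kernel accuracy:", linear.score(X, y))  # roughly chance level
print("RBF kernel accuracy:", rbf.score(X, y))        # close to 1.0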

14
Q

What are the most common types of kernel functions?

A

Polynomial, radial basis function (RBF), and sigmoid.

15
Q

Write the equation of a polynomial kernel.

A

( K(x, y) = (x \cdot y + 1)^d ).
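
The same formula as a tiny Python function (the degree d is a tuning parameter; the vectors below are arbitrary examples):

import numpy as np

def polynomial_kernel(x, y, d=3):
    # K(x, y) = (x . y + 1)^d
    return (np.dot(x, y) + 1) ** d

x = np.array([1.0, 2.0])
y = np.array([0.5, -1.0])
print(polynomial_kernel(x, y, d=2))   # (0.5 - 2 + 1)^2 = 0.25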

16
Q

Write the equation of a radial basis function (RBF) kernel.

A

( K(x, y) = \exp\left(-\frac{||x - y||^2}{2\sigma^2}\right) ).
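
A matching sketch of the RBF kernel as a plain function; sigma is the assumed bandwidth parameter (scikit-learn instead exposes gamma, with gamma = 1 / (2 * sigma**2)):

import numpy as np

def rbf_kernel(x, y, sigma=1.0):
    # K(x, y) = exp(-||x - y||^2 / (2 * sigma^2))
    return np.exp(-np.linalg.norm(x - y) ** 2 / (2 * sigma ** 2))

print(rbf_kernel(np.array([0.0, 0.0]), np.array([0.0, 0.0])))  # 1.0 for identical points
print(rbf_kernel(np.array([0.0, 0.0]), np.array([3.0, 4.0])))  # exp(-12.5), close to 0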

17
Q

What is the main idea behind multiclass classification in SVMs?

A

To use multiple SVMs to distinguish between more than two classes.

18
Q

What are the two common strategies for multiclass SVM classification?

A

One-versus-all and one-versus-one approaches.

19
Q

What is the one-versus-all strategy?

A

One SVM is trained for each class, separating that class from all other classes combined.

20
Q

What is the one-versus-one strategy?

A

One SVM is trained for each pair of classes, and a voting scheme over all pairwise classifiers decides the final class.
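
A small sketch of both strategies using scikit-learn's explicit wrappers, with the iris data as an illustrative 3-class example (note that SVC on its own already uses one-versus-one internally):

from sklearn.datasets import load_iris
from sklearn.multiclass import OneVsOneClassifier, OneVsRestClassifier
from sklearn.svm import SVC

X, y = load_iris(return_X_y=True)   # 3 classes

ovr = OneVsRestClassifier(SVC(kernel="rbf")).fit(X, y)  # one classifier per class
ovo = OneVsOneClassifier(SVC(kernel="rbf")).fit(X, y)   # one classifier per pair of classes

print(len(ovr.estimators_))   # 3
print(len(ovo.estimators_))   # 3 * (3 - 1) / 2 = 3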

21
Q

What type of problem is solved by the support vector classification problem?

A

A convex (quadratic) optimization problem whose solution depends only on the support vectors and on inner products (similarity measures) between examples.
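
For reference, the soft-margin support vector classifier can be written as the following optimization problem (the standard textbook form, with slack variables ( \epsilon_i ) and the violation budget ( C )):

\[
\underset{\beta_0, \ldots, \beta_p,\ \epsilon_1, \ldots, \epsilon_n,\ M}{\text{maximize}} \quad M
\]
\[
\text{subject to} \quad \sum_{j=1}^{p} \beta_j^2 = 1, \qquad
y_i \Big( \beta_0 + \sum_{j=1}^{p} \beta_j x_{ij} \Big) \ge M (1 - \epsilon_i), \qquad
\epsilon_i \ge 0, \quad \sum_{i=1}^{n} \epsilon_i \le C.
\]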

22
Q

How do SVMs replace the inner product in the optimization problem?

A

By substituting a kernel function ( K(x_i, x_{i'}) ) for the inner product ( \langle x_i, x_{i'} \rangle ), which yields more flexible decision boundaries.
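
Written out (the standard representation, with ( \mathcal{S} ) denoting the set of support vectors and ( \alpha_i ) the fitted coefficients), the linear decision function and its kernelized form are

\[
f(x) = \beta_0 + \sum_{i \in \mathcal{S}} \alpha_i \langle x, x_i \rangle
\qquad \longrightarrow \qquad
f(x) = \beta_0 + \sum_{i \in \mathcal{S}} \alpha_i \, K(x, x_i).
\]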

23
Q

Why are kernel functions essential for complex problems?

A

They enable SVMs to handle data that is not linearly separable in the original space.

24
Q

What is the significance of the margin in SVMs?

A

A larger margin improves the generalization ability of the classifier.

25
Q

What are the key advantages of SVMs?

A

They provide flexible decision boundaries and handle high-dimensional data well.

26
Q

Why is SVM considered a nonparametric method?

A

Because it does not assume a fixed form for the underlying function.

27
Q

How are coefficients in SVM estimated?

A

By solving a convex optimization problem; the resulting coefficients are nonzero only for the support vectors.

28
Q

What is a major application area for SVMs?

A

Speech recognition, where they classify sounds into phonemes.

29
Q

How do SVMs improve the robustness of speech recognition systems?

A

By providing optimal separation and generalization across different speakers.

30
Q

Summarize the conclusion of SVMs.

A

SVMs find linear separators with the largest margin, and kernel functions help classify data in complex, high-dimensional spaces.