L5_KRR Flashcards

1
Q

Kernel methods can implicitly model arbitrarily complex hidden layers

A

Kernel methods can implicitly model arbitrarily complex hidden layers

2
Q

Multilayer Perceptrons (MLP) can

A

learn linearly non-separable problems
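A minimal sketch of this claim (not from the slides): a two-layer perceptron with hand-set weights solving XOR, the classic linearly non-separable problem. The weights are chosen by hand for illustration, not learned.

```python
def step(z):
    # Threshold activation
    return 1 if z > 0 else 0

def mlp_xor(x1, x2):
    # Hidden layer: two threshold units
    h1 = step(x1 + x2 - 0.5)   # fires if at least one input is 1 (OR)
    h2 = step(x1 + x2 - 1.5)   # fires only if both inputs are 1 (AND)
    # Output layer: OR minus AND gives XOR
    return step(h1 - h2 - 0.5)

for a, b in [(0, 0), (0, 1), (1, 0), (1, 1)]:
    print(a, b, mlp_xor(a, b))   # prints the XOR truth table: 0, 1, 1, 0
```

No single linear threshold unit can compute XOR, but one hidden layer suffices.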

3
Q

Universal Approximation Theorem

A

With enough intermediate variables, a network with a single hidden layer can approximate any reasonable function of the input.
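A rough numeric illustration (not the theorem's constructive proof): a single hidden layer of random tanh units, with only the output weights fit by least squares, closely approximates sin(x) on the training interval. Unit count and weight scale are arbitrary choices.

```python
import numpy as np

rng = np.random.default_rng(0)
x = np.linspace(-np.pi, np.pi, 200)[:, None]
y = np.sin(x).ravel()

# Single hidden layer: 100 random tanh units (weights fixed, not trained)
n_hidden = 100
W = rng.normal(size=(1, n_hidden)) * 3.0
b = rng.normal(size=n_hidden)
H = np.tanh(x @ W + b)               # hidden activations, shape (200, 100)

# Fit only the output weights by least squares
w_out, *_ = np.linalg.lstsq(H, y, rcond=None)
y_hat = H @ w_out

print(float(np.max(np.abs(y_hat - y))))   # small training error
```

More intermediate variables (hidden units) allow a better approximation, as the theorem suggests.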

4
Q

Kernelizing linear methods

A
1. Map the data into a (high-dimensional) feature space

2. Look for linear relations in that feature space
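The two steps above can be sketched with an explicit degree-2 feature map (one possible choice of map): a dot product in the feature space equals a polynomial kernel in the input space.

```python
import numpy as np

def phi(x):
    # Explicit degree-2 feature map for 2-D input:
    # phi(x) = (x1^2, sqrt(2)*x1*x2, x2^2)
    x1, x2 = x
    return np.array([x1**2, np.sqrt(2) * x1 * x2, x2**2])

x = np.array([1.0, 2.0])
z = np.array([3.0, 1.0])

# A linear operation (dot product) in feature space equals the
# polynomial kernel k(x, z) = (x . z)^2 in input space
lhs = phi(x) @ phi(z)
rhs = (x @ z) ** 2
print(lhs, rhs)   # both 25.0
```

Linear relations among the phi-features correspond to quadratic relations among the original inputs.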

5
Q

Kernel Trick

A

Any algorithm for vectorial data that can be expressed only in terms of scalar products between vectors can be performed implicitly in the feature space associated with any kernel, by replacing each scalar product by a kernel evaluation.
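A small sketch of the trick in action (kernel and points chosen for illustration): squared distances in feature space can be written purely in terms of kernel evaluations, so the feature map never has to be computed explicitly.

```python
import numpy as np

def rbf(x, z, gamma=0.5):
    # Gaussian (RBF) kernel: corresponds to an implicit,
    # infinite-dimensional feature space
    return np.exp(-gamma * np.sum((x - z) ** 2))

def feature_space_dist_sq(x, z):
    # ||phi(x) - phi(z)||^2 expanded into scalar products,
    # each replaced by a kernel evaluation:
    # k(x, x) - 2 k(x, z) + k(z, z)
    return rbf(x, x) - 2 * rbf(x, z) + rbf(z, z)

x = np.array([0.0, 1.0])
z = np.array([2.0, 0.0])
print(feature_space_dist_sq(x, z))
```

Any algorithm expressible through such scalar products (distances, projections, etc.) kernelizes the same way.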

6
Q

A big problem with high-dimensional feature spaces

A

When the dimensionality increases, the volume of the space grows so fast that the available data becomes sparse. The amount of data needed for a reliable result often grows exponentially with the dimensionality

7
Q

What about the curse of dimensionality?

Representer Theorem

A

In a regularized learning problem, the optimal weight vector w in feature space is a linear combination of the mapped training examples φ(xᵢ): w = Σᵢ αᵢ φ(xᵢ)
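A numeric check of the theorem (toy data; a linear kernel is used only so that w can be computed both ways): kernel ridge regression with α = (K + λI)⁻¹y yields w = Σᵢ αᵢ φ(xᵢ), which matches the primal ridge solution.

```python
import numpy as np

rng = np.random.default_rng(0)
X = rng.normal(size=(20, 3))          # 20 training points, 3 features
y = X @ np.array([1.0, -2.0, 0.5]) + 0.1 * rng.normal(size=20)
lam = 0.1

# Dual (kernel) solution with a linear kernel, so phi(x) = x:
# alpha = (K + lam*I)^{-1} y, where K_ij = <x_i, x_j>
K = X @ X.T
alpha = np.linalg.solve(K + lam * np.eye(20), y)

# Representer theorem: w = sum_i alpha_i * phi(x_i)
w_dual = X.T @ alpha

# Primal ridge solution for comparison: w = (X^T X + lam*I)^{-1} X^T y
w_primal = np.linalg.solve(X.T @ X + lam * np.eye(3), X.T @ y)

print(np.allclose(w_dual, w_primal))   # True
```

Note the dual solve is n × n (here 20 × 20) rather than d × d, which is why the curse of dimensionality in feature space need not hurt: only n coefficients αᵢ are ever fit.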

8
Q

Kernel methods are memory-based methods (3 properties):

Kernels as Similarity Measures

A
  • store the entire training set
  • define similarity of data points by kernel function
  • new predictions require comparison with previously learned examples
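The three bullets above can be sketched with a tiny kernel ridge regressor (toy 1-D data; kernel width and λ are arbitrary): the model stores the whole training set, and every prediction compares the new point against each stored example via the kernel.

```python
import numpy as np

def rbf(a, b, gamma=1.0):
    # Kernel function = similarity of two data points
    return np.exp(-gamma * (a - b) ** 2)

# Store the entire training set
X_train = np.array([-2.0, -1.0, 0.0, 1.0, 2.0])
y_train = X_train ** 2
lam = 1e-3

K = rbf(X_train[:, None], X_train[None, :])
alpha = np.linalg.solve(K + lam * np.eye(len(X_train)), y_train)

def predict(x):
    # A new prediction compares x with every stored example:
    # f(x) = sum_i alpha_i * k(x_i, x)
    return sum(a * rbf(xi, x) for a, xi in zip(alpha, X_train))

print(predict(0.5))
```

The stored examples never go away: prediction cost grows with the size of the training set.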
9
Q

When do humans perceive stimuli as similar?

A

Perceptual similarity of a new stimulus x decays exponentially with its distance from a stored prototype μ
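This decay is exactly the shape of the Gaussian kernel; a minimal sketch (σ is an arbitrary width parameter):

```python
import math

def perceived_similarity(x, mu, sigma=1.0):
    # Similarity to prototype mu decays exponentially with
    # squared distance -- the Gaussian kernel as a similarity measure
    d2 = sum((a - b) ** 2 for a, b in zip(x, mu))
    return math.exp(-d2 / (2 * sigma ** 2))

mu = (0.0, 0.0)
print(perceived_similarity((0.0, 0.0), mu))  # 1.0 at the prototype itself
print(perceived_similarity((2.0, 0.0), mu))  # smaller; decays with distance
```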

10
Q

Kernel Methods - Pros

A

+ Powerful modeling tool
(non-linear problems become linear in kernel space)
+ Omnipurpose kernels
(the Gaussian kernel works well in many cases)
+ Kernel methods can handle symbolic objects
+ When you have fewer data points than your data has dimensions, kernel methods can offer a dramatic speedup
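The symbolic-objects point can be sketched with a simple string kernel (a k-spectrum kernel, shown here as one illustrative choice): a kernel only needs to define similarity, so non-vectorial data such as strings works too.

```python
from collections import Counter

def spectrum_kernel(s, t, k=2):
    # k-spectrum kernel: counts matching length-k substrings.
    # The implicit feature map sends a string to its k-mer counts.
    cs = Counter(s[i:i + k] for i in range(len(s) - k + 1))
    ct = Counter(t[i:i + k] for i in range(len(t) - k + 1))
    return sum(cs[g] * ct[g] for g in cs)

# "banana" and "ananas" share many 2-mers ("an", "na"), so they
# are similar under this kernel
print(spectrum_kernel("banana", "ananas"))   # 8
```

Any kernelized algorithm (SVM, kernel ridge regression, …) can then run directly on strings through this similarity.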

11
Q

Kernel Methods - Cons

A

– Difficult to understand what’s happening in kernel space
– Model complexity grows with the number of data points
→ With too much data, kernel methods can become slow
