Adaptive basis function models Flashcards

1
Q

What is a basis function, and what is an example of using domain knowledge to choose one for a model?

A

A map from the input space to the reals. E.g. gravitation: if we were trying to learn the force from a dataset of masses and radii, we could construct the basis function m1·m2/r², which would be a far more informative feature than the raw inputs (since F = G·m1·m2/r²).
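
A minimal sketch of this idea (toy data and names like phi are assumed here, not part of the card): build the engineered feature and fit a linear model to it.

```python
import numpy as np
from sklearn.linear_model import LinearRegression

rng = np.random.default_rng(0)
m1 = rng.uniform(1, 10, 100)                 # toy masses
m2 = rng.uniform(1, 10, 100)
r = rng.uniform(1, 5, 100)                   # toy radii
G = 6.674e-11
F = G * m1 * m2 / r ** 2                     # toy "observed" forces

phi = (m1 * m2 / r ** 2).reshape(-1, 1)      # domain-knowledge basis function
model = LinearRegression().fit(phi, F)       # linear in the engineered feature
print(model.coef_)                           # recovers G (up to numerical error)
```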

2
Q

Explain the aim of an adaptive basis function model

A

We want to learn basis functions from the data. Our prediction functions will be linear combinations of our basis functions.
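
As a sketch of the form this takes (the notation w_m, phi_m, theta_m is assumed here, not from the card):

```latex
f(x) = w_0 + \sum_{m=1}^{M} w_m \, \phi_m(x;\, \theta_m)
```

Both the weights w_m and the basis function parameters theta_m are learned from the data.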

3
Q

In an adaptive basis function model, what is the empirical risk we wish to minimise?

A

The sum (or average) of the losses of the prediction function over the training points, where the prediction function is a linear combination of the basis functions; we minimise it jointly over the combination weights and the basis function parameters.
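
Written out under the same assumed notation as above, with n training points and loss ℓ:

```latex
J(w, \theta) = \frac{1}{n} \sum_{i=1}^{n} \ell\!\Big( y_i,\; \sum_{m=1}^{M} w_m \, \phi_m(x_i;\, \theta_m) \Big)
```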

4
Q

Can we use gradient descent to minimise this risk in general?

A

In general, no. Only if each basis function is parameterised by weights and the objective function is differentiable with respect to those weights.
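
A minimal sketch of that differentiable case (all names and the toy data are assumed): a single tanh basis function phi(x; theta) = tanh(theta·x), combined with a weight w, trained by gradient descent on squared loss. Everything is differentiable in both w and theta, so plain gradient descent works.

```python
import numpy as np

rng = np.random.default_rng(1)
X = rng.normal(size=(200, 3))
y = 3.0 * np.tanh(X @ np.array([1.0, -2.0, 0.5]))     # toy target

w, theta = 0.1, np.zeros(3)
lr = 0.05
for _ in range(2000):
    phi = np.tanh(X @ theta)                           # basis function values
    resid = w * phi - y                                # prediction errors
    grad_w = 2 * np.mean(resid * phi)                  # d(risk)/dw
    grad_theta = 2 * (resid * w * (1 - phi ** 2)) @ X / len(X)  # chain rule
    w -= lr * grad_w
    theta -= lr * grad_theta

print(w, theta)                                        # w*phi(x; theta) approximates y
```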

5
Q

Describe the algorithm of forward stagewise additive modelling.

A

- Start with a base hypothesis space of weak/base learners
- Initialise the prediction function, e.g. f_0 = 0
- At each stage, find the base learner (and its coefficient) which, when added to the current prediction function, minimises the empirical risk
- Add it to the prediction function and continue for a fixed number of stages (see the sketch below)
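
A minimal sketch (toy data and settings assumed, not from the card): forward stagewise additive modelling with squared loss, where fitting the stage-m base learner reduces to fitting the current residuals y - f_{m-1}(x).

```python
import numpy as np
from sklearn.tree import DecisionTreeRegressor

rng = np.random.default_rng(2)
X = rng.uniform(-3, 3, size=(300, 1))
y = np.sin(X[:, 0]) + 0.1 * rng.normal(size=300)   # toy regression problem

f = np.zeros(300)                                  # f_0 = 0
learners = []
for m in range(50):
    residual = y - f                               # what is left to explain
    stump = DecisionTreeRegressor(max_depth=1).fit(X, residual)
    learners.append(stump)
    f += stump.predict(X)                          # f_m = f_{m-1} + h_m

print(np.mean((y - f) ** 2))                       # training MSE after 50 stages
```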

6
Q

What are some possible weak learners?

A

Shallow decision trees, e.g. decision stumps (trees of depth 1)

7
Q

Explain gradient boosting

A

The update step can be found by noting that the objective function J depends on the prediction function only through its values at the training points. We can therefore take the gradient of J with respect to f(x_i) at each training point; the negative gradient (the pseudo-residuals) gives the values that the next basis function should be closest to. We then fit a base learner from our hypothesis space to these pseudo-residuals (e.g. by least squares) and use a step in its direction as the update.
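
A minimal sketch (toy data, shrinkage value, and tree depth all assumed): gradient boosting with absolute-error loss. The negative gradient of |y_i - f(x_i)| with respect to f(x_i) is sign(y_i - f(x_i)); each stage fits a shallow tree to these pseudo-residuals and takes a small step in that direction.

```python
import numpy as np
from sklearn.tree import DecisionTreeRegressor

rng = np.random.default_rng(3)
X = rng.uniform(-3, 3, size=(300, 1))
y = np.sin(X[:, 0]) + 0.1 * rng.normal(size=300)   # toy regression problem

f = np.zeros(300)                                  # f_0 = 0
nu = 0.1                                           # shrinkage / step size
for m in range(200):
    pseudo_resid = np.sign(y - f)                  # negative gradient of the absolute loss
    tree = DecisionTreeRegressor(max_depth=2).fit(X, pseudo_resid)
    f += nu * tree.predict(X)                      # step along the fitted negative gradient

print(np.mean(np.abs(y - f)))                      # training MAE after 200 stages
```

With squared loss the pseudo-residuals are just the ordinary residuals, which recovers the forward stagewise sketch above.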
