lecture 5: least squares and linear regression Flashcards

1
Q

what is a function

A

a relation that associates each element x of a set X, the domain of the function, to a single element y of another set Y, the codomain of the function

2
Q

what kinds of arguments can scalar and vector functions have?

A

both scalar-valued and vector-valued functions can take either scalar or vector arguments

3
Q

what does a scalar-valued function of d-vectors do?

A

it maps real d-vectors to real numbers, i.e. f: ℝᵈ → ℝ (for example, the sum of the squares of the entries of x)

4
Q

a linear function has to satisfy which 2 properties

A

homogeneity and additivity, or in other words, superposition

5
Q

what does homogeneity mean

A

scaling the vector argument scales the function value by the same factor: f(αx) = αf(x)

6
Q

what does additivity mean

A

adding vector arguments adds the function values: f(x + y) = f(x) + f(y)

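as a concrete check of both properties, here is a minimal NumPy sketch (an illustration of my own, not from the lecture) for the canonical linear function f(x) = aᵀx:

```python
import numpy as np

# f(x) = a^T x, the canonical linear function of d-vectors
a = np.array([1.0, -2.0, 0.5])
f = lambda x: a @ x

x = np.array([0.3, 1.1, -0.7])
y = np.array([2.0, 0.4, 1.5])
alpha, beta = 2.5, -1.2

print(np.isclose(f(alpha * x), alpha * f(x)))  # homogeneity: True
print(np.isclose(f(x + y), f(x) + f(y)))       # additivity: True
# superposition combines both properties in one identity
print(np.isclose(f(alpha * x + beta * y), alpha * f(x) + beta * f(y)))  # True
```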
7
Q

a function is affine if and only if

A

it can be expressed as f(x) = aᵀx + b for some d-vector a and some scalar b, where b is called the offset

8
Q

graphically, an affine function will have a

A

y-intercept equal to the offset b (whereas a linear function passes through the origin)

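a small sketch (my own example values) showing that the offset b is exactly the value of the affine function at the origin, i.e. its y-intercept:

```python
import numpy as np

a, b = np.array([2.0, -1.0]), 3.0
f = lambda x: a @ x + b    # affine: linear part a^T x plus offset b
g = lambda x: a @ x        # linear: same slope, zero offset

print(f(np.zeros(2)))      # 3.0 -> f(0) = b is the y-intercept
print(g(np.zeros(2)))      # 0.0 -> a linear function passes through the origin
```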
9
Q

f(x) has a local minimum at x = c if

A

f(x) ≥ f(c) for every x in some open interval around x = c

10
Q

what is an open interval

A

it is an interval that does not include its endpoints, denoted with parentheses, e.g. (a, b) = {x : a < x < b}

11
Q

the global minimum is

A

the smallest value among all the local minima, i.e. the smallest value the function attains over its entire domain

12
Q

what does argmax f(a) do

A

it returns the element a of the set being searched at which f(a) attains its maximum value, i.e. the maximiser rather than the maximum itself

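to make the max/argmax distinction concrete, a small NumPy sketch (my illustration, searching a finite grid of candidate points):

```python
import numpy as np

f = lambda a: -(a - 2.0) ** 2 + 3.0     # peaks at a = 2.0 with value 3.0
candidates = np.linspace(0.0, 4.0, 9)   # the finite set we maximise over

i = np.argmax(f(candidates))
print(candidates[i])     # 2.0 <- argmax: the maximising element
print(f(candidates)[i])  # 3.0 <- max: the maximum value itself
```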
13
Q

differentiating a scalar function w.r.t. a d x 1 vector results in a

A

d x 1 vector (the gradient)

14
Q

differentiating a vector function of size h x 1 with respect to a d x 1 vector results in a

A

h x d matrix (the Jacobian)

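the two shape rules from the last two cards can be verified on simple functions whose derivatives are known in closed form; a sketch (my own example):

```python
import numpy as np

d, h = 3, 2
x = np.array([1.0, 2.0, 3.0])    # a d-vector

# scalar function f(x) = x^T x has gradient 2x, a d-vector
grad = 2 * x
print(grad.shape)                # (3,) -> d x 1

# vector function g(x) = A x (an h-vector) has Jacobian A, an h x d matrix
A = np.arange(6.0).reshape(h, d)
print(A.shape)                   # (2, 3) -> h x d
```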
15
Q

what is the purpose of a learning objective function

A

to find the optimal values of w and b that minimise ∑ᵢ(f(xᵢ) − yᵢ)²
the loss function in this case is the squared error loss, and the cost function is the average of the squared error losses over the training examples

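a minimal sketch (with a made-up toy dataset) of the squared error loss per example and the average-loss cost for the linear model f(x) = wᵀx + b:

```python
import numpy as np

def cost(w, b, X, y):
    """Average squared error of the linear model f(x) = w^T x + b."""
    preds = X @ w + b              # one prediction per row of X
    losses = (preds - y) ** 2      # squared error loss, one per example
    return losses.mean()           # cost = average of the losses

X = np.array([[0.0, 1.0], [1.0, 0.0], [1.0, 1.0], [2.0, 1.0]])
y = np.array([1.0, 2.0, 3.0, 5.0])
print(cost(np.array([2.0, 1.0]), 0.0, X, y))  # 0.0 -> this w, b fits exactly
```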
16
Q

what is the formula for w in least squares regression when the residual is e = Xw − y (the objective being ‖e‖²)

A

w = (XᵀX)⁻¹Xᵀy (when XᵀX is invertible) OR w = Xᵀ(XXᵀ)⁻¹y (the minimum-norm solution, when XXᵀ is invertible)

then predict using ŷ = Xw
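a sketch of the closed-form solution in NumPy (normal-equations form, with np.linalg.lstsq as the numerically safer route in practice); here X is tall, so XᵀX is invertible:

```python
import numpy as np

X = np.array([[1.0, 0.0], [0.0, 1.0], [1.0, 1.0], [2.0, 1.0]])
y = np.array([1.1, 0.9, 2.1, 3.0])

# normal equations: w = (X^T X)^{-1} X^T y
w = np.linalg.solve(X.T @ X, X.T @ y)

# equivalent, and preferred numerically:
w_lstsq, *_ = np.linalg.lstsq(X, y, rcond=None)
print(np.allclose(w, w_lstsq))   # True

y_hat = X @ w                    # predict with y_hat = Xw
```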