Lecture 11 Flashcards

1
Q

What is the observation vector?

A

the vector of observed values

2
Q

What is the hypothesis vector?

A

the vector of predicted values

3
Q

What is the error vector?

A

the vector of signed errors

4
Q

What is the equation for the error vector?

A

e_i = y_i - H(x_i)

5
Q

How do we rewrite the mean squared error of H?

A

1/n ||y - h||^2

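The identity on this card can be checked with a short pure-Python sketch; the data values here are made up for illustration:

```python
# Mean squared error written as (1/n) * ||y - h||^2,
# where h is the hypothesis vector of predictions H(x_i).
# The data below is illustrative, not from the lecture.

y = [2.0, 4.0, 6.5]          # observation vector
h = [2.5, 3.5, 6.0]          # hypothesis (prediction) vector

# Error vector: e_i = y_i - H(x_i)
e = [yi - hi for yi, hi in zip(y, h)]

# Squared norm ||e||^2 = sum of squared entries
sq_norm = sum(ei ** 2 for ei in e)

# Mean squared error = (1/n) * ||y - h||^2
mse = sq_norm / len(y)
print(mse)  # 0.25
```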
6
Q

How can the linear hypothesis function H(x) = w_0 + w_1x be written as a vector?

A

an n x 1 vector whose entries are w_0 + w_1x_1, w_0 + w_1x_2, …, w_0 + w_1x_n

7
Q

What is the notation for the design matrix?

A

X ∈ R^(n x 2)

8
Q

What does the design matrix look like?

A

an n x 2 matrix whose first column is all 1’s and whose second column contains the features x_1, x_2, …, x_n

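Building the design matrix described on this card takes one line in Python; a minimal sketch with made-up feature values:

```python
# The n x 2 design matrix X: first column is all 1's
# (pairing with the intercept w_0), second column holds
# the features x_1, ..., x_n. Feature values are illustrative.

xs = [1.0, 2.0, 3.0, 4.0]

X = [[1.0, x] for x in xs]

for row in X:
    print(row)
```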
9
Q

What is the parameter vector’s notation?

A

w ∈ R^2

10
Q

What are the variables in the parameter vector?

A

w_0 and w_1

11
Q

What is the mean squared error for the design matrix?

A

R_sq(w) = 1/n ||y - Xw||^2

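Evaluating R_sq(w) = 1/n ||y - Xw||^2 through the design matrix can be sketched in pure Python; the data and parameter values are illustrative:

```python
# R_sq(w) = (1/n) * ||y - Xw||^2 for the two-parameter linear model.
# Data and parameters below are made up for illustration.

xs = [1.0, 2.0, 3.0]
y = [2.0, 4.0, 6.5]
w = [0.5, 1.8]               # parameter vector [w_0, w_1]

X = [[1.0, x] for x in xs]   # n x 2 design matrix

# Xw: matrix-vector product; entry i is w_0 + w_1 * x_i
Xw = [row[0] * w[0] + row[1] * w[1] for row in X]

# R_sq(w) = (1/n) * sum_i (y_i - (Xw)_i)^2
r_sq = sum((yi - pi) ** 2 for yi, pi in zip(y, Xw)) / len(y)
print(r_sq)
```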
12
Q

What is the equation for the optimal parameter vector?

A

w* = [w_0* w_1*]^T

13
Q

What is the minimizer of R_sq(w)?

A

w* = argmin_w ||e|| = argmin_w ||y - Xw|| (minimizing the norm of the error vector also minimizes R_sq)

14
Q

What do we need to know about X^TX in order for it to be invertible?

A

it has to be full rank

15
Q

What does full rank mean?

A

all columns are linearly independent

16
Q

What happens if X^TX is not full rank?

A

there are infinitely many solutions to the normal equations

17
Q

What input w* minimizes R_sq(w)?

A

any w* that satisfies the normal equations, X^T X w* = X^T y

18
Q

What is the unique solution if X^TX is invertible?

A

w* = (X^TX)^-1X^Ty
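The closed-form solution on this card can be computed directly in the two-parameter case by inverting the 2x2 matrix X^T X by hand; a pure-Python sketch, assuming X^T X is full rank, with made-up data:

```python
# Solving w* = (X^T X)^{-1} X^T y for the two-parameter model.
# For a design matrix with columns [1, x]:
#   X^T X = [[n, Sx], [Sx, Sxx]],   X^T y = [Sy, Sxy]
# Data values below are illustrative.

xs = [1.0, 2.0, 3.0, 4.0]
y = [3.1, 4.9, 7.2, 8.8]

n = len(xs)
Sx = sum(xs)
Sxx = sum(x * x for x in xs)
Sy = sum(y)
Sxy = sum(x * yi for x, yi in zip(xs, y))

# det(X^T X); nonzero exactly when X^T X is full rank (invertible)
det = n * Sxx - Sx * Sx

# (X^T X)^{-1} = (1/det) * [[Sxx, -Sx], [-Sx, n]], applied to X^T y:
w0 = (Sxx * Sy - Sx * Sxy) / det
w1 = (n * Sxy - Sx * Sy) / det

print(w0, w1)
```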

19
Q

What is a function of a vector?

A

it is a function of multiple variables, which are the components of the vector

20
Q

What is the gradient of R_sq(w) with respect to w?

A

the vector of partial derivatives ∂R_sq/∂w_0, ∂R_sq/∂w_1, …, ∂R_sq/∂w_d, where w_0, w_1, …, w_d are the entries of the vector w

21
Q

What is the equivalent equation for R_sq(w) = 1/n ||y - Xw||^2?

A

1/n (y - Xw)^T (y - Xw)

22
Q

What does (AB)^T equal?

A

B^T A^T

23
Q

What does d/dw (y^T y) equal?

A

0 (the zero vector, since y^T y does not depend on w)
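The last several cards fit together into one derivation of the normal equations; a sketch of the steps, using the definitions above:

```latex
R_{sq}(w) = \frac{1}{n}\lVert y - Xw \rVert^2
          = \frac{1}{n}(y - Xw)^T (y - Xw)
          = \frac{1}{n}\left( y^T y - 2 w^T X^T y + w^T X^T X w \right)

\nabla_w R_{sq}(w) = \frac{1}{n}\left( -2 X^T y + 2 X^T X w \right)

\text{Setting the gradient to zero gives the normal equations:}
\quad X^T X w^* = X^T y

\text{If } X^T X \text{ is invertible: } w^* = (X^T X)^{-1} X^T y
```

The cross terms combine because y^T X w is a scalar, so it equals its own transpose w^T X^T y, which is where the factor of 2 comes from.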