Topic 3: The Geometry of Linear Regression Flashcards

1
Q

What is the difference between R^n and Euclidean n-space, E^n?

A
  • R^n is simply the set of all n-vectors.
  • Euclidean space E^n has more operations (like scalar products and lengths) defined on it.
2
Q

What is the meaning of the operation <x, y>?

A

Called the scalar or inner product of the two vectors:

<x, y> = x'y = x1y1 + x2y2 + … + xnyn

Note that this operation is commutative: <x, y> = <y, x>.
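A minimal numerical sketch of this in NumPy, with made-up example vectors:

```python
import numpy as np

x = np.array([1.0, 2.0, 3.0])  # made-up example vectors
y = np.array([4.0, 0.0, -1.0])

# Scalar (inner) product: <x, y> = x'y = sum_i x_i * y_i
print(x @ y)   # 1.0
print(y @ x)   # same value: the operation is commutative
```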

3
Q

What is the meaning of the operation ||x||?

A

The length (norm) of the vector x.

In Euclidean space this is calculated by:

||x|| = <x, x>^(1/2) = (x1² + x2² + … + xn²)^(1/2)
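A quick check of the same formula in NumPy (the vector is a made-up example):

```python
import numpy as np

x = np.array([3.0, 4.0])  # made-up example

# ||x|| = <x, x>^(1/2)
print(np.sqrt(x @ x))      # 5.0
print(np.linalg.norm(x))   # same result via NumPy's built-in norm
```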

4
Q

How is the scalar product of two vectors related to the angle between them?

A

cos θ = <x, y> / (||x|| ||y||)

Note that cos θ is 1 when the vectors are parallel, and zero when they are at right angles.
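A small NumPy sketch recovering the angle from this formula (the vectors are arbitrary examples chosen to sit at 45 degrees):

```python
import numpy as np

x = np.array([1.0, 0.0])
y = np.array([1.0, 1.0])   # 45 degrees from x

# cos(theta) = <x, y> / (||x|| ||y||)
cos_theta = (x @ y) / (np.linalg.norm(x) * np.linalg.norm(y))
print(np.degrees(np.arccos(cos_theta)))  # 45.0 (up to floating point)
```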

5
Q

What does <x,y> = 0 imply?

A

That if x & y are nonzero vectors, they are at right angles.

They are said to be orthogonal, or perpendicular.

6
Q

What is the Cauchy-Schwarz inequality?

A

|<x, y>| ≤ ||x|| ||y||,

which follows from the relationship between scalar products and lengths, and the fact that the cosine of an angle lies between −1 and 1.
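A quick NumPy check of the inequality on arbitrary random vectors:

```python
import numpy as np

rng = np.random.default_rng(0)
x, y = rng.normal(size=5), rng.normal(size=5)  # arbitrary vectors

# |<x, y>| <= ||x|| * ||y||  (equality only when x and y are parallel)
print(abs(x @ y) <= np.linalg.norm(x) * np.linalg.norm(y))  # True
```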

7
Q

What is a subspace? How can it be defined?

A

A subspace can be defined by a set of basis vectors.

The subspace associated with the basis vectors x1, …, xk is denoted S(x1, …, xk).

The basis vectors are said to span the subspace, which in general is k-dimensional.

The subspace consists of every vector that can be formed as a linear combination of the basis vectors.
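A minimal NumPy sketch of a spanned subspace, using two made-up basis vectors in R^3:

```python
import numpy as np

# Two basis vectors in R^3 span a 2-dimensional subspace
x1 = np.array([1.0, 0.0, 0.0])
x2 = np.array([0.0, 1.0, 0.0])

# Any linear combination b1*x1 + b2*x2 lies in S(x1, x2)
v = 2.0 * x1 - 3.0 * x2
print(v)  # [ 2. -3.  0.] -- the third coordinate is always 0 in this subspace
```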

8
Q

What is an orthogonal complement?

A

Denoted S⊥(X): the subspace consisting of every vector that is orthogonal to everything in another (given) subspace S(X).

9
Q

Give a diagram showing the geometry of a linear regression model, as taken straight from the population. i.e. Parameters, not estimates. Explain.

A

This diagram follows from the two-dimensional case.

This is because:

yi = Xiβ + ui

where yi is the ith element of the column vector y, Xi is the ith row of X, and so on.

So along the first axis, the first coordinate of y (the first observation) equals the first coordinate of Xβ plus the first coordinate of u. The same holds for the second axis, and every additional observation adds another dimension.

Remember that the first coordinate of Xβ need not be zero; the triangle is simply drawn from the perspective of Xβ.

10
Q

What is the meaning of the coordinates of a point in the subspace S(X), where X is the matrix of variables from a regression model?

A

The coordinates are the values of the different explanatory variables in one observation.

11
Q

What subspace does the n-vector Xβ belong to?

A

The subspace S(X), as postmultiplying X by the column vector β merely picks out a linear combination of the columns of X.

12
Q

What is the geometric interpretation of the method of moments condition X'(y − Xβ̂) = 0?

A

Each row of this equation is a scalar product.

From the rule for selecting a single row of a matrix product, we get:

<xi, y − Xβ̂> = xi'(y − Xβ̂) = 0,

where xi is the ith column of X (such that, when transposed, we have selected the ith row of X').

Geometrically, the residual vector y − Xβ̂ is orthogonal to every column of X, and hence to the whole subspace S(X).
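A minimal NumPy sketch of this condition; the data below are made up, and the estimator is computed by solving the normal equations X'Xb = X'y:

```python
import numpy as np

rng = np.random.default_rng(1)
X = rng.normal(size=(10, 2))                       # 10 observations, 2 regressors
y = X @ np.array([1.5, -0.5]) + rng.normal(size=10)

# MM / OLS estimator solves X'(y - Xb) = 0, i.e. the normal equations X'Xb = X'y
beta_hat = np.linalg.solve(X.T @ X, X.T @ y)

# Each row of X'(y - Xb) is the scalar product <x_i, y - Xb>, and all are ~0
residuals = y - X @ beta_hat
print(X.T @ residuals)   # numerically zero vector
```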

13
Q

Use a diagram to explain the geometric interpretation of method of moments estimation, with three observations.

A

This means we have two regressors.

Note that figure a) has three dimensions, one for each of the three observations, and that although x1 and x2 are drawn flat in the plane of the page, y is certainly not the only vector with a nonzero third coordinate. (Changing the viewing angle of the diagram does not affect the relationship between the variables or the estimation process.)

14
Q

What is the implication of the following diagram, which follows from the relationship between the lengths of the vectors?

A

Total sum of squares = Explained sum of squares + Sum of squared residuals.

TSS = ESS + SSR

From the Pythagorean relationship between the lengths of the vectors:

||y||² = ||Xβ̂||² + ||y − Xβ̂||²
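A numerical check of this decomposition with made-up data, relying on the fact that the fitted values and residuals are orthogonal:

```python
import numpy as np

rng = np.random.default_rng(2)
X = rng.normal(size=(10, 2))
y = X @ np.array([1.0, 2.0]) + rng.normal(size=10)

beta_hat = np.linalg.solve(X.T @ X, X.T @ y)
fitted, resid = X @ beta_hat, y - X @ beta_hat

# Pythagoras: ||y||^2 = ||Xb||^2 + ||u_hat||^2, i.e. TSS = ESS + SSR
print(y @ y, fitted @ fitted + resid @ resid)  # the two numbers agree
```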

15
Q

Explain descriptively the orthogonal projection.

A

Maps points onto the closest point of the target subspace.

16
Q

What is the formula for an orthogonal projection?

A

P_X = X(X'X)⁻¹X'

This matrix projects any n-vector orthogonally onto S(X).
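A minimal NumPy sketch of this formula with made-up data:

```python
import numpy as np

rng = np.random.default_rng(3)
X = rng.normal(size=(6, 2))

# Orthogonal projection onto S(X): P_X = X (X'X)^(-1) X'
P = X @ np.linalg.inv(X.T @ X) @ X.T

y = rng.normal(size=6)
print(P @ y)  # the point in S(X) closest to y
```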
17
Q

Show that calculating estimates of y with the method of moments estimator is the same as projecting y orthogonally onto S(X).

A

The MM condition X'(y − Xβ̂) = 0 gives β̂ = (X'X)⁻¹X'y, so the fitted values are

Xβ̂ = X(X'X)⁻¹X'y = P_X y.

Calculating the fitted values is therefore the same as projecting y orthogonally onto S(X).
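A numerical check of this equivalence, with made-up data:

```python
import numpy as np

rng = np.random.default_rng(4)
X = rng.normal(size=(8, 2))
y = rng.normal(size=8)

beta_hat = np.linalg.solve(X.T @ X, X.T @ y)   # MM / OLS estimator
P = X @ np.linalg.inv(X.T @ X) @ X.T           # projection onto S(X)

# Fitted values from the estimator coincide with the projection of y onto S(X)
print(np.allclose(X @ beta_hat, P @ y))  # True
```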
18
Q

What is the result of P_X Xb,

where P_X is an orthogonal projection matrix and b is a column vector of arbitrary values?

A

P_X X = X

So P_X Xb = Xb

A new column vector made from a linear combination of the columns of X lies in the subspace S(X), and hence will not be affected by the orthogonal projection (the closest point to Xb in S(X) is Xb itself).

19
Q

What is the formula for M_X?

What does it yield?

A

M_X = I − P_X

It projects onto the orthogonal complement of S(X).

It can be thought of as computing the displacement vector between the original point and its projection. In the MM regression context, this would be the vector of residuals.

20
Q

What are some properties of orthogonal projections?

A
  1. Idempotent.
  2. Symmetric.
  3. Complementary projections annihilate each other.
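A quick NumPy check of all three properties for a made-up X:

```python
import numpy as np

rng = np.random.default_rng(5)
X = rng.normal(size=(6, 2))
P = X @ np.linalg.inv(X.T @ X) @ X.T
M = np.eye(6) - P

print(np.allclose(P @ P, P), np.allclose(M @ M, M))  # idempotent
print(np.allclose(P, P.T))                           # symmetric
print(np.allclose(P @ M, np.zeros((6, 6))))          # complements annihilate
```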
21
Q

What is the result of P_X M_X?

A

0,

A feature of all complementary projections, for which:

P_X + M_X = I

22
Q

What is the idempotency property of orthogonal projections?

A

That P_X P_X = P_X

and M_X M_X = M_X

23
Q

What is the result of M_X(Xb)?

A

0,

which follows from M_X = I − P_X: since P_X Xb = Xb, we have M_X Xb = Xb − P_X Xb = 0.

24
Q

For any vectors w and z,

What is the result of <P_X w, M_X z>?

A

Zero, since <P_X w, M_X z> = w'P_X M_X z = 0, because P_X is symmetric and P_X M_X = 0.

25
Q

How can y be decomposed orthogonally?

A

By the equation:

y = P_X y + M_X y

26
Q

What are the ranks associated with the terms of the following?

y = P_X y + M_X y

A

When X is of full rank, its rank is k, so P_X has rank k and P_X y lies in the k-dimensional subspace S(X).

y is an unrestricted vector in the full n-dimensional space.

M_X then has rank n − k, so M_X y lies in the (n − k)-dimensional orthogonal complement S⊥(X).
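A numerical check of these ranks, with made-up dimensions n = 8 and k = 3:

```python
import numpy as np

rng = np.random.default_rng(6)
n, k = 8, 3
X = rng.normal(size=(n, k))
P = X @ np.linalg.inv(X.T @ X) @ X.T
M = np.eye(n) - P

print(np.linalg.matrix_rank(P))  # k = 3
print(np.linalg.matrix_rank(M))  # n - k = 5
```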

27
Q

What is a non-singular linear transformation?

What are the properties of a matrix transformed in this way?

A

The postmultiplication of X, an n × k matrix of full rank, by A, a nonsingular k × k matrix.

XA spans the same subspace as X: S(XA) = S(X).

(This holds for postmultiplication only; the product AX is not even conformable unless n = k.)

28
Q

What changes in a regression when a non-singular linear transformation is performed on the regressors?

i.e.

y = XAβ + u

A

XA spans the same subspace as X: S(XA) = S(X).

While the coefficient vector changes (the estimates become A⁻¹β̂, where β̂ is the estimate from regressing y on X), the residuals and the fitted values do not, as these depend only on the projection of y onto S(X) = S(XA).
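A numerical sketch of this invariance, using an arbitrary nonsingular A (all data made up):

```python
import numpy as np

rng = np.random.default_rng(7)
X = rng.normal(size=(10, 2))
y = rng.normal(size=10)
A = np.array([[2.0, 1.0], [0.0, 3.0]])   # an arbitrary nonsingular 2x2 matrix

b1 = np.linalg.solve(X.T @ X, X.T @ y)       # regress y on X
XA = X @ A
b2 = np.linalg.solve(XA.T @ XA, XA.T @ y)    # regress y on XA

print(np.allclose(X @ b1, XA @ b2))              # True: fitted values unchanged
print(np.allclose(b2, np.linalg.solve(A, b1)))   # coefficients map as A^(-1) b
```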