Lecture 4 Flashcards

1
Q

State the relation between two sufficient conditions for UI.

A
2
Q

State and prove the relation between two sufficient conditions for UI.

A
3
Q

Are transformations of independent variables also independent?

Are transformations of uncorrelated variables also uncorrelated?

A
  1. Transformations of independent variables are also independent.
  2. Transformations of uncorrelated variables are NOT also uncorrelated.
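
A numerical sketch of item 2 (an illustrative counterexample, not from the lecture): take X ~ N(0,1) and Y = X². Then Cov(X, Y) = E[X³] = 0, so X and Y are uncorrelated, yet the transformations g(X) = X² and h(Y) = Y coincide and are perfectly correlated.

```python
import numpy as np

# X ~ N(0,1) and Y = X^2 are uncorrelated (Cov(X, X^2) = E[X^3] = 0),
# but the transformed variables g(X) = X^2 and h(Y) = Y are perfectly correlated.
rng = np.random.default_rng(0)
x = rng.standard_normal(100_000)
y = x**2

corr_xy = np.corrcoef(x, y)[0, 1]     # near 0: X and Y uncorrelated
corr_gh = np.corrcoef(x**2, y)[0, 1]  # 1: g(X) and h(Y) are identical here
print(corr_xy, corr_gh)
```

Note that item 1 still holds: if X and Y were independent, g(X) and h(Y) would be independent too; the counterexample is possible only because X and X² are merely uncorrelated, not independent.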
4
Q

State, formally and intuitively, the defining condition for martingale difference sequences.

A

Intuitively: the expectation of the current term, given the past, is 0.
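
A sketch of the formal condition (standard definition; the lecture's notation may differ):

```latex
% \{u_i\} is a martingale difference sequence with respect to the
% filtration \mathcal{F}_{i-1} = \sigma(u_1, \dots, u_{i-1}) if
\mathbb{E}\,\lvert u_i \rvert < \infty
\quad \text{and} \quad
\mathbb{E}\bigl[\, u_i \mid \mathcal{F}_{i-1} \,\bigr] = 0
\quad \text{for all } i .
```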

5
Q

State the relationship between independence, martingale differences, and uncorrelatedness.

A
  1. Independence implies martingale difference.
  2. Martingale difference implies uncorrelated.
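
A hedged numerical illustration (not from the lecture) that implication 1 is strict: with e_i iid N(0,1), the sequence u_i = e_{i-1} e_i satisfies E[u_i | past] = e_{i-1} E[e_i] = 0, so it is a martingale difference sequence and hence uncorrelated, yet u_i and u_{i+1} are not independent, since both contain e_i.

```python
import numpy as np

# u_i = e_{i-1} * e_i is an MDS (hence serially uncorrelated),
# but consecutive terms are dependent: |u_i| and |u_{i+1}| share |e_i|.
rng = np.random.default_rng(1)
e = rng.standard_normal(200_001)
u = e[:-1] * e[1:]

corr_lag1 = np.corrcoef(u[:-1], u[1:])[0, 1]                  # ~ 0: uncorrelated
corr_abs = np.corrcoef(np.abs(u[:-1]), np.abs(u[1:]))[0, 1]   # clearly > 0: dependent
print(corr_lag1, corr_abs)
```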
6
Q

State and prove the relationship between uncorrelatedness and martingale difference sequences.

A
7
Q

Formally state the WLLN for independent UI sequences.

A
8
Q

Formally state and prove the WLLN for independent UI sequences.

A
9
Q

State a relaxation for one of the assumptions of the WLLN for independent UI sequences.

A
10
Q

How do you use the WLLN for independent UI sequences when the mean is not equal to 0? Provide all the details.

A
11
Q

State Khinchin’s WLLN.

A
12
Q

State and prove Khinchin’s WLLN.

A
13
Q

State Kolmogorov’s SLLN.

A
14
Q

Describe how we can still use the SLLN if we relax the identical-distribution assumption.

A
15
Q

Describe why we cannot use Khinchin’s Theorem in multiple regression with deterministic z_i.

A
16
Q

Describe why we cannot use Chebyshev’s Theorem (Th 7) in multiple regression with deterministic z_i.

A
17
Q

Show the consistency of beta hat in multiple regression with deterministic z_i.

A
18
Q

Formally define a generalized linear process.

A
19
Q

Clarify in words the difference (and relationship) between a linear process and a generalized linear process.

A

In a linear process, we assume the innovations e_i have finite second moments and are uncorrelated.

In a GLP, we only assume that the e_i are UI. Every linear process is therefore also a GLP, since bounded second moments imply uniform integrability.
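
A hedged sketch in symbols (the lecture's exact coefficient conditions may differ): both processes take the moving-average form

```latex
X_i = \sum_{j=0}^{\infty} c_j \, e_{i-j},
\qquad \sum_{j=0}^{\infty} \lvert c_j \rvert < \infty .
% Linear process: \mathbb{E}[e_i] = 0, \ \sup_i \mathbb{E}[e_i^2] < \infty,
%                 and \mathrm{Cov}(e_i, e_j) = 0 for i \neq j.
% GLP:            the e_i are only assumed uniformly integrable (with mean 0).
```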

20
Q

Formally state the theorem regarding the convergence in first mean of GLP.

A
21
Q

Formally state and prove the theorem regarding the convergence in first mean of GLP.

A
22
Q

Give an example of regressors z_i such that Q_n / n does not have a finite limit.

A
23
Q

State the theorem relating the convergence of Beta hat in second mean and the eigenvalues of the Q matrix.

A
24
Q

State and prove the theorem relating the convergence of Beta hat in second mean and the eigenvalues of the Q matrix.

A
25
Q

Relate the min eigenvalue of a matrix to its elements.

A
26
Q

State another condition that suffices for the consistency of Beta hat.

A
27
Q

State and prove another condition that suffices for the consistency of Beta hat.

A
28
Q

Describe the relationship between the second moment and probability, in the cases where the second moment does and does not exist.

A
29
Q

Relate the relationship between second moment and probability to the two different conditions for Beta hat consistency in LSE.

A
30
Q

Show that the LSE in y_i = alpha + beta x_i + u_i is consistent.

A
31
Q

Give an example of a linear regression that is not consistent and show why it is not consistent.

A
32
Q

Define O(f_n) and o(f_n).

A
33
Q

Define Op(f_n) and op(f_n).

A
34
Q

Give a simpler definition of op(1).

A
35
Q

State two relationships between convergence in probability to a constant and stochastic order of magnitude.

A
36
Q

State and prove the two relationships between convergence in probability to a constant and stochastic order of magnitude.

A
37
Q

Provide a more intuitive definition of the difference between Op() and op().

A
38
Q

State the three results relating different stochastic orders of magnitude.

A
39
Q

What is the direction of implication between Op(fn) and op(fn)? Prove it.

A
40
Q

State and prove the relationship between stochastic orders of magnitude relating fn and gn if fn/gn goes to 0.

A
41
Q

State and prove the relationship between stochastic order of magnitude and rate of convergence provided a moment exists.

A
42
Q

State the 4 results relating the stochastic orders of magnitude of combinations of RVs.

A
43
Q

State and prove the stochastic order of magnitude of the product of two RVs. (for big and little o)

A
44
Q

State and prove the stochastic order of magnitude of the sum of two RVs. (for big and little o)

A
45
Q

State and prove the relationship between the product of RVs that are Op(fn) and op(gn).

A
46
Q

What is the rate of convergence of Beta hat in LSE? Show it.

A