Chapter 2 Flashcards

1
Q

Define a stochastic process or random process

A

A collection of random variables X(t) defined on a common probability space, indexed by the elements of a parameter set T (time). The set of all possible values of X(t) is called the state space.
X is a function of two variables, t and omega.
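
Spelled out formally (a standard way of writing this, not taken from the card itself), the two-variable nature of X can be displayed as a single map:

X : T \times \Omega \to S, \qquad (t, \omega) \mapsto X(t, \omega)

Fixing t gives a random variable X(t, \cdot) on \Omega; fixing \omega gives a function of t, which is the trajectory discussed in card 2.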

2
Q

Explain the trajectory of a random process

A

The realisation, sample function or sample path of the random process: for a fixed value of omega, the function X(t) over all t is the trajectory.
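
A minimal Python sketch (illustrative only; the process, function names and parameters are my own, not from the card): fixing omega, here a single draw of a random phase U, pins down one whole trajectory of the process X(t) = sin(t + U).

import numpy as np

t = np.linspace(0.0, 10.0, 200)          # common time grid for every trajectory

def trajectory(rng):
    # For one fixed omega (one draw of the random phase U), return X(t) = sin(t + U) over all t
    U = rng.uniform(0.0, 2.0 * np.pi)
    return np.sin(t + U)

# Two different omegas give two different trajectories of the same process
path_1 = trajectory(np.random.default_rng(1))
path_2 = trajectory(np.random.default_rng(2))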

3
Q

Give an example of a stochastic process with a continuous state space and continuous time

A

The euro/dollar exchange rate over time

4
Q

Give an example of a stochastic process with a continuous state space and discrete time

A

Annual inflation rates

5
Q

Give an example of a stochastic process with a discrete state space and discrete time

A

The number of earthquakes recorded in each year

6
Q

If two random variables X and Y are independent, what can be said about their conditional expectation?

A

E(X|Y) = E(X) if X and Y are independent.
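
A one-line check in the discrete case (my own sketch of the standard argument):

E[X \mid Y = y] = \sum_x x\,P(X = x \mid Y = y) = \sum_x x\,P(X = x) = E[X],

since independence means P(X = x | Y = y) = P(X = x) for every y.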

7
Q

When do two RVs have the same probability distribution?

A

If and only if E[f(X)] = E[f(Y)] for every bounded measurable function f; we then write X =^d Y (X and Y are equal in distribution).

8
Q

When is a stochastic process said to be a white noise

A

X(t) is a white noise if all the X(t) are iid; in discrete time we use the notation Xn.
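
A minimal simulation sketch (assuming Gaussian white noise for concreteness; any iid distribution would do, and the variable names are my own):

import numpy as np

rng = np.random.default_rng(0)
X = rng.normal(loc=0.0, scale=1.0, size=1000)  # X_1, ..., X_1000 iid N(0, 1): a white noise

# iid implies the sample mean and variance look roughly the same on any sub-block
print(X[:500].mean(), X[500:].mean())
print(X[:500].var(), X[500:].var())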

9
Q

How do we know when a white noise Xn is symmetric?

A

The distribution of Xn equals the distribution of -Xn

10
Q

What are the increments (steps) of a random walk?

A

The Xi terms whose partial sums give the random walk: Sn = X1 + X2 + ... + Xn.
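
A short sketch of building the walk from its increments (illustrative only; variable names are my own):

import numpy as np

rng = np.random.default_rng(0)
X = rng.choice([-1, 1], size=100)          # the increments X_1, ..., X_100
S = np.concatenate(([0], np.cumsum(X)))    # S_0 = 0, S_n = X_1 + ... + X_n

# Recovering the increments from the walk: X_n = S_n - S_{n-1}
assert np.array_equal(np.diff(S), X)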

11
Q

If a random walk is not a symmetric random walk what can it be called?

A

Biased

12
Q

What does Sn/n tend to as n tends to infinity when Sn is a random walk

A

It tends to E(Xi) = mu, by the law of large numbers.
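
A quick numerical illustration (a sketch, assuming +/-1 steps with p = 0.6 so that mu = 0.2; the parameter choice is my own):

import numpy as np

rng = np.random.default_rng(0)
p = 0.6                                     # P(X_i = 1), so mu = p - (1 - p) = 0.2
X = rng.choice([1, -1], size=100_000, p=[p, 1 - p])
S = np.cumsum(X)

for n in (100, 1_000, 100_000):
    print(n, S[n - 1] / n)                  # S_n / n settles near mu = 0.2 as n grows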

13
Q

Why is the central limit theorem a stronger result than the law of large numbers?

A

It deals with convergence of the whole distribution (after rescaling), not just convergence to a central value.
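
For the record, the standard statement (assuming the Xi are iid with mean mu and finite variance sigma^2):

\frac{S_n - n\mu}{\sigma\sqrt{n}} \xrightarrow{d} N(0, 1) \quad \text{as } n \to \infty,

i.e. the rescaled sum converges in distribution to the standard normal, whereas the law of large numbers only gives S_n / n \to \mu.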

14
Q

Describe the gambler's ruin game set-up

A

Each player has a starting capital. At each round, if the RV Xi equals 1, player S wins 1 unit; if Xi equals -1, player F wins 1 unit. The game continues until one party loses all their capital.
Sn = X1 + X2 + ... models the amount won/lost by a player.
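
A minimal gambler's-ruin simulation sketch (names and parameters are my own: a and b are the two starting capitals, p the chance the first player wins a round):

import numpy as np

def gamblers_ruin(a, b, p, rng):
    # Play until one player is ruined; return the first player's final winnings S_n
    s = 0
    while -a < s < b:                       # first player ruined at -a, opponent ruined at +b
        s += rng.choice([1, -1], p=[p, 1 - p])
    return s

rng = np.random.default_rng(0)
results = [gamblers_ruin(a=5, b=5, p=0.5, rng=rng) for _ in range(1000)]
print(np.mean([r == 5 for r in results]))   # fraction of games the first player wins everything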

15
Q

Define p

A

p = P(Xi = 1), the probability that a step is +1.

16
Q

Define q

A

q = P(Xi = -1), the probability that a step is -1 (so q = 1 - p).

17
Q

Define a stationary process

A

A process X is stationary if, for any shift t and any times t1, ..., tk, the joint distributions of the random vectors (X(t1), X(t2), ..., X(tk)) and (X(t1+t), X(t2+t), ..., X(tk+t)) are identical. This implies X(r) and X(s) have the same distribution for all r and s, and therefore the expectation and variance of X(t) are constant, not dependent on t.

18
Q

Is a random walk stationary?

A

No, except in the degenerate case where Xi = 0.

19
Q

Is white noise a stationary process?

A

Yes, because all the Xn are iid.

20
Q

What is the difference between the weak and strong stationarity definitions?

A

Weak stationarity says nothing about the joint distributions: it only constrains the first and second moments (constant mean, and covariance depending only on the time lag).
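
Spelled out (the standard definition, under the usual finite-second-moment assumption): a process is weakly (second-order) stationary if

E[X(t)] = \mu \ \text{for all } t, \qquad \operatorname{Cov}\big(X(t), X(t+h)\big) = c(h) \ \text{depends only on the lag } h.

Strong (strict) stationarity instead requires the full joint distributions to be shift-invariant, as in card 17; strict stationarity with finite second moments implies weak stationarity, but not conversely.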

21
Q

Does white noise have independent increments?

A

No, except in the degenerate case where the Xn are not random (constant); in general the increment RVs are dependent.

22
Q

Does random walk process have independent increments?

A

Yes: S1 - S0 = X1, S2 - S1 = X2, etc. are independent.

23
Q

Explain the Markov property

A

A sequence of RVs Zn has the Markov property if for any n:
P(Zn+1 = z | Z0 = z0, ..., Zn = zn) = P(Zn+1 = z | Zn = zn). The next RV in the sequence depends only on the most recent value in the sequence.

24
Q

Does a random walk have the Markov property?

A

Yes
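
A one-line justification (a sketch of the standard argument, with Sn = X1 + ... + Xn and iid steps):

P(S_{n+1} = s \mid S_0 = s_0, \dots, S_n = s_n) = P(X_{n+1} = s - s_n) = P(S_{n+1} = s \mid S_n = s_n),

because X_{n+1} is independent of S_0, ..., S_n.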

25
Q

If a process has the Markov property, what does this mean when we want to find a probability k steps ahead?

A

We only have to condition on the most recent state of the sequence.

26
Q

Is white noise a martingale?

A

No, white noise is not a martingale; it is one only in the degenerate case where Xn is a constant.

27
Q

Is a random walk a martingale?

A

No unless it is an unbiased random walk!
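
The calculation behind this (a sketch, using the independence of X_{n+1} from the past):

E[S_{n+1} \mid S_0, \dots, S_n] = S_n + E[X_{n+1}] = S_n + \mu,

so the random walk is a martingale exactly when mu = E(Xi) = 0, i.e. when it is unbiased (p = q = 1/2 in the +/-1 case).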

28
Q

Define an increasing random process

A

A process such that, for any omega, X(t) is an increasing function of t; i.e. all trajectories are increasing.

29
Q

Define a continuous random process

A

A process such that, for any omega, X(t) is a continuous function of t; i.e. all trajectories are continuous.