Forecasting Flashcards

minimum mean square error forecasting, model-based forecasting

1
Q

What is the purpose of forecasting?

A
  • given data X1,X2,…,Xn we want to:
  • predict a future value Xn+l, which is l steps ahead, with a forecast Xn(l)
  • construct a confidence interval for Xn(l)
  • ensure that the forecast error is minimised
2
Q

Forecast Notation

A

Xn(l) is a forecast for the future value Xn+l given data X1,X2,…,Xn

3
Q

Forecast Error

A

en(l) = Xn+l - Xn(l)

4
Q

Minimum Mean Square Error

A

S = E[ (Xn+l - Xn(l))² ]

5
Q

Minimum Mean Square Error Forecast

A

Xn(l) = βl εn + βl+1 εn-1 + βl+2 εn-2 + …

6
Q

Minimum Mean Square Error Forecast

Steps

A

-write Xt as an MA(∞) model
-express Xn+l in this form
-suppose Xn(l) can be written as a linear combination of the innovations {εt} up to time n:
Xn(l) = β0' εn + β1' εn-1 + …
-calculate the expectation of the squared error, recalling that E(X²)=Var(X) when E(X)=0
-minimise this with respect to the weights βk' to get the forecast
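A minimal Python sketch of these steps for an AR(1) process, whose MA(∞) weights are βk = α^k; the coefficient value and the simulated innovations below are illustrative assumptions:

import numpy as np

# sketch: MMSE forecast for an AR(1) written in MA(infinity) form,
# X_t = sum_k beta_k * eps_{t-k} with beta_k = alpha**k (illustrative alpha)
alpha = 0.6
beta = alpha ** np.arange(50)            # truncated MA(infinity) weights

rng = np.random.default_rng(0)
eps = rng.normal(size=200)               # innovations eps_1, ..., eps_n
X = np.array([sum(beta[k] * eps[t - k] for k in range(min(t + 1, len(beta))))
              for t in range(len(eps))]) # X_t built from the truncated MA form

def mmse_forecast(eps, l):
    # X_n(l) = beta_l*eps_n + beta_{l+1}*eps_{n-1} + ...
    n = len(eps)
    return sum(beta[l + j] * eps[n - 1 - j] for j in range(min(n, len(beta) - l)))

print(mmse_forecast(eps, l=1))           # one-step forecast
print(alpha * X[-1])                     # for AR(1) this is (approximately) alpha * X_n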

7
Q

Updating Forecasts

Description

A
  • given data up to time n, we would predict Xn+l with Xn(l)
  • one time step later, when Xn+1 is also known, we would predict Xn+l with Xn+1(l-1)

8
Q

Updating Forecasts

Relationship Between Forecasts

A

-it can be shown that, with β0=1:

Xn+1(l-1) = Xn(l) + βl-1 [Xn+1 - Xn(1)]

-where Xn+1 - Xn(1) = εn+1 is the one-step forecast error
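A minimal Python check of this update formula under the same AR(1) assumption (βk = α^k, β0 = 1); all numerical values are illustrative:

import numpy as np

alpha = 0.6
beta = alpha ** np.arange(50)            # MA weights with beta_0 = 1

rng = np.random.default_rng(1)
eps = rng.normal(size=201)               # innovations up to time n+1

def forecast(eps_upto, l):
    # X_m(l) = beta_l*eps_m + beta_{l+1}*eps_{m-1} + ... (truncated)
    m = len(eps_upto)
    return sum(beta[l + j] * eps_upto[m - 1 - j] for j in range(min(m, len(beta) - l)))

l = 3
X_n_l = forecast(eps[:-1], l)            # X_n(l)
X_n_1 = forecast(eps[:-1], 1)            # X_n(1)
X_np1 = X_n_1 + eps[-1]                  # X_{n+1} = X_n(1) + eps_{n+1}
X_n1_lm1 = forecast(eps, l - 1)          # X_{n+1}(l-1)

# update formula: X_{n+1}(l-1) = X_n(l) + beta_{l-1} * (X_{n+1} - X_n(1))
print(np.isclose(X_n1_lm1, X_n_l + beta[l - 1] * (X_np1 - X_n_1)))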

9
Q

Estimating Residuals for AR(p) Processes

A
-for an AR(p) process:
Xt = Σ αk Xt-k + εt
-the residuals satisfy:
et = Xt - Σ αk^ Xt-k, for t > p
-sums from k=1 to k=p
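A minimal Python sketch of these AR(p) residuals; the estimated coefficients and the series below are illustrative assumptions only:

import numpy as np

alpha_hat = np.array([0.5, -0.2])        # assumed AR(2) coefficient estimates
p = len(alpha_hat)

rng = np.random.default_rng(2)
X = rng.normal(size=100)                 # stand-in for the observed series X_1,...,X_n

# e_t = X_t - sum_{k=1}^{p} alpha_hat_k * X_{t-k}, computable for t > p
residuals = np.array([X[t] - sum(alpha_hat[k] * X[t - 1 - k] for k in range(p))
                      for t in range(p, len(X))])
print(residuals[:5])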
10
Q

Estimating Residuals for MA(q) Processes

A
-for an MA(q) process:
Xt = εt + Σ βk εt-k
-sum from k=1 to k=q
-the residuals are computed recursively:
e1 = X1
e2 = X2 - β1^ e1
en = Xn - Σ βi^ en-i
-sum from i=1 to i=min(q, n-1), for n≥2
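A minimal Python sketch of this recursion; the estimated MA coefficients and the series are illustrative assumptions only:

import numpy as np

beta_hat = np.array([0.4, 0.3])          # assumed MA(2) coefficient estimates
q = len(beta_hat)

rng = np.random.default_rng(3)
X = rng.normal(size=100)                 # stand-in for the observed series X_1,...,X_n

e = np.zeros(len(X))
for t in range(1, len(X) + 1):           # t is 1-indexed time, as on the card
    # e_t = X_t - sum_{i=1}^{min(q, t-1)} beta_hat_i * e_{t-i}
    correction = sum(beta_hat[i - 1] * e[t - i - 1] for i in range(1, min(q, t - 1) + 1))
    e[t - 1] = X[t - 1] - correction
print(e[:5])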
11
Q

Minimum Mean Square Error

Error as an MA Process

A

en(l) = β0 εn+l + β1 εn+l-1 + … + βl-1 εn+1

  • an MA(l-1) process in the future innovations εn+1,…,εn+l
  • this means that forecast errors from origins at least l steps apart are uncorrelated; in particular the one-step errors en(1) = εn+1 are uncorrelated
12
Q

Minimum Mean Square Error

E[en(l)]

A

-since E[εt]=0 and en(l) can be written as an MA(l-1) process in terms of εt it follows that:
E[en(l)] = 0

13
Q

Minimum Mean Square Error

Var[en(l)]

A

Var[en(l)] = σε² Σ βk²

-sum from k=0 to k=l-1
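A minimal Python sketch of this variance formula, assuming the MA weights and σε² are already available (the values are illustrative):

import numpy as np

sigma_eps2 = 1.5                         # assumed innovation variance
beta = 0.6 ** np.arange(50)              # assumed MA weights (e.g. AR(1) with alpha = 0.6)

def error_variance(l):
    # Var[e_n(l)] = sigma_eps^2 * (beta_0^2 + ... + beta_{l-1}^2)
    return sigma_eps2 * np.sum(beta[:l] ** 2)

print([error_variance(l) for l in (1, 2, 5)])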

14
Q

Minimum Mean Square Error

Cov[en(l), εn-k]

A

Cov[en(l), εn-k] = 0

  • for all k≥0
  • since Xn(l) is a linear combination of εn, εn-1, …, this means the forecast error is uncorrelated with the forecast itself
15
Q

Minimum Mean Square Error

Cov[en(l), en(l+m)]

A

Cov[en(l), en(l+m)]
= σε² Σ βk βk+m
-sum from k=0 to k=l-1, valid for m>0

16
Q

Minimum Mean Square Error

Relationship Between Xn(1) and Xn for AR(1) Processes

A

Xn(1) = α Xn

17
Q

Minimum Mean Square Error

Relationship Between Xn(l) and Xn for AR(1) Processes

A

Xn(l) = α^l Xn
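This follows from the MA(∞) form: for an AR(1) process βk = α^k, so
Xn(l) = βl εn + βl+1 εn-1 + … = α^l (εn + α εn-1 + …) = α^l Xn,
which is the same as applying Xn(1) = α Xn repeatedly, l times.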

18
Q

Forecast as Conditional Mean

Description

A

-we can also think of the forecast Xn(l) as the mean of Xn+l given known values X1,…,Xn:
E[Xn+l | X1,…,Xn]

19
Q

Forecast as Conditional Mean

E[εt | X1,…,Xn]

A

E[εt | X1,…,Xn] = 0 for t>n, εt for t≤n

-future innovations are not known, so their conditional mean is zero, while past innovations are known (determined by the data)

20
Q

Forecast as Conditional Mean

E[Xn+l | X1,…,Xn]

A

E[Xn+l | X1,…,Xn] = Xn(l)

21
Q

Forecast as Conditional Mean

Var[Xn+l | X1,…,Xn]

A

Var[Xn+l | X1,…,Xn] = Var[en(l)]

22
Q

Forecast as Conditional Mean

Confidence Interval

A

-assuming Gaussian white noise, the approximate 95% prediction interval limits are:
Xn(l) ± 1.96 √[(β0² + β1² + … + βl-1²) σε²]
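A minimal Python sketch of this interval, assuming Gaussian white noise and that the forecast, the MA weights and σε² are already available (all values are illustrative):

import numpy as np

forecast = 2.3                           # assumed value of X_n(l)
sigma_eps2 = 1.5                         # assumed innovation variance
beta = 0.6 ** np.arange(50)              # assumed MA weights
l = 3

# X_n(l) +/- 1.96 * sqrt((beta_0^2 + ... + beta_{l-1}^2) * sigma_eps^2)
half_width = 1.96 * np.sqrt(np.sum(beta[:l] ** 2) * sigma_eps2)
print(forecast - half_width, forecast + half_width)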

23
Q

Forecast as Conditional Mean

E[Xt | X1,…,Xn]

A

E[Xt | X1,…,Xn] = Xn(t-n) if t>n, or Xt if t≤n

24
Q

Model Based Forecasting

Description

A

-an alternative approach to forecasting is to consider the model for time t after the last observation

25
Q

Model Based Forecasting

Steps

A

-imagine having data X1,…,Xn for times t=1,…,n
-assume that a model has been fitted; then Xn+1 can be written in the format of the model, where the data Xn, Xn-1, etc. are known, the parameters α, β, etc. are estimated, and the values of εt for t>n are not known but the variance σε² can be estimated from the data
-calculate the conditional expectation and variance of Xn+l using the known data
-if Xn+l is normally distributed then the 95% confidence interval for the estimate is:
E[Xn+l | X1,…,Xn] ± 1.96 √(Var[Xn+l | X1,…,Xn])

26
Q

Model Based Forecasting

AR(2) - Xn+1

A

Xn+1 = α1 Xn + α2 Xn-1 + εn+1

27
Q

Model Based Forecasting

AR(2) - E[Xn+1]

A

E[Xn+1 | Xn,Xn-1] = α1 Xn + α2 Xn-1

28
Q

Model Based Forecasting

AR(2) - Var[Xn+1]

A

Var[Xn+1 | Xn, Xn-1] = σε²

29
Q

Model Based Forecasting

AR(2) - CI

A

α1^ Xn + α2^ Xn-1 ± 1.96 σε^
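A minimal Python sketch of the AR(2) one-step forecast and its 95% interval; the fitted values and the series are illustrative assumptions only:

import numpy as np

alpha1_hat, alpha2_hat = 0.5, -0.2       # assumed AR(2) coefficient estimates
sigma_eps_hat = 1.2                      # assumed estimate of the innovation standard deviation

rng = np.random.default_rng(4)
X = rng.normal(size=100)                 # stand-in for the observed series X_1,...,X_n

point = alpha1_hat * X[-1] + alpha2_hat * X[-2]   # E[X_{n+1} | X_n, X_{n-1}]
half_width = 1.96 * sigma_eps_hat                 # one-step Var is sigma_eps^2
print(point - half_width, point + half_width)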

30
Q

Model Based Forecasting

AR(1) - E

A

E[Xn+1] = α Xn
E[Xn+2] = α² Xn

E[Xn+l] = α^l Xn

31
Q

Model Based Forecasting

AR(1) - Var

A

Var[Xn+1] = σε²
Var[Xn+2] = (1+α²) σε²

Var[Xn+l] = (1+α²+…+α^(2(l-1))) σε²
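A minimal Python sketch tying the AR(1) results together: the l-step forecast α^l Xn, the variance (1 + α² + … + α^(2(l-1))) σε², and the resulting 95% interval; all numerical values are illustrative assumptions:

import numpy as np

alpha = 0.6                              # assumed AR(1) coefficient
sigma_eps2 = 1.0                         # assumed innovation variance

rng = np.random.default_rng(5)
X = rng.normal(size=100)                 # stand-in for the observed series

def ar1_forecast_interval(l):
    point = alpha ** l * X[-1]                               # alpha^l * X_n
    var = sigma_eps2 * np.sum(alpha ** (2 * np.arange(l)))   # (1 + alpha^2 + ...) * sigma^2
    half = 1.96 * np.sqrt(var)
    return point - half, point, point + half

for l in (1, 2, 5):
    print(l, ar1_forecast_interval(l))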