Forecasting Flashcards
minimum mean square, model based forecasting
What is the purpose of forecasting?
- given data X1,X2,…,Xn we want to:
- - predict a future value Xn+l, which is l steps ahead, with a forecast Xn(l)
- - have a confidence interval for Xn(l)
- - ensure that the forecast error is minimised
Forecast Notation
Xn(l) is a forecast for the future value Xn+l given data X1,X2,…,Xn
Forecast Error
en(l) = Xn+l - Xn(l)
Minimum Mean Square Error
S = E[ (Xn+l - Xn(l))² ]
Minimum Mean Square Error Forecast
Xn(l) = βl εn + βl+1 εn-1 + βl+2 εn-2 + …
Minimum Mean Square Error Forecast
Steps
-write Xt as an MA model
-express Xn+l in this form
-suppose Xn(l) can be written as a linear combination of the innovations {εt} up to time n:
Xn(l) = β0′ εn + β1′ εn-1 + …
-calculate the expectation of the square error, recalling that E(X²)=Var(X) when E(X)=0
-minimise this to get the forecast
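The steps above can be sketched numerically. This is an assumed AR(1) example (parameter a, seed, and sample size all illustrative): the MA(∞) weights of Xt = a Xt-1 + εt are βk = a^k, so the MMSE forecast Xn(l) = Σk βl+k εn-k should collapse to the closed form a^l Xn.

```python
import numpy as np

# Sketch (assumed AR(1) example): the MA(inf) weights of X_t = a X_{t-1} + eps_t
# are beta_k = a**k, so the MMSE forecast Xn(l) = sum_k beta_{l+k} eps_{n-k}
# should collapse to the closed form a**l * X_n.
rng = np.random.default_rng(0)
a, n, l = 0.6, 500, 3
eps = rng.normal(size=n)                   # innovations, newest last
beta = a ** np.arange(n)                   # beta_k = a^k

x_n = np.dot(beta, eps[::-1])              # X_n = sum_k beta_k eps_{n-k} (truncated)
forecast = np.dot(a ** (l + np.arange(n)), eps[::-1])   # Xn(l) from the MA weights

print(abs(forecast - a**l * x_n) < 1e-10)  # agrees with the closed form
```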
Updating Forecasts
Description
- given data up to time n, we would predict Xn+l with Xn(l)
- one time step later when we now know Xn+1 we would predict Xn+l with Xn+1(l-1)
Updating Forecasts
Relationship Between Forecasts
-can show that, with β0=1:
Xn+1(l-1) = Xn(l) + βl-1 [Xn+1 - Xn(1)]
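The update can be checked on an assumed AR(1) example (all numbers illustrative), where βk = a^k, Xn(l) = a^l Xn, and the one-step forecast is Xn(1) = a Xn:

```python
# Sketch (assumed AR(1) example, beta_k = a**k, beta_0 = 1): check the update
# Xn+1(l-1) = Xn(l) + beta_{l-1} [Xn+1 - Xn(1)] against the direct forecast.
a, l = 0.6, 4
x_n = 1.7
eps_n1 = 0.3
x_n1 = a * x_n + eps_n1                        # observe X_{n+1}

old = a**l * x_n                               # Xn(l) = a^l X_n
updated = old + a**(l - 1) * (x_n1 - a * x_n)  # update using Xn(1) = a X_n
direct = a**(l - 1) * x_n1                     # Xn+1(l-1) computed from scratch

print(abs(updated - direct) < 1e-12)
```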
Estimating Residuals for AR(p) Processes
-for an AR(p) process: Xt = Σ αk Xt-k + εt
-sum from k=1 to k=p
-the residuals satisfy: et = Xt - Σ αk^ Xt-k, for t ≥ p+1
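A minimal sketch of this residual computation, with illustrative data and fitted coefficients (all values assumed):

```python
import numpy as np

# Sketch (assumed data and fitted coefficients): AR(p) residuals
# e_t = X_t - sum_{k=1}^{p} alpha_hat_k * X_{t-k}, defined for t >= p+1.
def ar_residuals(x, alpha_hat):
    p = len(alpha_hat)
    return np.array([x[t] - sum(alpha_hat[k] * x[t - 1 - k] for k in range(p))
                     for t in range(p, len(x))])

x = np.array([1.0, 0.5, 0.8, 0.2, -0.1])
e = ar_residuals(x, [0.4, 0.1])   # AR(2) with alpha1_hat=0.4, alpha2_hat=0.1
# first residual: X_3 - 0.4*X_2 - 0.1*X_1 = 0.8 - 0.2 - 0.1
```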
Estimating Residuals for MA(q) Processes
-for an MA(q) process: Xt = εt + Σ βk εt-k
-sum from k=1 to k=q
-the residuals are computed recursively:
e1 = X1
e2 = X2 - β1^ e1
en = Xn - Σ βi^ en-i
-sum from i=1 to i=min(q, n-1), for n ≥ 2
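The recursion can be sketched as follows, with an assumed MA(1) example (data and fitted coefficient illustrative):

```python
import numpy as np

# Sketch (assumed data and fitted coefficient): recursive MA(q) residuals,
# e_n = X_n - sum_{i=1}^{min(q, n-1)} beta_hat_i * e_{n-i}, with e_1 = X_1.
def ma_residuals(x, beta_hat):
    q = len(beta_hat)
    e = []
    for n, xn in enumerate(x):
        correction = sum(beta_hat[i] * e[n - 1 - i] for i in range(min(q, n)))
        e.append(xn - correction)
    return np.array(e)

e = ma_residuals([0.5, 0.3, -0.2], [0.4])   # MA(1), beta1_hat = 0.4
# e_1 = 0.5, e_2 = 0.3 - 0.4*0.5 = 0.1, e_3 = -0.2 - 0.4*0.1 = -0.24
```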
Minimum Mean Square Error
Error as an MA Process
en(l) = β0 εn+l + β1 εn+l-1 + … + βl-1 εn+1
- an MA(l-1) process in the future innovations εn+1,…,εn+l
- in particular the one-step errors en(1) = εn+1 are uncorrelated
Minimum Mean Square Error
E[en(l)]
-since E[εt]=0 and en(l) can be written as an MA(l-1) process in terms of εt it follows that:
E[en(l)] = 0
Minimum Mean Square Error
Var[en(l)]
Var[en(l)] = σε² Σ βk²
-sum from k=0 to k=l-1
Minimum Mean Square Error
Cov[en(l), εn-k]
Cov[en(l), εn-k] = 0
- for all k≥0
- the forecast error is uncorrelated with the forecast itself
Minimum Mean Square Error
Cov[en(l), en(l+m)]
Cov[en(l), en(l+m)]
= σε² Σ βk βk+m
-sum from k=0 to k=l-1, valid for m>0
Minimum Mean Square Error
Relationship Between Xn(1) and Xn for AR(1) Processes
Xn(1) = α Xn
Minimum Mean Square Error
Relationship Between Xn(l) and Xn for AR(1) Processes
Xn(l) = α^l Xn
Forecast Error as Conditional Mean
Description
-we can also think of the forecast Xn(l) as the mean of Xn+l given known values X1,…,Xn:
E[Xn+l | X1,…,Xn]
Forecast Error as Conditional Mean
E[εt | X1,…,Xn]
E[εt | X1,…,Xn] = {0 for t>n, εt for t≤n
-future innovations are unknown (so we take their mean, zero) and past innovations are determined by the data
Forecast Error as Conditional Mean
E[Xn+l | X1,…,Xn]
E[Xn+l | X1,…,Xn] = Xn(l)
Forecast Error as Conditional Mean
Var[Xn+l | X1,…,Xn]
Var[Xn+l | X1,…,Xn] = Var[en(l)]
Forecast Error as Conditional Mean
Confidence Interval
-assuming Gaussian white noise, the approximate 95% prediction interval limits are:
Xn(l) ± 1.96 √[(β0² + β1² + … + βl-1²) σε²]
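This interval can be computed directly. An assumed AR(1) example (all numbers illustrative), using the AR(1) MA-weights βk = a^k:

```python
import numpy as np

# Sketch (assumed AR(1) example): 95% prediction interval
# Xn(l) +/- 1.96 * sqrt((beta_0^2 + ... + beta_{l-1}^2) * sigma_eps^2),
# with the AR(1) MA-weights beta_k = a**k (all numbers illustrative).
a, sigma_eps, x_n, l = 0.6, 1.0, 2.0, 3
forecast = a**l * x_n
half_width = 1.96 * np.sqrt(np.sum(a ** (2 * np.arange(l))) * sigma_eps**2)
lower, upper = forecast - half_width, forecast + half_width
```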
Forecast Error as Conditional Mean
E[Xt | X1,…,Xn]
E[Xt | X1,…,Xn] = Xn(t-n) if t>n, or Xt if t≤n
Model Based Forecasting
Description
-an alternative approach to forecasting is to write out the model equation for times after the last observation
Model Based Forecasting
Steps
-imagine having data X1,…,Xn for times t=1,…,n
-assume that a model has been fitted; then Xn+1 can be written in the form of the model, where:
- - the data Xn, Xn-1, etc. are known
- - the parameters α, β, etc. are estimated
- - the value of εt for t>n is not known, but the variance σε² can be estimated from the data
-can calculate the conditional expectation and variance of Xn+l using the known data
-if Xn+l is normally distributed then the 95% confidence interval for the estimate is:
E[Xn+l | X1,…,Xn] ± 1.96 √(Var[Xn+l | X1,…,Xn])
Model Based Forecasting
AR(2) - Xn+1
Xn+1 = α1 Xn + α2 Xn-1 + εn+1
Model Based Forecasting
AR(2) - E[Xn+1]
E[Xn+1 | Xn,Xn-1] = α1 Xn + α2 Xn-1
Model Based Forecasting
AR(2) - Var[Xn+1]
Var[Xn+1 | Xn,Xn-1] = σε²
Model Based Forecasting
AR(2) - CI
α1^ Xn + α2^ Xn-1 ± 1.96 √(σε^²)
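A minimal sketch of this one-step interval, with assumed fitted values (all numbers illustrative):

```python
# Sketch (assumed fitted values, all numbers illustrative): one-step 95% CI
# for an AR(2) model, alpha1_hat*Xn + alpha2_hat*Xn-1 +/- 1.96*sigma_eps_hat.
a1_hat, a2_hat, sigma_hat = 0.5, -0.2, 0.9
x_n, x_nm1 = 1.2, 0.4
point = a1_hat * x_n + a2_hat * x_nm1        # 0.6 - 0.08 = 0.52
lower, upper = point - 1.96 * sigma_hat, point + 1.96 * sigma_hat
```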
Model Based Forecasting
AR(1) - E
E[Xn+1 | Xn] = α Xn
E[Xn+2 | Xn] = α² Xn
…
E[Xn+l | Xn] = α^l Xn
Model Based Forecasting
AR(1) - Var
Var[Xn+1 | Xn] = σε²
Var[Xn+2 | Xn] = (1+α²) σε²
…
Var[Xn+l | Xn] = (1+α²+…+α^(2(l-1))) σε²
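The AR(1) mean and variance recursions above can be sketched in one helper (fitted values a_hat and sigma2_hat are assumed, all numbers illustrative):

```python
import numpy as np

# Sketch (assumed fitted values): model-based l-step forecast for an AR(1)
# process: E[Xn+l | Xn] = a^l Xn and
# Var[Xn+l | Xn] = (1 + a^2 + ... + a^(2(l-1))) * sigma_eps^2.
def ar1_forecast(x_n, a_hat, sigma2_hat, l):
    mean = a_hat**l * x_n
    var = sigma2_hat * np.sum(a_hat ** (2 * np.arange(l)))
    return mean, var

mean, var = ar1_forecast(x_n=2.0, a_hat=0.5, sigma2_hat=1.0, l=2)
# mean = 0.25 * 2 = 0.5, var = (1 + 0.25) * 1 = 1.25
```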