
# Impulse Response Functions, Forecasting with AR and VAR Models

By Evelyn Gomez, 2014-04-03 11:41

Applications of AR models:

- Impulse response functions
- Forecasting

Consider the first-order autoregressive model:

$$y_t = a_0 + a_1 y_{t-1} + \varepsilon_t$$

where $\varepsilon_t$ is a white noise sequence and the stationarity condition, $|a_1| < 1$, is satisfied.

Then

$$E(y_t) = a_0/(1 - a_1)$$

$$\operatorname{Var}(y_t) = \sigma^2/(1 - a_1^2), \qquad \sigma^2 = \operatorname{Var}(\varepsilon_t)$$

$$\operatorname{Corr}(y_t, y_{t-s}) = a_1^{|s|}$$
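As a quick check, these moments can be verified by simulation. The sketch below assumes NumPy is available; the values of `a0`, `a1`, and `sigma` are illustrative, not from the notes.

```python
import numpy as np

# Simulate a stationary AR(1), y_t = a0 + a1*y_{t-1} + eps_t, and compare
# sample moments with the theoretical ones. a0, a1, sigma are illustrative.
rng = np.random.default_rng(0)
a0, a1, sigma = 1.0, 0.6, 1.0
n = 200_000

eps = rng.normal(0.0, sigma, n)
y = np.empty(n)
y[0] = a0 / (1 - a1)                        # start at the unconditional mean
for t in range(1, n):
    y[t] = a0 + a1 * y[t - 1] + eps[t]

print(y.mean(), a0 / (1 - a1))              # E(y_t)   = a0/(1-a1)    = 2.5
print(y.var(), sigma**2 / (1 - a1**2))      # Var(y_t) = s^2/(1-a1^2) = 1.5625
print(np.corrcoef(y[1:], y[:-1])[0, 1])     # Corr(y_t, y_{t-1})      = a1
```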

The OLS estimator is a consistent and asymptotically normal estimator of the $a$'s.

If, in addition, the $\varepsilon$'s are conditionally homoskedastic, the OLS estimator is asymptotically efficient, and for large samples the model can be treated as a standard normal linear regression model for inference purposes.

The impulse response function,

$$g(s) = \partial y_{t+s} / \partial \varepsilon_t, \qquad s = 0, 1, 2, \ldots$$

specifies the effect of an innovation in period $t$ on $y$, $s$ periods forward.

Note that for the AR(1) model,

$$g(s) = a_1^s$$

$$[\, y_t = a_0 + a_1 y_{t-1} + \varepsilon_t;$$

$$y_{t+1} = a_0 + a_1 y_t + \varepsilon_{t+1} = a_0 + a_1(a_0 + a_1 y_{t-1} + \varepsilon_t) + \varepsilon_{t+1}; \ldots \,]$$

Note too that the sequence $g(0), g(1), \ldots$ is also the sequence of coefficients in the Wold MA representation of $y_t$.
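For concreteness, the AR(1) impulse response can be computed either from the closed form $g(s) = a_1^s$ or from the recursion $g(0) = 1$, $g(s) = a_1 g(s-1)$; both give the same sequence. A short sketch, with `a1` an illustrative value:

```python
# Impulse response of an AR(1): closed form a1**s vs. the recursion
# g(0) = 1, g(s) = a1 * g(s-1). The value of a1 is illustrative.
a1 = 0.6
horizon = 10

g_closed = [a1**s for s in range(horizon)]

g_rec = [1.0]
for s in range(1, horizon):
    g_rec.append(a1 * g_rec[-1])

# g_closed[:3] is [1.0, 0.6, 0.36] up to floating point,
# and g_rec matches term by term
print(g_closed[:3])
```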

The shape of the impulse response function depends on whether $a_1 > 0$ or $a_1 < 0$, but in either case, since $|a_1| < 1$,

$$\lim_{s \to \infty} g(s) = 0.$$

This is a characteristic of an ergodic stationary process: "weak memory" or "weak dependence."

The $s$-step ahead forecast of $y$ formed at time $t$ is

$$\hat{y}_{t+s,t} = (1 + a_1 + \cdots + a_1^{s-1}) a_0 + a_1^s y_t$$

----

$$y_{t+1} = a_0 + a_1 y_t + \varepsilon_{t+1},$$

so

$$\hat{y}_{t+1,t} = a_0 + a_1 y_t;$$

$$y_{t+2} = a_0 + a_1 y_{t+1} + \varepsilon_{t+2},$$

so

$$\hat{y}_{t+2,t} = a_0 + a_1 \hat{y}_{t+1,t} = (1 + a_1) a_0 + a_1^2 y_t,$$

and so on.

----

Note that since $|a_1| < 1$,

$$\lim_{s \to \infty} \hat{y}_{t+s,t} = (1 + a_1 + a_1^2 + \cdots) a_0 = a_0/(1 - a_1) = E(y_t).$$

The $s$-step ahead forecast converges to the unconditional mean as $s \to \infty$. (This will apply to any stationary process.)
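The convergence is easy to see numerically. A minimal sketch, with illustrative values of `a0`, `a1`, and the current observation `y_t`:

```python
# s-step ahead AR(1) forecast: (1 + a1 + ... + a1**(s-1))*a0 + a1**s * y_t.
# a0, a1, y_t are illustrative; the unconditional mean is a0/(1-a1) = 2.5.
a0, a1, y_t = 1.0, 0.6, 10.0

def forecast(s):
    return sum(a1**i for i in range(s)) * a0 + a1**s * y_t

for s in (1, 5, 20, 50):
    print(s, forecast(s))   # converges toward 2.5 as s grows
```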

Consider the sequence of $s$-step ahead forecast errors:

$$f_{1,t} = y_{t+1} - \hat{y}_{t+1,t} = \varepsilon_{t+1}$$

$$f_{2,t} = y_{t+2} - \hat{y}_{t+2,t} = \varepsilon_{t+2} + a_1 \varepsilon_{t+1}$$

$$f_{s,t} = y_{t+s} - \hat{y}_{t+s,t} = \sum_{i=1}^{s} a_1^{i-1} \varepsilon_{t+s+1-i}$$

Note that

$$E(f_{s,t}) = 0, \qquad \operatorname{Var}(f_{s,t}) = \sigma^2 \sum_{i=1}^{s} a_1^{2(i-1)},$$

and

$$\lim_{s \to \infty} \operatorname{Var}(f_{s,t}) = \sigma^2/(1 - a_1^2) = \operatorname{Var}(y_t).$$
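The forecast-error variance rises monotonically in $s$ and levels off at the unconditional variance. A sketch with illustrative parameter values:

```python
# Forecast-error variance of the AR(1): sigma2 * sum_{i=1}^{s} a1**(2*(i-1)),
# which increases in s toward the unconditional variance sigma2/(1 - a1**2).
a1, sigma2 = 0.6, 1.0

def fe_var(s):
    return sigma2 * sum(a1**(2 * (i - 1)) for i in range(1, s + 1))

print(fe_var(1))                          # one-step error variance = sigma2
print(fe_var(50), sigma2 / (1 - a1**2))   # long-run limit = Var(y_t)
```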

Now consider the general AR(p) model

$$y_t = a_0 + a_1 y_{t-1} + \cdots + a_p y_{t-p} + \varepsilon_t \qquad (*)$$

$$E(y_t) = a_0/(1 - a_1 - \cdots - a_p)$$

$\operatorname{Var}(y_t)$? $\operatorname{Cov}(y_t, y_{t-s})$? Let $\gamma_s = \operatorname{Cov}(y_t, y_{t-s})$.

WLOG, assume $a_0 = 0$.

1. Multiply both sides of (*) by $y_t$, take expectations, and note that $\varepsilon_t$ is uncorrelated with $y_{t-s}$, $s > 0$:

$$\gamma_0 = a_1 \gamma_1 + \cdots + a_p \gamma_p + \sigma^2$$

2. Multiply both sides of (*) by $y_{t-1}$ and take expectations:

$$\gamma_1 = a_1 \gamma_0 + \cdots + a_p \gamma_{p-1}$$

….

p+1. Multiply both sides of (*) by $y_{t-p}$ and take expectations:

$$\gamma_p = a_1 \gamma_{p-1} + \cdots + a_p \gamma_0$$

This provides a set of p+1 linear equations, called the Yule-Walker equations, in the p+1 unknowns $\gamma_0, \ldots, \gamma_p$, which can be solved given $a_1, \ldots, a_p$ and $\sigma_\varepsilon^2$.

Once $\gamma_0, \ldots, \gamma_p$ have been determined, $\gamma_s$, $s > p$, can be determined recursively:

$$\gamma_s = a_1 \gamma_{s-1} + \cdots + a_p \gamma_{s-p}$$
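For a concrete instance, the Yule-Walker system for an AR(2) is three linear equations in $\gamma_0, \gamma_1, \gamma_2$, which a linear solver handles directly. A sketch assuming NumPy, with illustrative (stationary) coefficients:

```python
import numpy as np

# Yule-Walker equations for an AR(2) (a0 = 0 WLOG):
#   gamma0 = a1*gamma1 + a2*gamma2 + sigma2
#   gamma1 = a1*gamma0 + a2*gamma1
#   gamma2 = a1*gamma1 + a2*gamma0
# Coefficient values are illustrative and satisfy stationarity.
a1, a2, sigma2 = 0.5, 0.3, 1.0

A = np.array([[1.0, -a1, -a2],
              [-a1, 1.0 - a2, 0.0],
              [-a2, -a1, 1.0]])
b = np.array([sigma2, 0.0, 0.0])
g0, g1, g2 = np.linalg.solve(A, b)

# Autocovariances beyond lag p = 2 follow the recursion
# gamma_s = a1*gamma_{s-1} + a2*gamma_{s-2}.
gammas = [g0, g1, g2]
for s in range(3, 8):
    gammas.append(a1 * gammas[s - 1] + a2 * gammas[s - 2])
print(gammas)
```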

## Constructing Impulse Response Functions and Forecasting with the AR(p) Model

### One Approach: Recursive Construction

Consider, for example, the AR(2) model:

$$y_t = a_0 + a_1 y_{t-1} + a_2 y_{t-2} + \varepsilon_t$$

$$\hat{y}_{t+1,t} = a_0 + a_1 y_t + a_2 y_{t-1}$$

$$\hat{y}_{t+2,t} = a_0 + a_1 \hat{y}_{t+1,t} + a_2 y_t$$

$$\hat{y}_{t+s,t} = a_0 + a_1 \hat{y}_{t+s-1,t} + a_2 \hat{y}_{t+s-2,t}$$
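The recursion above can be sketched in a few lines: each step substitutes earlier forecasts for the unknown future $y$'s. The coefficients and the two most recent observations are illustrative values.

```python
# Recursive s-step forecasts for an AR(2); coefficients and the two most
# recent observations (y_t, y_{t-1}) are illustrative.
a0, a1, a2 = 0.5, 0.5, 0.3
y_t, y_tm1 = 2.0, 1.5

def forecasts(s_max):
    prev2, prev1 = y_tm1, y_t           # y_{t-1}, y_t
    out = []
    for _ in range(s_max):
        nxt = a0 + a1 * prev1 + a2 * prev2
        out.append(nxt)                 # yhat_{t+s,t}
        prev2, prev1 = prev1, nxt       # forecasts replace unknown future y's
    return out

print(forecasts(3))
```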

### A More Efficient Approach

Rewrite the 2nd-order autoregression as a 1st-order, 2-dimensional vector autoregression:

$$\begin{bmatrix} y_t \\ y_{t-1} \end{bmatrix} = \begin{bmatrix} a_0 \\ 0 \end{bmatrix} + \begin{bmatrix} a_1 & a_2 \\ 1 & 0 \end{bmatrix} \begin{bmatrix} y_{t-1} \\ y_{t-2} \end{bmatrix} + \begin{bmatrix} \varepsilon_t \\ 0 \end{bmatrix}$$

or, in matrix notation,

$$Y_t = A_0 + A_1 Y_{t-1} + e_t$$

where $Y_t = [y_t \;\; y_{t-1}]'$, $A_0 = [a_0 \;\; 0]'$, $A_1 = \begin{bmatrix} a_1 & a_2 \\ 1 & 0 \end{bmatrix}$, and $e_t = [\varepsilon_t \;\; 0]'$.

Then, the $s$-step ahead forecast of $Y$ formed at time $t$ is

$$\hat{Y}_{t+s,t} = (I + A_1 + \cdots + A_1^{s-1}) A_0 + A_1^s Y_t$$

and the $s$-step ahead forecast of $y$ formed at time $t$, $\hat{y}_{t+s,t}$, is the first element of $\hat{Y}_{t+s,t}$.

Note that this can easily be extended to the general p-th order case:

$$Y_t = A_0 + A_1 Y_{t-1} + e_t$$

where $Y_t = [y_t \;\; y_{t-1} \;\; \cdots \;\; y_{t-p+1}]'$, $A_0 = [a_0 \;\; 0 \;\; \cdots \;\; 0]'$, and $e_t = [\varepsilon_t \;\; 0 \;\; \cdots \;\; 0]'$ are all $p \times 1$, and $A_1$ is the $p \times p$ matrix

$$A_1 = \begin{bmatrix} a_1 & a_2 & \cdots & a_{p-1} & a_p \\ 1 & 0 & \cdots & 0 & 0 \\ 0 & 1 & \cdots & 0 & 0 \\ \vdots & & \ddots & & \vdots \\ 0 & 0 & \cdots & 1 & 0 \end{bmatrix}$$
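The companion-form forecast can be checked against the step-by-step recursion: both must give the same numbers. A sketch assuming NumPy, with the same illustrative AR(2) values as before:

```python
import numpy as np

# Companion (VAR(1)) form of an AR(2): Y_t = A0 + A1*Y_{t-1} + e_t, and the
# matrix forecast Yhat_{t+s,t} = (I + A1 + ... + A1**(s-1))*A0 + A1**s * Y_t.
# All numeric values are illustrative.
a0, a1, a2 = 0.5, 0.5, 0.3
y_t, y_tm1 = 2.0, 1.5

A0 = np.array([a0, 0.0])
A1 = np.array([[a1, a2],
               [1.0, 0.0]])
Y_t = np.array([y_t, y_tm1])

def companion_forecast(s):
    # (I + A1 + ... + A1^(s-1)); matrix_power(A1, 0) is the identity
    acc = sum(np.linalg.matrix_power(A1, i) for i in range(s))
    return (acc @ A0 + np.linalg.matrix_power(A1, s) @ Y_t)[0]

print(companion_forecast(1))   # equals a0 + a1*y_t + a2*y_{t-1}
```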
