Time Series (Stochastic Process) - Iowa State University

By Pauline King, 2014-03-21 15:06

Linear Time Series Models

    A (discrete) time series [a (discrete) stochastic process] is a sequence of random numbers (or vectors) indexed by the integers:

    {y_t}, t = 0, 1, 2, … :  y_0, y_1, y_2, …

    {y_t}, t = 1, 2, 3, … :  y_1, y_2, y_3, …

    {y_t}, t = …, -1, 0, 1, … :  …, y_{-1}, y_0, y_1, …

    The objective of time series analysis is to infer the characteristics of the stochastic process from a data sample (i.e., a partial realization of the process) and any additional information we have about the process.

    The characteristics of a sequence of random variables:

    • Finite-dimensional joint distributions (the "fidi's")

    • Moments of the fidi's (means, variances, covariances, …)


    In order to be able to use data to draw inferences about the stochastic process that generated these data, there have to be some characteristics of the stochastic process that remain stable over time: e.g., the mean, the variance, …

A covariance stationary stochastic process is a stochastic process with the following properties:

    • E(y_t) = μ for all t

    • Var(y_t) = σ² for all t

    • Cov(y_t, y_{t-s}) = γ_s for all t, s
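As an illustrative sketch (not from the notes; the AR(1) design and NumPy usage are my own choices), we can simulate a covariance stationary process and check that the sample mean and variance computed over two halves of a long realization agree, as the definition implies:

```python
import numpy as np

# Simulate a stationary AR(1): y_t = 0.5*y_{t-1} + eps_t.
# Covariance stationarity implies a common mean and variance over time,
# so sample moments from different stretches of the series should agree.
rng = np.random.default_rng(0)
T = 200_000
phi = 0.5
eps = rng.standard_normal(T)
y = np.empty(T)
y[0] = eps[0]
for t in range(1, T):
    y[t] = phi * y[t - 1] + eps[t]

first, second = y[: T // 2], y[T // 2:]
mean_gap = abs(first.mean() - second.mean())   # should be near zero
var_gap = abs(first.var() - second.var())      # should be near zero
theory_var = 1 / (1 - phi**2)                  # stationary variance, 4/3
```

The theoretical stationary variance 1/(1 - φ²) gives a second check on the simulation.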

A strictly stationary stochastic process is a stochastic process whose fidi's are time invariant:

    P(y_{t_1} ≤ α_1, …, y_{t_n} ≤ α_n) = P(y_{t_1+m} ≤ α_1, …, y_{t_n+m} ≤ α_n)

    for all integers m, n, t_1, …, t_n and real numbers α_1, …, α_n.

    • Although covariance stationarity and strict stationarity are not precisely the same conditions and neither one implies the other, for most practical purposes we can think of them interchangeably and, in particular, as implying that the sequence of r.v.'s has a common mean, variance, and stable covariance structure.

    • In theoretical time series work, strict stationarity turns out to be the more useful concept; in applied work, covariance stationarity tends to be the more useful one.

    • In evaluating whether a data sample is drawn from a stationary process, we tend to look at whether the mean and variance appear to be fixed over time and whether the covariance structure appears stable over time. The most obvious failure of stationarity is a time trend in the mean.

    • Although most time series models assume stationarity, many (most?) economic time series appear to be nonstationary. However, there are often simple transformations of these series that can be plausibly assumed to be stationary: log transformations and, possibly, linear detrending or first-differencing.
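A minimal sketch of such a transformation (the series and its parameters are my own invention, not from the notes): a series with an exponential trend is clearly nonstationary, but its log first difference has a stable mean.

```python
import numpy as np

# A nonstationary series with an exponential deterministic trend and
# multiplicative noise; its log first difference removes the trend.
rng = np.random.default_rng(1)
T = 500
trend = np.exp(0.02 * np.arange(T))                # exponential growth
y = trend * np.exp(0.1 * rng.standard_normal(T))   # multiplicative noise
dlog_y = np.diff(np.log(y))                        # log first difference

# the raw series drifts upward; the transformed series does not
raw_drift = y[-100:].mean() - y[:100].mean()
transformed_drift = dlog_y[-100:].mean() - dlog_y[:100].mean()
```

Here the log first difference fluctuates around the constant growth rate 0.02, which is why it can plausibly be treated as stationary.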

If y_t is a stationary process then, according to the Wold Decomposition Theorem, it has a moving average (MA) representation:

    y_t = μ + ε_t + c_1 ε_{t-1} + c_2 ε_{t-2} + …

    • the c_i's are square summable (c_0 = 1, Σ c_i² < ∞)

    • the ε_t's are white noise

    • the ε_t's are the innovations in the y_t's

If y_t has an MA(∞) form and that MA is invertible, then y_t has a finite-order autoregressive representation (AR(p)):

    y_t = a_0 + a_1 y_{t-1} + … + a_p y_{t-p} + ε_t

    • the ε_t's are the innovations in the y_t's

    • the parameters a_1, …, a_p meet the stationarity condition, i.e., the roots of the characteristic equation

        1 - a_1 z - … - a_p z^p = 0

      are strictly greater than one in modulus.
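The stationarity condition above can be checked numerically. A sketch (the AR(2) coefficients are my own example): find the roots of the characteristic polynomial and verify they all lie strictly outside the unit circle.

```python
import numpy as np

# Check the AR stationarity condition for 1 - a1*z - ... - ap*z^p = 0:
# all roots must exceed one in modulus.
a = [1.1, -0.3]  # AR(2) coefficients a_1, a_2

# np.roots expects the highest-degree coefficient first,
# so the polynomial is [-a_p, ..., -a_1, 1].
poly = np.concatenate(([-c for c in a[::-1]], [1.0]))
roots = np.roots(poly)
is_stationary = bool(np.all(np.abs(roots) > 1))
```

For a_1 = 1.1, a_2 = -0.3 the roots are 2 and 5/3, so the condition holds.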

    Linear autoregressions (or, linear stochastic difference equations) are straightforward to estimate and form a very useful model for univariate forecasting.
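A minimal sketch of how straightforward that estimation is (the simulation design is mine, not from the notes): regress y_t on a constant and its own lags by ordinary least squares.

```python
import numpy as np

# Simulate a stationary AR(2) and recover its coefficients by OLS.
rng = np.random.default_rng(3)
T = 50_000
a1, a2 = 0.5, 0.2
eps = rng.standard_normal(T)
y = np.zeros(T)
for t in range(2, T):
    y[t] = a1 * y[t - 1] + a2 * y[t - 2] + eps[t]

# Regressors: constant, y_{t-1}, y_{t-2}; target: y_t for t = 2, ..., T-1.
X = np.column_stack([np.ones(T - 2), y[1:-1], y[:-2]])
coef, *_ = np.linalg.lstsq(X, y[2:], rcond=None)
# coef is approximately [0, 0.5, 0.2]
```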

    While the univariate autoregression clearly accounts for the persistence in the (conditional mean of the) y’s, it does not account for the interdependence between the y’s and other time series.

    A simple and natural extension of the univariate autoregressive model is the vector autoregressive (VAR) model:

     Y_t = A_1 Y_{t-1} + … + A_p Y_{t-p} + ε_t


    • Y_t is an n-dimensional jointly stationary process

    • A_1, …, A_p are n×n matrices satisfying the stationarity condition. That is, the solutions to the determinantal equation

        det(I - A_1 z - … - A_p z^p) = 0

      exceed one in modulus.

    • ε_t is n-dimensional white noise with covariance matrix Σ_ε and is the innovation in Y_t.
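Like the univariate autoregression, the VAR is straightforward to estimate by least squares. A sketch under my own choice of a stable bivariate VAR(1) (the matrix A1 is illustrative, not from the notes):

```python
import numpy as np

# Simulate a bivariate VAR(1), Y_t = A1 @ Y_{t-1} + eps_t, and recover
# A1 by equation-by-equation OLS (regress Y_t on Y_{t-1}).
rng = np.random.default_rng(4)
T = 50_000
A1 = np.array([[0.5, 0.1],
               [0.2, 0.3]])   # eigenvalues inside the unit circle
Y = np.zeros((T, 2))
eps = rng.standard_normal((T, 2))
for t in range(1, T):
    Y[t] = A1 @ Y[t - 1] + eps[t]

# Stacked regression: Y[1:] ≈ Y[:-1] @ A1.T, so lstsq returns A1.T.
A1_hat = np.linalg.lstsq(Y[:-1], Y[1:], rcond=None)[0].T
```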
