Time Series Analysis and Structural Change Detection


Kwok Pan Pang
DOI: 10.4018/978-1-60566-908-3.ch015

Abstract

Most research on time series analysis and forecasting assumes no structural change, which implies that the mean and variance of the parameters in the time series model are constant over time. When a structural change does occur in the data, methods built on this assumption are no longer appropriate, and a different approach is needed. Almost all time series analysis and forecasting methods assume that the structure is consistent and stable over time, so that the full data set can be used for prediction and analysis. When a structural change occurs in the middle of the series, any result or forecast drawn from the full data set will be misleading. Structural change is quite common in the real world: in a study of a very large set of macroeconomic time series representing the ‘fundamentals’ of the US economy, Stock and Watson (1996) found evidence of structural instability in the majority of the series. Ignoring structural change also reduces prediction accuracy. Pesaran and Timmermann (2003), Hansen (2001) and Clements and Hendry (1998, 1999) showed that structural change is pervasive in time series data, and that ignoring the structural breaks that often occur significantly reduces forecast accuracy and leads to misleading or wrong conclusions. This chapter mainly focuses on introducing the most common time series methods. The author highlights the problems of applying them to real situations with structural changes, briefly introduces some existing structural change methods, and demonstrates how to apply structural change detection in time series decomposition.

Common Data Analysis Or Forecasting Methods

When introducing the following methods, we assume the time series data is presented as xt, t = 1, 2, 3, …, N, where xt is the observation of the variable x at time t, and N is the number of observations.

ARIMA Model

The ARIMA methodology proposed by Box and Jenkins (1976) is one of the most popular methods for uncovering the hidden characteristics of time series data and for generating forecasts. The model is built from the plots of the autocorrelation and partial autocorrelation functions of the dependent time series; these plots indicate which autoregressive or moving average components should be included in the model. Basically, the ARIMA(p, q) model combines two independent processes: the autoregressive process AR(p) and the moving average process MA(q).

Autoregressive Process

The autoregressive process, denoted AR(p), can be interpreted as a linear combination of prior observations. AR(p) can be summarized as:

xt = ζ + φ1xt−1 + φ2xt−2 + … + φpxt−p + εt (1)

where p is the order of the autoregressive model, ζ is the constant of the model, and φ1, φ2, …, φp are the autoregressive model parameters.
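As a minimal sketch of equation (1), the AR(p) recursion can be simulated in plain Python; the order, constant ζ, and coefficients φ below are illustrative values chosen for the example, not taken from the chapter.

```python
import random

def simulate_ar(phi, zeta, n, sigma=1.0, seed=42):
    """Simulate n observations from an AR(p) process
    x_t = zeta + phi_1*x_{t-1} + ... + phi_p*x_{t-p} + eps_t,
    with eps_t ~ N(0, sigma^2) and zero initial conditions."""
    rng = random.Random(seed)
    p = len(phi)
    x = [0.0] * p  # initial conditions (dropped before returning)
    for _ in range(n):
        eps = rng.gauss(0.0, sigma)
        x_next = zeta + sum(phi[j] * x[-1 - j] for j in range(p)) + eps
        x.append(x_next)
    return x[p:]

# Illustrative AR(2): zeta = 1.0, phi = (0.5, -0.3)
series = simulate_ar([0.5, -0.3], zeta=1.0, n=200)
```

For a stationary AR(p), the long-run mean is ζ / (1 − φ1 − … − φp); here that is 1.0 / (1 − 0.5 + 0.3) = 1.25, which the sample mean should roughly track.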

Moving Average Process

It can be represented as MA(q). In the moving average model, the observation is affected by its previous errors. MA(q) can be written as:

xt = μ + εt − θ1εt−1 − θ2εt−2 − … − θqεt−q (2)

where q is the order of the moving average model, μ is a constant, and θ1, θ2, …, θq are the moving average model parameters.
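Equation (2) can be sketched the same way; the coefficients θ and constant μ below are illustrative values, not values from the chapter.

```python
import random

def simulate_ma(theta, mu, n, sigma=1.0, seed=7):
    """Simulate n observations from an MA(q) process
    x_t = mu + eps_t - theta_1*eps_{t-1} - ... - theta_q*eps_{t-q},
    with eps_t ~ N(0, sigma^2)."""
    rng = random.Random(seed)
    q = len(theta)
    # Pre-generate q extra errors so every x_t has a full error history.
    eps = [rng.gauss(0.0, sigma) for _ in range(n + q)]
    x = []
    for t in range(q, n + q):
        x.append(mu + eps[t] - sum(theta[j] * eps[t - 1 - j] for j in range(q)))
    return x

# Illustrative MA(2): mu = 0.0, theta = (0.4, 0.2)
series = simulate_ma([0.4, 0.2], mu=0.0, n=500)
```

Unlike the AR case, an MA(q) process is always stationary with mean μ, since each observation is a finite combination of zero-mean errors.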

We suppose the stochastic process {xt : t = 0, ±1, ±2, …} has mean function defined as μt = E(xt) for t = 0, ±1, ±2, …. In general, μt differs at different t. The autocorrelation is defined as:

Corr(xt, xs) = Cov(xt, xs) / √(Var(xt) Var(xs)) (3)

where Cov(xt, xs) is the autocovariance function,

Cov(xt, xs) = E[(xt − μt)(xs − μs)] = E(xtxs) − μtμs
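In practice, equation (3) is estimated from data at a given lag k by dividing the sample autocovariance at lag k by the sample variance (the lag-0 autocovariance). A minimal sketch:

```python
def sample_autocorr(x, k):
    """Sample autocorrelation at lag k: the sample autocovariance
    Cov(x_t, x_{t-k}) divided by the sample variance Cov(x_t, x_t)."""
    n = len(x)
    mean = sum(x) / n
    var = sum((v - mean) ** 2 for v in x) / n
    cov_k = sum((x[t] - mean) * (x[t - k] - mean) for t in range(k, n)) / n
    return cov_k / var

# A strongly trending series is highly autocorrelated at lag 1.
r1 = sample_autocorr(list(range(100)), 1)
```

At lag 0 the estimate is exactly 1, since the autocovariance equals the variance; for the trending series above, r1 is close to 1.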

The partial autocorrelation at lag k, ϕkk, is defined as:

ϕkk = Corr(xt, xt−k | xt−1, xt−2, …, xt−k+1)

which can be interpreted as the correlation between xt and xt−k after removing the effect of the intervening variables xt−1, xt−2, …, xt−k+1.
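The partial autocorrelations ϕkk can be computed from the autocorrelations; one standard route is the Durbin-Levinson recursion. The sketch below assumes the autocorrelation values are already available (here taken from the theoretical ACF of an AR(1) with φ = 0.6, an illustrative choice, not an example from the chapter).

```python
def pacf_durbin_levinson(acf):
    """Partial autocorrelations phi_kk from autocorrelations acf[0..K]
    (acf[0] must be 1) via the Durbin-Levinson recursion."""
    K = len(acf) - 1
    pacf = [1.0]   # lag 0 by convention
    phi = {}       # phi[(k, j)]: j-th coefficient of the order-k predictor
    for k in range(1, K + 1):
        if k == 1:
            phi[(1, 1)] = acf[1]
        else:
            num = acf[k] - sum(phi[(k - 1, j)] * acf[k - j] for j in range(1, k))
            den = 1.0 - sum(phi[(k - 1, j)] * acf[j] for j in range(1, k))
            phi[(k, k)] = num / den
            # Update the lower-order coefficients for the next iteration.
            for j in range(1, k):
                phi[(k, j)] = phi[(k - 1, j)] - phi[(k, k)] * phi[(k - 1, k - j)]
        pacf.append(phi[(k, k)])
    return pacf

# For an AR(1) with phi = 0.6, acf(k) = 0.6**k and the PACF cuts off after lag 1.
pacf = pacf_durbin_levinson([0.6 ** k for k in range(4)])
```

This cut-off behaviour is exactly why the PACF plot is used to choose the AR order p in the Box-Jenkins approach: for an AR(p) process, ϕkk = 0 for all k > p.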
