
Stabilization of Structures of Time Series

Traditional time series models assume that their parameters do not change. Examples are the parameters $a,b$ of the ARMA model, the parameters $a,b,d$ of the ARFIMA model, the parameters $a,b,c$ of the BL model, and the parameters $a,\beta, \sigma,\mu$ of the ANN model.

The variability of the parameters of models describing financial data, such as stock prices and currency exchange rates, can be partially explained by changing economic conditions. However, this variability persists even under stable economic conditions. The reason is feedback: it is well known that predictions may influence supply and demand and, consequently, the future data. In such cases the statistical best-fit models should be complemented or replaced by game-theoretic equilibrium models. However, it is possible that the equilibrium model would be precisely the Wiener process. That would mean that market rates in economics are governed by a model similar to Brownian motion in physics and are as unpredictable as the movement of individual molecules in a gas. Table 1.1 shows that the simplest Wiener process, called the Random Walk (RW), predicts the financial data at least as well as ARMA models.

The objective of traditional time series models is to find parameters that minimize the deviation from the available data. One may call them best-fit models. The goodness of fit is described by continuous parameters $C$ called state variables. For example, in the ARMA model (see expression (1.1)) the state variables are $C=(a_i,\ i=1,...,p,\ b_j,\ j=1,...,q)$.

If the parameters remain constant in the future, then models that fit best to the past data will predict the future data as well. Otherwise, the best fit to the past data can be irrelevant or even harmful for predictions.

Models are needed that are not sensitive to changes of the system parameters. Such models may predict the uncertain future better by eliminating the nuisance parts of the model structure.

To address this problem we introduce the notion of model structure. The model structure is determined by Boolean parameters $S$ called structural variables. A structural variable is equal to one if the corresponding component of the time series model is included; otherwise it is equal to zero. For example, in the ARMA model $S=(s_i^a,\ i=1,...,p,\ s_j^b,\ j=1,...,q)$. Here $s_i^a=1$ if the parameter $a_i$ is included in the ARMA model and $s_i^a=0$ otherwise[*]. We search for the structure $S$ of the model that minimizes the prediction errors in a changing environment.
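As an illustration (the AR form and all names here are our own, not the book's), a Boolean structure vector can gate the terms of an autoregressive model, so that setting $s_i=0$ removes the $i$-th lag entirely:

```python
import numpy as np

def ar_predict(w, a, s):
    """One-step AR(p) predictions where the Boolean structural
    variables s gate the terms: only lags with s[i] == 1 enter."""
    p = len(a)
    preds = np.empty(len(w) - p)
    for t in range(p, len(w)):
        # sum_i s_i * a_i * w_{t-i}; s_i = 0 eliminates the term
        preds[t - p] = sum(s[i] * a[i] * w[t - 1 - i] for i in range(p))
    return preds

# Illustrative AR(3) parameters; the structure keeps only lags 1 and 3
a = np.array([0.6, 0.2, 0.1])
s = np.array([1, 0, 1])          # s_2 = 0 excludes a_2 from the model
w = np.array([1.0, 2.0, 3.0, 4.0])
print(ar_predict(w, a, s))       # uses 0.6*w[t-1] + 0.1*w[t-3] only
```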

To achieve this we divide the available data $W=(w_t, \ t=1,...,T)$ into two parts, $W_0=(w_t,\ t=1,...,T_0)$ and $W_1=(w_t,\ t=T_0+1,...,T)$.

The first part $W_0$ is used to estimate the continuous parameters $C=C(S)$, which depend on the Boolean structural parameters $S$. The estimation is performed for the set of all feasible $S$ by minimizing the least-squares deviation.

The second part $W_1$ is used to select the $S$ that minimizes the least-squares deviation. This means that the second part $W_1$ serves for the estimation of the Boolean structural parameters.
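A minimal sketch of this two-part split (the series values and the split point are illustrative):

```python
import numpy as np

# Toy series of T = 10 observations, split at T0 = 6
w = np.linspace(1.0, 10.0, 10)
T0 = 6
W0, W1 = w[:T0], w[T0:]   # W0 -> estimate C(S); W1 -> select S
print(len(W0), len(W1))   # 6 4
```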

Denote by $R_t(S,C,W)$ the predicted value of a model $R$ with fixed parameters $S,C$ using the data $(w_1,...,w_{t-1})\subset W$. The difference between the prediction and the actual data $w_t$ is denoted by $\epsilon_t(S,C,W)=w_t-R_t(S,C,W)$. Denote by $C_0(S)$ the fitting parameters $C$ that minimize the sum of squared deviations $\Delta_{0,0}(C,S)$ on the first data set $W_0$ at fixed structural parameters $S$:

$\displaystyle \Delta_{0,0}(C,S)= \sum_{t=1}^{T_0} \epsilon_t^2(S,C,W),$     (48)
$\displaystyle C_{0}(S)= \arg \min_C \Delta_{0,0}(C,S).$     (49)
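Expressions (48)-(49) can be sketched for an AR model, where minimizing $\Delta_{0,0}(C,S)$ over the active coefficients is an ordinary least-squares problem. The function name, the AR form, and the synthetic data are our own illustrative assumptions:

```python
import numpy as np

def fit_C(w0, s):
    """C_0(S): least-squares AR coefficients on the first data set W0,
    restricted to the lags whose structural variable s[i] is 1."""
    p = len(s)
    active = [i for i in range(p) if s[i] == 1]
    # Regression w_t ~ sum over active lags of a_i * w_{t-1-i}
    X = np.array([[w0[t - 1 - i] for i in active] for t in range(p, len(w0))])
    y = w0[p:]
    coef = np.linalg.lstsq(X, y, rcond=None)[0]
    a = np.zeros(p)
    a[active] = coef       # excluded lags keep coefficient 0
    return a

rng = np.random.default_rng(0)
# Synthetic AR(1) data: w_t = 0.8 w_{t-1} + noise (a toy example)
w0 = [1.0]
for _ in range(200):
    w0.append(0.8 * w0[-1] + rng.normal(scale=0.1))
w0 = np.array(w0)
a_hat = fit_C(w0, s=[1, 0])   # structure keeps lag 1 only
print(a_hat)                   # a_hat[0] near 0.8, a_hat[1] exactly 0
```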

We stabilize the structure $S$ by minimizing the sum of squared deviations $\Delta_{1,0}(S)$ on the second data set $W_1$, using the fitting parameters $C_{0}(S)$ that were obtained from the first data set:
$\displaystyle \Delta_{1,0}(S)= \sum_{t=T_0+1}^T \epsilon_t^2(S,C_0(S),W),$     (50)
$\displaystyle S_1= \arg \min_S \Delta_{1,0}(S).$     (51)
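For a small number of candidate terms, the minimization (51) can be done by enumerating all Boolean structures: fit $C_0(S)$ on $W_0$, score $\Delta_{1,0}(S)$ on $W_1$, and keep the minimizer. The sketch below assumes an AR(2) candidate model and synthetic data; all names are illustrative:

```python
import numpy as np
from itertools import product

def fit_and_score(w, T0, p=2):
    """Enumerate all Boolean structures S of an AR(p) model: estimate
    C_0(S) on W0 = w[:T0] (eqs. 48-49), then return the structure S_1
    minimizing Delta_{1,0}(S) on W1 = w[T0:] (eqs. 50-51)."""
    best = None
    for s in product([0, 1], repeat=p):
        active = [i for i in range(p) if s[i]]
        a = np.zeros(p)
        if active:
            # Least-squares fit of the active coefficients on W0
            X = np.array([[w[t - 1 - i] for i in active]
                          for t in range(p, T0)])
            y = w[p:T0]
            a[active] = np.linalg.lstsq(X, y, rcond=None)[0]
        # Delta_{1,0}(S): squared one-step errors on W1
        delta = sum((w[t] - sum(a[i] * w[t - 1 - i] for i in range(p))) ** 2
                    for t in range(T0, len(w)))
        if best is None or delta < best[0]:
            best = (delta, s, a)
    return best

rng = np.random.default_rng(1)
w = [1.0, 0.5]
for _ in range(300):
    w.append(0.7 * w[-1] + rng.normal(scale=0.1))  # true process uses lag 1 only
w = np.array(w)
delta1, s1, a1 = fit_and_score(w, T0=200)
print(s1, a1)   # the selected structure should retain lag 1
```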

In this way a trade-off is reached between the fitting parameters and the structural ones. The fitting parameters $C_0(S)$ provide the best fit to the first data set $W_0$ at fixed structure $S$. One stabilizes the structure $S$ by minimizing the prediction errors on the second data set $W_1$ while using the fitting parameters $C_0(S)$ obtained from the first data set $W_0$. In this way the stabilized structure $S=S_1$ of $R$ is obtained by eliminating unstable[*] parameters and parts of the time series model.

We consider two data sets $W_0$ and $W_1$ just for simplicity. One may partition the data $W$ into many subsets $w_t \in W_k,\ t \in T_k,\ k=1,...,K,\ \cup_k W_k=W,\ \cup_k T_k = T$. In this case we minimize the sum

$\displaystyle S_{K}=\arg \min_S \Delta(S),$     (52)

where
$\displaystyle \Delta(S)= \sum _{k,l} \Delta_{k,l}(S).$     (53)

Here
$\displaystyle \Delta_{k,l}(S)= \sum_{t \in T_k} \epsilon_t^2(S,C_l(S),W).$     (54)
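Expressions (52)-(54) resemble a cross-validation scheme: fit $C_l(S)$ on each subset $W_l$, then score it on every subset $W_k$ and sum. A sketch for the AR case, with our own illustrative names and synthetic data:

```python
import numpy as np

def delta_total(w, blocks, s):
    """Delta(S) = sum_{k,l} Delta_{k,l}(S) (eqs. 52-54): fit C_l(S) on
    each subset W_l, then score it on every subset W_k."""
    p = len(s)
    active = [i for i in range(p) if s[i]]
    # C_l(S) for each block l (zero vector if the structure is empty)
    coefs = []
    for (lo, hi) in blocks:
        a = np.zeros(p)
        if active:
            X = np.array([[w[t - 1 - i] for i in active]
                          for t in range(max(lo, p), hi)])
            y = w[max(lo, p):hi]
            a[active] = np.linalg.lstsq(X, y, rcond=None)[0]
        coefs.append(a)
    # Sum Delta_{k,l} over all pairs (k, l)
    total = 0.0
    for a in coefs:                      # index l
        for (lo, hi) in blocks:          # index k
            for t in range(max(lo, p), hi):
                pred = sum(a[i] * w[t - 1 - i] for i in range(p))
                total += (w[t] - pred) ** 2
    return total

rng = np.random.default_rng(2)
w = [1.0, 0.5]
for _ in range(120):
    w.append(0.7 * w[-1] + rng.normal(scale=0.1))
w = np.array(w)
K = 3
edges = np.linspace(0, len(w), K + 1).astype(int)
blocks = list(zip(edges[:-1], edges[1:]))
print(delta_total(w, blocks, s=[1, 0]))
```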

Note that in this case, dividing the data $W$ into many parts, one may obtain sequences $T_k$ too short for a meaningful estimation of the parameters $C_k$ if $T$ is not very large. If $T$ is very large, one may expect that most of the fitting parameters $C_k$ would differ between the data subsets $W_k$ and would be eliminated by the stabilization procedure (1.54). Therefore $K=2$ seems reasonable as a first stabilization attempt; one may try $K>2$ later on.

The usefulness of structural stabilization follows from the observation that an optimal estimate of the time series parameters using one part $W_0$ of the data $W=W_0 \cup W_1$ is optimal for the other part $W_1$ only if all the parameters remain the same. Otherwise one may obtain a better estimate by eliminating the changing parameters from the model. For example, in the case of changing parameters $(a_i,\ i=2,...,p,\ b_j,\ j=1,...,q)$ of the ARMA model, the best prediction may be obtained by eliminating all the parameters except $a_1=1$ (see Table 1.1).
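The following toy experiment (synthetic data of our own, not the book's Table 1.1) illustrates this point: when the AR(1) coefficient changes between the two halves of the series, the coefficient fitted on $W_0$ misleads on $W_1$, while the fixed parameter-free structure $a_1=1$, i.e. the random walk prediction $w_t \approx w_{t-1}$, does not:

```python
import numpy as np

rng = np.random.default_rng(3)
# Regime change: AR(1) coefficient switches from 0.3 to 1.0 (random walk)
w = [1.0]
for t in range(200):
    a = 0.3 if t < 100 else 1.0        # parameter change at mid-series
    w.append(a * w[-1] + rng.normal(scale=0.1))
w = np.array(w)

T0 = 101
# Best-fit AR(1) coefficient on the first half W0 (near 0.3)
X, y = w[:T0 - 1], w[1:T0]
a_hat = float(np.dot(X, y) / np.dot(X, X))

# One-step squared prediction errors on the second half W1
rw_err = np.sum((w[T0:] - w[T0 - 1:-1]) ** 2)          # random walk: a_1 = 1
ar_err = np.sum((w[T0:] - a_hat * w[T0 - 1:-1]) ** 2)  # best-fit model from W0
print(a_hat, rw_err < ar_err)
```

Here the random walk wins out-of-sample even though the fitted model is optimal in-sample, which is exactly the situation the stabilization procedure is designed to detect.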


mockus 2008-06-21