next up previous
Next: Discussions Up: Auto-Regression Fractionally-Integrated Moving-Average Models Previous: Definitions

Minimization of Residuals

We define the residuals by the recursion

$\displaystyle \epsilon_1 = w_1,$
$\displaystyle \epsilon_2 = w_2 - a_1 w_1 + b_1 \epsilon_1,$
$\displaystyle \vdots$
$\displaystyle \epsilon_t = w_t - a_1 w_{t-1} - \dots - a_p w_{t-p} + b_1 \epsilon_{t-1} + \dots + b_q \epsilon_{t-q}.$     (42)

Then the sum
$\displaystyle f(x)=\log f_m(x),\ \ f_m(x)= \sum_{t=1}^T \epsilon_t^2$     (43)

is minimized.

The logarithm is used to reduce the variation of the objective by improving its scaling. The objective $f_m(x)$ depends on $m=p+q+1$ unknown parameters, represented as an $m$-dimensional vector $x=(x_k,\ k=1,\dots,m)=(a_i,\ i=1,\dots,p,\ b_j,\ j=1,\dots,q,\ d)$.
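As an illustration, the recursion (42) and the objective (43) can be sketched in Python. The function names `residuals` and `objective` are hypothetical, and terms with indices before the start of the sample are simply dropped, which is one common way to truncate the recursion:

```python
import math

def residuals(w, a, b):
    """Residuals of an ARMA(p, q) fit, following the recursion in (42).

    w : observed series w_1, ..., w_T (0-indexed here)
    a : AR parameters a_1, ..., a_p
    b : MA parameters b_1, ..., b_q
    Terms with indices before the start of the sample are dropped.
    """
    p, q = len(a), len(b)
    eps = []
    for t in range(len(w)):
        e = w[t]
        # subtract the AR part: a_1 w_{t-1} + ... + a_p w_{t-p}
        e -= sum(a[i] * w[t - 1 - i] for i in range(min(p, t)))
        # add the MA part: b_1 eps_{t-1} + ... + b_q eps_{t-q}
        e += sum(b[j] * eps[t - 1 - j] for j in range(min(q, t)))
        eps.append(e)
    return eps

def objective(w, a, b):
    """Log of the sum of squared residuals, f(x) = log f_m(x) as in (43)."""
    return math.log(sum(e * e for e in residuals(w, a, b)))
```

For example, with $w=(1, 2)$, $a_1=0.5$, $b_1=0$, the recursion gives $\epsilon_1 = 1$ and $\epsilon_2 = 2 - 0.5 \cdot 1 = 1.5$.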

It is easy to see from (1.44), (1.39), and (1.37) that the residuals $\epsilon_t$ are linear functions of the parameters $a_i$. This means that the minimum conditions

$\displaystyle {\partial f_m(x) \over \partial a_i} =0,\ i=1,...,p$     (44)

are given by a system of linear equations that defines the estimates of the parameters $a_i=a_i(b, d)$ as functions of the parameters $b_i,\ i=1,\dots,q$, and $d$. This reduces the number of parameters of the non-linear optimization to $n=q+1$.
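A minimal sketch of this step, under the same truncation assumption as above (the helper name `fit_ar_part` and the use of NumPy least squares are my own choices, not the author's): writing $\epsilon_t = c_t + \sum_i a_i g_{t,i}$, both $c_t$ and the sensitivities $g_{t,i} = \partial\epsilon_t/\partial a_i$ obey the same recursion as $\epsilon_t$, so the conditions (44) reduce to the linear least-squares problem $\min_a \lVert c + G a \rVert^2$:

```python
import numpy as np

def fit_ar_part(w, b, p):
    """Estimate a = a(b) for fixed MA parameters b.

    Since the residuals in (42) are linear in a, split eps_t as
    c_t + sum_i a_i * g_{t,i} and solve the resulting least-squares
    problem exactly.  (The parameter d is assumed already absorbed
    into the filtered series w.)
    """
    T, q = len(w), len(b)
    c = np.zeros(T)        # part of eps_t that does not depend on a
    g = np.zeros((T, p))   # g[t, i] = d eps_t / d a_{i+1}
    for t in range(T):
        c[t] = w[t] + sum(b[j] * c[t - 1 - j] for j in range(min(q, t)))
        for i in range(min(p, t)):
            g[t, i] = -w[t - 1 - i] + sum(
                b[j] * g[t - 1 - j, i] for j in range(min(q, t)))
    # normal equations of min_a ||c + G a||^2, solved via least squares
    return np.linalg.lstsq(g, -c, rcond=None)[0]
```

For a purely autoregressive series generated exactly by $w_t = 0.5\, w_{t-1}$, the recovered coefficient is $a_1 = 0.5$.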

The system

$\displaystyle {\partial f_m(x) \over \partial b_i} =0,\ i=1,...,q$     (45)

may have multiple solutions, because the residuals $\epsilon_t$ depend on $b_i$ as polynomials of degree $T-1$.

The equation

$\displaystyle {\partial f_m(x) \over \partial d} =0$     (46)

may also have multiple solutions, because the residuals depend on $d$ as polynomials of degree $R$, where $R$ is a truncation parameter.

These observations imply that, in general, the objective $f_m(x)$ is a multimodal function of the parameters $d$ and $b_i,\ i=1,\dots,q$. Therefore, one has to consider methods of global optimization (see [14]).
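One simple global strategy, sketched below, is multi-start local search: because a single local descent can stall in one of the local minima of a multimodal $f_m$, the search is restarted from many random points in the $(b, d)$ box and the best result is kept. This is only an illustrative stand-in (the function name, the random-perturbation local step, and all default parameters are my own); the Bayesian global methods of [14] are more refined:

```python
import math
import random

def multistart_minimize(f, bounds, n_starts=50, n_local=200, step=0.1, seed=0):
    """Multi-start random search over a box.

    f      : objective, e.g. x = (b_1, ..., b_q, d) -> log f_m(x)
    bounds : list of (lo, hi) pairs, one per parameter
    Each start does a crude local search by Gaussian perturbation,
    accepting only improving moves.
    """
    rng = random.Random(seed)
    best_x, best_f = None, math.inf
    for _ in range(n_starts):
        x = [rng.uniform(lo, hi) for lo, hi in bounds]
        fx = f(x)
        for _ in range(n_local):
            # perturb, clipping back into the box
            y = [min(max(xi + rng.gauss(0.0, step), lo), hi)
                 for xi, (lo, hi) in zip(x, bounds)]
            fy = f(y)
            if fy < fx:          # keep only improving moves
                x, fx = y, fy
        if fx < best_f:
            best_x, best_f = x, fx
    return best_x, best_f
```

In practice the inner local search would be replaced by a proper descent method, with the outer restarts supplying the global coverage.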

Denote

$\displaystyle f(x)= \log f_{q+1}(x)$     (47)

where
$\displaystyle f_{q+1}(x)= f_m(x),\ x_j=b_j,\ j=1,\dots,q,\ x_{q+1} =d,\ x_{q+1+i} =a_i(b,d),\ i=1,\dots,p.$      

This means that by condition (1.46) we define those $x$-components that represent the parameters $a_i,\ i=1,\dots,p$.

The variance $\sigma^2$ does not appear in expressions (1.49) and (1.44). If necessary, it can be estimated by other well-known techniques.
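One common such technique (an assumption on my part, not specified in the text) is the residual-based estimate $\hat\sigma^2 = f_m(\hat x)/T$ once the parameters have been fitted:

```python
def variance_estimate(eps):
    """Estimate sigma^2 as the mean squared residual, f_m(x)/T.

    A degrees-of-freedom correction, dividing by T - m instead of T,
    is also common when m parameters have been fitted.
    """
    T = len(eps)
    return sum(e * e for e in eps) / T
```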


mockus 2008-06-21