Tuesday, 26 December 2006

st.statistics - E[log(Z_t^2)], proof of convergence with Law of Large Numbers

For every $t$, let $Y_t=\log(Z_t^2)$. Fix some $t$. The sequence $(Y_{t-k})_{k\geqslant0}$ is i.i.d. with common mean $E[Y_1]=E[Y_t]\lt0$, hence the usual (strong) law of large numbers yields $\frac1j\sum\limits_{k=0}^{j-1}Y_{t-k}\to E[Y_1]$ almost surely. Fix some negative $m\gt E[Y_1]$.
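As a concrete instance of the hypothesis $E[Y_1]\lt0$ (the distribution of the $Z_t$ is not specified above, so this is only an illustration): if $Z_t$ is standard normal, then $Z_t^2$ is chi-squared with one degree of freedom and
$$
E[\log(Z_t^2)]=\psi(\tfrac12)+\log2=-\gamma-\log2\approx-1.27\lt0,
$$
where $\psi$ is the digamma function and $\gamma$ is the Euler–Mascheroni constant.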



Then $\frac1j\sum\limits_{k=0}^{j-1}Y_{t-k}\leqslant m$ for every $j$ large enough, that is, for every $j\geqslant J$ where $J$ is random and almost surely finite. In particular, for every $j\geqslant0$, $\sum\limits_{k=0}^{j}Y_{t-k}\leqslant mj+X$ for some almost surely finite random variable $X$: indeed, as soon as $j+1\geqslant J$ one has $\sum\limits_{k=0}^{j}Y_{t-k}\leqslant m(j+1)\leqslant mj$ since $m\lt0$, and the finitely many remaining indices $j$ only add a finite contribution, which defines $X$. This implies the pointwise convergence of the series since
$$
\sum_{j\geqslant0}\exp\left(\sum\limits_{k=0}^{j}Y_{t-k}\right)\leqslant\sum_{j\geqslant0}\mathrm e^X\mathrm e^{mj}=\mathrm e^X(1-\mathrm e^m)^{-1}.
$$
Note that the RHS above is almost surely finite since $m\lt0$ but is not (a priori) uniformly bounded since $X$ may be unbounded.
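To see the argument in action, here is a minimal numerical sketch (not part of the original post), assuming purely for illustration that the $Z_t$ are i.i.d. standard normal; it checks that the empirical mean of $Y_t=\log(Z_t^2)$ is negative and that the partial sums of the series stabilize.

```python
import numpy as np

# Minimal sketch: assume (purely for illustration) that the Z_t are i.i.d.
# standard normal, so Y_t = log(Z_t^2) has mean -gamma - log 2 < 0.
rng = np.random.default_rng(0)
n = 10_000                          # number of terms of the series to sum

Z = rng.standard_normal(n)          # Z_t, Z_{t-1}, ..., Z_{t-n+1}
Y = np.log(Z**2)                    # Y_{t-k} for k = 0, ..., n-1
print("empirical mean of Y:", Y.mean())   # should be close to -1.27

# S_j = sum_{k=0}^{j} Y_{t-k}; the series is sum_j exp(S_j).
S = np.cumsum(Y)
partial_sums = np.cumsum(np.exp(S))
print("partial sums of the series at j = 10, 100, 1000, 10000:")
print(partial_sums[[9, 99, 999, n - 1]])
```

In a typical run the partial sums change only negligibly after the first few dozen terms, consistent with the almost sure convergence proved above.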
