Friday, 21 June 2013

mathematical modeling - Which way is the right way to calculate the autocorrelation function?

I found the following paper, which deals specifically with the artifacts that arise when the autocorrelation must be determined from a set of several finite/short time courses instead of one long time course:



"Some Effects of Finite Sample Size and Persistence on Meteorological Statistics. Part I: Autocorrelations" Kevin E. Trenberth



http://www.cgd.ucar.edu/staff/trenbert/trenberth.papers/i1520-0493-112-12-2369.pdf



It is pretty straightforward about the artifacts that come from short sampling times, and also about how to get rid of them. The paper does not contain much higher math (which I like; others maybe not), but it is widely cited. I found it easier to see how to correct the artifacts from this publication than from some other, "proper math" papers.



In essence, the problems are brought about by the fact that the time series can be too short to contain the full range of fluctuations as well as the full persistence of these fluctuations. This leads to an underestimation of the autocorrelation function.
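One way to see where the downward bias comes from (a standard identity, not something spelled out in the answer): if each course's own sample mean $\bar{x}_n$ is subtracted, the deviations sum to zero, $\sum_{j=1}^{I}(x_n(j)-\bar{x}_n)=0$. Squaring this gives
$0 = \left( \sum_{j=1}^{I} \big(x_n(j)-\bar{x}_n\big) \right)^2 = \sum_{k=-(I-1)}^{I-1} \sum_{j} \big(x_n(j)-\bar{x}_n\big)\big(x_n(j+k)-\bar{x}_n\big)$,
with the inner sum running over all valid $j$. So the lagged-product sums over all nonzero lags must add up to exactly minus the zero-lag sum: the estimated autocorrelation is forced to go negative at some lags no matter how persistent the process really is, and the shorter the course, the stronger this distortion. Using one pooled mean over many courses, as in the recipe below, relaxes this constraint.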



For a set of $N$ time courses $x_n(i)$ ($n=1,\dots,N$, $i=1,\dots,I$) that supposedly stem from the same process, you first calculate the whole-sample mean,
$\overline{x} = \frac{1}{NI}\sum_{n=1}^{N} \sum_{i=1}^{I} x_n(i)$.
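As a minimal sketch in Python/NumPy (the array shapes, variable names, and synthetic placeholder data are my assumptions, not from the answer):

```python
import numpy as np

# N short time courses of length I, stacked as rows of an (N, I) array.
# Synthetic placeholder data; in practice use your measured traces.
rng = np.random.default_rng(0)
x = rng.standard_normal((8, 200))   # N = 8 courses, I = 200 points each

# Whole-sample (pooled) mean over all courses and all time points
x_bar = x.mean()
```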



Next, for every individual time course $n$ and lag number $k$ ($0\leq k\leq I-1$), you calculate the non-normalized autocorrelation function, $\tilde{a}_n(k) = \langle (x_n(j)-\overline{x})(x_n(j+k)-\overline{x}) \rangle_{j=1,\dots,I-k}$, where the deviations are taken from the whole-sample mean $\overline{x}$ computed above rather than from each course's own short-sample mean; this is the heart of the correction.
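Continuing the sketch, a plain (if not the fastest) way to compute $\tilde{a}_n(k)$ for a single course; the function name is my own:

```python
def unnormalized_acf(trace, x_bar, max_lag):
    """tilde{a}_n(k) for k = 0..max_lag: mean lagged product of the
    deviations from the pooled mean x_bar (requires max_lag < len(trace))."""
    d = trace - x_bar                 # deviations from the pooled mean
    I = d.size
    return np.array([(d[:I - k] * d[k:]).mean() for k in range(max_lag + 1)])
```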



The normalized autocorrelation of each individual sample $x_n$ for a lag of $k$ is then $a_n(k) = \tilde{a}_n(k)/\tilde{a}_n(0)$.
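In the sketch, the normalization is just a division by the zero-lag value:

```python
def normalized_acf(trace, x_bar, max_lag):
    """a_n(k) = tilde{a}_n(k) / tilde{a}_n(0)."""
    a_tilde = unnormalized_acf(trace, x_bar, max_lag)
    return a_tilde / a_tilde[0]
```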



Finally, the corrected autocorrelation function of the process that was repeatedly sampled in the different time courses $x_n$ is the average of all individual autocorrelation functions, $a(k) = \langle a_n(k) \rangle_{n=1,\dots,N}$.
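Putting the pieces of the sketch together (the choice of `max_lag`, anything up to $I-1$, is free):

```python
max_lag = 50
a = np.mean([normalized_acf(trace, x_bar, max_lag) for trace in x], axis=0)

# a[k] is the corrected autocorrelation estimate at lag k; a[0] == 1
# for every course by construction, so the average also starts at 1.
```

Note that each course is normalized before averaging, so courses with a larger overall variance do not dominate the final estimate.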



It might not help the person who asked the original question, but it may help others searching for the same thing.
