You are asking, I think, when a Central Limit Theorem holds. The simplest form of the CLT is that the binomial distributions Binomial(n,p), suitably rescaled, converge to a normal distribution as n goes to infinity. (This binomial case is usually not called the CLT, but goes under the name of the de Moivre-Laplace theorem.)
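To see the de Moivre-Laplace convergence numerically, here is a small sketch that compares the exact Binomial(n, p) CDF against the normal approximation $\Phi((k - np)/\sqrt{np(1-p)})$. The particular values of n, p, and k below are illustrative choices, not from the text.

```python
import math

# De Moivre-Laplace check: exact P(S_n <= k) for S_n ~ Binomial(n, p)
# versus the normal approximation Phi((k - n*p) / sqrt(n*p*(1-p))).

def binom_cdf(k, n, p):
    """Exact P(Binomial(n, p) <= k), summing the pmf term by term."""
    return sum(math.comb(n, j) * p**j * (1 - p)**(n - j) for j in range(k + 1))

def normal_cdf(x):
    """Standard normal CDF, written via the error function."""
    return 0.5 * (1 + math.erf(x / math.sqrt(2)))

n, p = 1000, 0.3                      # illustrative parameters
mean, sd = n * p, math.sqrt(n * p * (1 - p))
k = 310
exact = binom_cdf(k, n, p)
approx = normal_cdf((k - mean) / sd)  # (using k + 0.5 here, a continuity correction, does even better)
print(exact, approx)
```

For n = 1000 the two numbers already agree to about two decimal places, and the gap shrinks as n grows.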
Now, a Binomial(n,p) random variable is the sum of n Bernoulli(p) random variables. The usual form of the CLT states that if $S_n = X_1 + ... + X_n$, where the $X_i$ are independent and identically distributed with mean $\mu$ and standard deviation $\sigma$, then $(S_n - n\mu)/(\sigma \sqrt{n})$ converges in distribution to the standard normal as $n \to \infty$.
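This statement is easy to check by simulation. The sketch below standardizes $S_n$ for i.i.d. Uniform(0,1) summands (so $\mu = 1/2$, $\sigma = \sqrt{1/12}$) and compares the empirical distribution of $(S_n - n\mu)/(\sigma\sqrt{n})$ to the standard normal CDF at a few points; the sample sizes and seed are arbitrary.

```python
import math
import random

# CLT by simulation: draw many copies of S_n = X_1 + ... + X_n with
# X_i ~ Uniform(0, 1) i.i.d., standardize, and compare the empirical
# CDF of the standardized sums to the standard normal CDF.

random.seed(0)
n, trials = 200, 20000            # illustrative sizes
mu, sigma = 0.5, math.sqrt(1 / 12)

z = []
for _ in range(trials):
    s = sum(random.random() for _ in range(n))
    z.append((s - n * mu) / (sigma * math.sqrt(n)))

def normal_cdf(x):
    return 0.5 * (1 + math.erf(x / math.sqrt(2)))

for x in (-1.0, 0.0, 1.0):
    empirical = sum(zi <= x for zi in z) / trials
    print(x, empirical, normal_cdf(x))
```

The empirical and normal CDF values should agree to within the Monte Carlo error, roughly $1/\sqrt{\text{trials}}$.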
If the $X_i$ are in fact dependent, see the link provided by Ori Gurel-Gurevich above.
If the $X_i$ are independent but not identically distributed, then there are two standard conditions for proving that the rescaled distribution of $S_n = X_1 + ... + X_n$ converges to the standard normal: Lindeberg's condition and Lyapunov's condition. Both are a bit difficult to parse when you first look at them. But the basic idea behind both of them is that as long as no single summand $X_i$ contributes a disproportionately large share of the total variance, the normal distribution still appears.
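That idea can be made concrete with Lyapunov's condition in its $\delta = 1$ form: writing $s_n^2 = \sum_i \mathrm{Var}(X_i)$, the condition asks that $L_n = s_n^{-3} \sum_i E|X_i - \mu_i|^3 \to 0$. The sketch below evaluates $L_n$ for independent $X_i \sim \mathrm{Uniform}(-a_i, a_i)$, for which $\mathrm{Var}(X_i) = a_i^2/3$ and $E|X_i|^3 = a_i^3/4$; the two scale sequences are illustrative choices, not from the text.

```python
import math

# Lyapunov's condition (delta = 1) for independent, non-identical
# X_i ~ Uniform(-a_i, a_i):  L_n = (sum_i E|X_i|^3) / s_n^3,
# where s_n^2 = sum_i Var(X_i) = sum_i a_i^2 / 3 and E|X_i|^3 = a_i^3 / 4.

def lyapunov_ratio(scales):
    s2 = sum(a * a / 3 for a in scales)       # s_n^2
    third = sum(a ** 3 / 4 for a in scales)   # sum of third absolute moments
    return third / s2 ** 1.5

# a_i = sqrt(i): the scales grow, but no single term dominates the variance.
polynomial = [math.sqrt(i) for i in range(1, 2001)]

# a_i = 2^i: the last summand dwarfs all the others combined.
geometric = [2.0 ** i for i in range(1, 60)]

print(lyapunov_ratio(polynomial))   # decays like 1/sqrt(n): condition holds, CLT applies
print(lyapunov_ratio(geometric))    # stays bounded away from 0: condition fails
```

In the first sequence $L_n \to 0$ and the standardized sum is asymptotically normal; in the second, the final summand carries almost all the variance, $L_n$ does not vanish, and the standardized sum stays close to the distribution of that one dominant term.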