Tom is right: the proof of Chebyshev's inequality is easily adapted to any nondecreasing nonnegative function. The proof of this generalization that I prefer rests on a principle worth remembering:
First find an inequality between random variables, then integrate it.
To apply the principle, let $g$ denote a nondecreasing nonnegative function defined on $[0,+\infty)$ and $Z$ a nonnegative random variable. Let $z\ge0$ be such that $g(z)>0$. Then,
$$
g(z)\mathbf{1}_A\le g(Z) \quad\text{with}\quad A=[Z\ge z].
$$
Proof: if $\omega\notin A$, the assertion reduces to $0\le g(Z(\omega))$, which holds because $g$ is nonnegative everywhere; if $\omega\in A$, it reduces to $g(z)\le g(Z(\omega))$, which holds because $Z(\omega)\ge z$ and $g$ is nondecreasing.
Integrating the inequality yields
$$
g(z)P(A)\le E(g(Z)),
$$
and dividing both sides by $g(z)$ yields $P(Z\ge z)\le E(g(Z))/g(z)$, as desired.
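As a quick sanity check, the integrated inequality can be verified by Monte Carlo. The sketch below takes $Z$ standard exponential and $g(z)=z^2$ (both arbitrary illustrative choices, not dictated by the argument) and compares $g(z)P(Z\ge z)$ with $E(g(Z))$:

```python
import random

random.seed(0)

# Sample a nonnegative random variable Z (standard exponential: an arbitrary choice).
n = 100_000
samples = [random.expovariate(1.0) for _ in range(n)]

def g(z):
    # Any nondecreasing nonnegative function on [0, +inf); z^2 chosen for illustration.
    return z * z

z = 2.0
lhs = g(z) * sum(1 for s in samples if s >= z) / n  # g(z) * P(Z >= z), estimated
rhs = sum(g(s) for s in samples) / n                # E[g(Z)], estimated
print(lhs <= rhs)
```

For this choice, $E(Z^2)=2$ while $g(z)P(Z\ge z)=4\mathrm{e}^{-2}\approx0.54$, so the bound holds with room to spare.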
The usual case is when $Z=|X-E(X)|$ and $g(z)=z^2$. The case mentioned by Thomas is when $Z=|X-E(X)|$ and $g(z)=z^p$, for every positive $p$ (and not only for $p\ge2$). Another case, mentioned by Tom and at the basis of the whole field of large deviations principles, is when $Z=\mathrm{e}^{rX}$ for a nonnegative $r$ and $g(z)=z$. But the $u\log(u)$ case mentioned by Tom requires more care, because the function $u\mapsto u\log(u)$ is not monotone on $[0,1]$ and does not have constant sign on $[0,+\infty)$ (but everything works fine for $g(z)=z\log(z)$ on $[1,+\infty)$, that is, if $Z\ge1$ almost surely).
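The exponential case above is the Chernoff-type bound $P(X\ge x)\le \mathrm{e}^{-rx}E(\mathrm{e}^{rX})$, obtained by applying the inequality to $Z=\mathrm{e}^{rX}$ at the level $\mathrm{e}^{rx}$. A minimal numerical sketch, assuming $X$ standard normal (an arbitrary choice, for which $E(\mathrm{e}^{rX})=\mathrm{e}^{r^2/2}$ in closed form):

```python
import math
import random

random.seed(1)

# X standard normal (arbitrary choice); then E[e^{rX}] = e^{r^2/2} exactly.
n = 200_000
xs = [random.gauss(0.0, 1.0) for _ in range(n)]

x = 1.5
r = 1.5  # for the normal case, the bound e^{-rx} e^{r^2/2} is optimized at r = x
tail = sum(1 for v in xs if v >= x) / n             # P(X >= x), Monte Carlo estimate
chernoff = math.exp(-r * x) * math.exp(r * r / 2)   # e^{-rx} * E[e^{rX}]
print(tail <= chernoff)
```

Here the true tail is $P(X\ge1.5)\approx0.067$ while the bound evaluates to $\mathrm{e}^{-1.125}\approx0.32$: loose, but exponentially decaying in $x$, which is the point of large deviations bounds.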