The purpose of this answer is to use the second moment method to make rigorous the heuristic argument of Michael Lugo. (Here is why his argument is only heuristic: if $N$ is a nonnegative integer random variable, such as the number of length-$r$ increasing consecutive sequences in a random permutation of $\{1,2,\ldots,n\}$, knowing that $E[N] \gg 1$ does not imply that $N$ is positive with high probability, because the large expectation could be caused by a rare event on which $N$ is very large. For instance, if $N = n$ with probability $1/\sqrt{n}$ and $N = 0$ otherwise, then $E[N] = \sqrt{n} \to \infty$ even though $N = 0$ with high probability.)
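A minimal Python sketch of the example just given (the distribution is invented purely for illustration, not taken from the problem):

    import random

    # Illustration only: N = n with probability 1/sqrt(n), and N = 0 otherwise.
    # Then E[N] = sqrt(n) >> 1, yet Prob(N > 0) = 1/sqrt(n) is small.
    n, trials = 10**6, 10**5
    samples = [n if random.random() < n**-0.5 else 0 for _ in range(trials)]
    print("empirical E[N]:", sum(samples) / trials)                        # about sqrt(n) = 1000
    print("empirical Prob(N > 0):", sum(s > 0 for s in samples) / trials)  # about 1/sqrt(n) = 0.001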
Theorem: The expected length of the longest increasing block in a random permutation of $\{1,2,\ldots,n\}$ is $r_0(n) + O(1)$ as $n \to \infty$, where $r_0(n)$ is the smallest positive integer such that $r_0(n)! > n$. ("Block" means a consecutive subsequence $a_{i+1},a_{i+2},\ldots,a_{i+r}$ for some $i$ and $r$, with no conditions on the relative sizes of the $a_i$.)
Note: As Michael pointed out, $r_0(n)$ is of order $(\log n)/(\log \log n)$.
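For concreteness, $r_0(n)$ is straightforward to compute directly; a short Python sketch (the helper name `r0` is mine) tabulating it next to $(\log n)/(\log \log n)$, where the asymptotic is slow to kick in:

    import math

    def r0(n: int) -> int:
        """Smallest positive integer r with r! > n."""
        r, fact = 1, 1
        while fact <= n:
            r += 1
            fact *= r
        return r

    for n in [10, 10**3, 10**6, 10**12]:
        print(n, r0(n), round(math.log(n) / math.log(math.log(n)), 2))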
Proof of theorem: Let $P_r$ be the probability that there exists an increasing block of length at least $r$. If $L$ denotes the length of the longest increasing block, then $P_r = \operatorname{Prob}(L \ge r)$, so the tail-sum formula gives $E[L] = \sum_{r \ge 0} r(P_r-P_{r+1}) = \sum_{r \ge 1} P_r$. We will bound the latter sum from above and below.
Upper bound: The probability $P_r$ is at most the expected number of increasing blocks of length $r$, which is exactly $(n-r+1)/r!$, since for each of the $n-r+1$ values of $i$ in $\{0,\ldots,n-r\}$ the probability that $a_{i+1},\ldots,a_{i+r}$ are in increasing order is $1/r!$. Thus $P_r \le n/r!$. Since the terms $n/r!$ decrease by a factor of at least $2$ as $r$ increases, comparison with a geometric series gives $\sum_{r > r_0(n)} P_r \le \sum_{r > r_0(n)} n/r! \le n/r_0(n)! \le 1$. On the other hand, $\sum_{1 \le r \le r_0(n)} P_r \le \sum_{1 \le r \le r_0(n)} 1 = r_0(n)$, so $\sum_{r \ge 1} P_r \le r_0(n) + 1$.
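As a sanity check on this first-moment count, a quick Monte Carlo sketch (parameters chosen arbitrarily) comparing the empirical number of increasing blocks of length $r$ with $(n-r+1)/r!$:

    import math, random

    n, r, trials = 50, 4, 20000
    total = 0
    for _ in range(trials):
        a = list(range(n))
        random.shuffle(a)
        # count i with a[i] < a[i+1] < ... < a[i+r-1]  (0-indexed)
        total += sum(
            all(a[i+k] < a[i+k+1] for k in range(r - 1))
            for i in range(n - r + 1)
        )
    print("empirical expected count:", total / trials)
    print("exact (n-r+1)/r!:", (n - r + 1) / math.factorial(r))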
Lower bound: Here we need the second moment method. For $i \in \{1,\ldots,n-r\}$, let $Z_i$ be $1$ if $a_{i+1}<a_{i+2}<\cdots<a_{i+r}$ and $a_i>a_{i+1}$, and $0$ otherwise. (The added condition $a_i>a_{i+1}$ is a trick to reduce the positive correlation between nearby $Z_i$.) The probability that $a_{i+1}<a_{i+2}<\cdots<a_{i+r}$ is $1/r!$, and the probability that this holds while $a_i>a_{i+1}$ fails is $1/(r+1)!$, so $E[Z_i]=1/r! - 1/(r+1)!$. Let $Z=\sum_{i=1}^{n-r} Z_i$, so
$$E[Z]=(n-r)\left(\frac{1}{r!} - \frac{1}{(r+1)!}\right).$$
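This formula is easy to test by simulation; a sketch with arbitrary parameters (indices translated to 0-indexed Python):

    import math, random

    n, r, trials = 50, 4, 20000
    z_total = 0
    for _ in range(trials):
        a = list(range(n))
        random.shuffle(a)
        for i in range(1, n - r + 1):  # i = 1, ..., n-r as in the text
            # Z_i = 1 iff a_i > a_{i+1} and a_{i+1} < a_{i+2} < ... < a_{i+r}
            if a[i-1] > a[i] and all(a[i+k-1] < a[i+k] for k in range(1, r)):
                z_total += 1
    print("empirical E[Z]:", z_total / trials)
    print("exact (n-r)(1/r! - 1/(r+1)!):",
          (n - r) * (1 / math.factorial(r) - 1 / math.factorial(r + 1)))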
Next we compute the second moment $E[Z^2]$ by expanding $Z^2$. If $i=j$, then $E[Z_i Z_j] = E[Z_i] = 1/r! - 1/(r+1)!$; summing this over $i$ gives at most $n/r!$. If $0<|i-j|<r$, then $E[Z_i Z_j]=0$, since the inequalities defining $Z_i$ and $Z_j$ are incompatible. If $|i-j|=r$, then $E[Z_i Z_j] \le (1/r!)^2$ (the latter is the probability if we drop the added condition in the definition of $Z_i$ and $Z_j$); there are at most $2n$ such pairs. If $|i-j|>r$, then $Z_i$ and $Z_j$ are independent, so $E[Z_i Z_j] = (1/r! - 1/(r+1)!)^2$, and summing over these pairs gives at most $E[Z]^2$. Altogether,
$$E[Z^2] \le \frac{n}{r!} + \frac{2n}{(r!)^2} + E[Z]^2 \le \left(1 + O(r!/n)\right) E[Z]^2.$$
The second moment method (Cauchy–Schwarz applied to $E[Z] = E[Z \cdot \mathbf{1}_{Z \ge 1}]$) gives the second inequality in
$$P_r \ge \operatorname{Prob}(Z \ge 1) \ge \frac{E[Z]^2}{E[Z^2]} \ge 1 - O(r!/n).$$
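A simulation comparing the two sides of the second moment inequality (a sketch with arbitrary small parameters, at which the bound is visibly below the true probability):

    import random

    n, r, trials = 50, 3, 20000
    zs = []
    for _ in range(trials):
        a = list(range(n))
        random.shuffle(a)
        z = sum(
            1
            for i in range(1, n - r + 1)
            if a[i-1] > a[i] and all(a[i+k-1] < a[i+k] for k in range(1, r))
        )
        zs.append(z)
    ez = sum(zs) / trials
    ez2 = sum(z * z for z in zs) / trials
    print("empirical Prob(Z >= 1):", sum(z >= 1 for z in zs) / trials)
    print("lower bound E[Z]^2 / E[Z^2]:", ez * ez / ez2)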
If $r \le r_0(n)-2$, then $r! \le (r_0(n)-1)!/(r_0(n)-1)$, while $(r_0(n)-1)! \le n$ by the minimality of $r_0(n)$, so $r!/n \le 1/(r_0(n)-1)$ and hence $P_r \ge 1 - O(1/r_0(n))$. Thus
$$\sum_{r \ge 1} P_r \ge \sum_{r=1}^{r_0(n)-2} \left( 1 - O(1/r_0(n)) \right) = r_0(n) - O(1).$$
Combining the upper and lower bounds completes the proof.
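Finally, a small experiment comparing the empirical mean of the longest increasing block with $r_0(n)$ (a sketch; the $O(1)$ discrepancy is visible at these sizes):

    import random

    def longest_increasing_block(a):
        """Length of the longest run a[i] < a[i+1] < ... in the list a."""
        best = cur = 1
        for x, y in zip(a, a[1:]):
            cur = cur + 1 if x < y else 1
            best = max(best, cur)
        return best

    def r0(n):
        """Smallest positive integer r with r! > n."""
        r, fact = 1, 1
        while fact <= n:
            r += 1
            fact *= r
        return r

    for n in [100, 1000, 10000]:
        trials = 2000
        total = 0
        for _ in range(trials):
            a = list(range(n))
            random.shuffle(a)
            total += longest_increasing_block(a)
        print(n, "empirical mean:", total / trials, "r0(n):", r0(n))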