Wednesday, 1 June 2016

co.combinatorics - Linear Algebra Proofs in Combinatorics?

Here are some examples where the dimension of a vector space of polynomials is used to solve a combinatorial problem.



Theorem 1 There are at most $n(n+1)/2$ equiangular lines in $\mathbb{R}^n$.



Proof. [Koornwinder] Let $L_1, \dots, L_m$ be $m$ lines through the origin in $\mathbb{R}^n$ with angle $\arccos(\alpha)$ between every pair of them. Pick a unit vector $u_i$ on each line, so that $\langle u_i, u_j \rangle^2 = \alpha^2$ for $i \neq j$. Define polynomials $P_1, \dots, P_m$ by $P_i(x) = \langle u_i, x \rangle^2 - \alpha^2 \langle x, x \rangle$. Then $P_i(u_j) = (1-\alpha^2)\delta_{i, j}$, and since $\alpha^2 < 1$ (the lines are distinct), evaluating any vanishing linear combination $\sum_i c_i P_i$ at $u_j$ forces $c_j = 0$; therefore these $m$ polynomials are linearly independent. They lie in the space of homogeneous polynomials of degree $2$ in $n$ variables, which has dimension ${n + 1 \choose 2}$, and therefore $m \leq n(n+1)/2$.
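As a quick sanity check of the key identity $P_i(u_j) = (1-\alpha^2)\delta_{i,j}$, here is a small numerical sketch (not part of the proof) using a configuration where the bound is tight for the angle $\arccos(1/\sqrt 5)$: the six equiangular lines in $\mathbb{R}^3$ through opposite vertices of a regular icosahedron.

```python
import numpy as np

phi = (1 + 5**0.5) / 2
# Six equiangular lines in R^3: icosahedron vertex directions (one per antipodal pair)
U = np.array([
    [0, 1, phi], [0, 1, -phi],
    [1, phi, 0], [1, -phi, 0],
    [phi, 0, 1], [-phi, 0, 1],
], dtype=float)
U /= np.linalg.norm(U, axis=1, keepdims=True)   # unit vectors u_1, ..., u_6

G = U @ U.T          # Gram matrix: G[i, j] = <u_i, u_j>
alpha2 = 1 / 5       # common squared cosine: <u_i, u_j>^2 = 1/5 for i != j
M = G**2 - alpha2    # M[i, j] = P_i(u_j) = <u_i, u_j>^2 - alpha^2 <u_j, u_j>

# M is (1 - alpha^2) times the identity, hence the P_i are linearly independent
assert np.allclose(M, (1 - alpha2) * np.eye(6))
print(np.linalg.matrix_rank(M))  # 6
```

Here $m = 6 = 3 \cdot 4 / 2$, so the evaluation matrix has full rank exactly at the dimension bound.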



Theorem 2 [Larman, Rogers, Seidel] A two-distance set in $\mathbb{R}^n$ has cardinality at most $(n+4)(n+1)/2$.



Proof. Let $u_1, \dots, u_m$ be $m$ points in $\mathbb{R}^n$ and let $a$, $b$ be the two nonzero real numbers such that $|u_i - u_j| \in \{a, b\}$ for all $i \neq j$. Define $P_i(x) = (|u_i - x|^2 - a^2)(|u_i - x|^2 - b^2)$. Then $P_i(u_j) = a^2 b^2 \delta_{i, j}$, and therefore the polynomials $P_i$ are linearly independent. Moreover, these polynomials lie in the vector space spanned by polynomials of the type $$\left(\sum_{i = 1}^n x_i^2\right)^2, \quad \left(\sum_{i = 1}^n x_i^2\right)x_j, \quad x_i x_j, \quad x_i, \quad 1.$$
The number of such polynomials is $1 + n + n(n+1)/2 + n + 1 = (n+4)(n+1)/2$, which gives us the bound.
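The identity $P_i(u_j) = a^2 b^2 \delta_{i,j}$ can also be checked numerically. The sketch below (an illustration, not part of the proof) uses a standard two-distance set: the $0/1$ vectors in $\mathbb{R}^n$ with exactly two ones, whose pairwise squared distances are $2$ and $4$.

```python
import numpy as np
from itertools import combinations

n = 4
# Two-distance set in R^n: 0/1 vectors with exactly two coordinates equal to 1
pts = np.array([[1.0 if k in pair else 0.0 for k in range(n)]
                for pair in combinations(range(n), 2)])
m = len(pts)                                  # n(n-1)/2 = 6 points

# Matrix of squared distances |u_i - u_j|^2; off-diagonal values are 2 or 4
d2 = ((pts[:, None] - pts[None, :])**2).sum(-1)
a2, b2 = 2.0, 4.0
off_diag = d2[~np.eye(m, dtype=bool)]
assert set(np.round(off_diag).astype(int)) == {2, 4}

# P_i(u_j) = (|u_i - u_j|^2 - a^2)(|u_i - u_j|^2 - b^2) = a^2 b^2 delta_ij
M = (d2 - a2) * (d2 - b2)
assert np.allclose(M, a2 * b2 * np.eye(m))
print(m, (n + 4) * (n + 1) // 2)              # 6 points, bound 20
```

The evaluation matrix is $a^2 b^2$ times the identity, which is exactly why the $P_i$ are linearly independent; here $m = 6$ sits well below the bound of $20$.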



For more such examples see "Linear Algebra Methods in Combinatorics" by Babai and Frankl, linked in Stanley's answer.
