For simplicity, I work with $2n \times 2n$ matrices; the odd-by-odd case is similar.
Summary: If $B$ is a skew-symmetric matrix with eigenvalues $\pm i \theta_1$, $\pm i \theta_2$, ..., $\pm i \theta_n$ then the Jacobian matrix of the exponential near $B$ has eigenvalues
$$\frac{1-e^{i(\mp \theta_j \mp \theta_k)}}{i(\pm \theta_j \pm \theta_k)}, \quad 1 \leq j < k \leq n$$
as well as having the eigenvalue $1$ with multiplicity $n$. (This is $4 \binom{n}{2}+n=\binom{2n}{2}$ eigenvalues in total.)
The determinant of the Jacobian matrix is thus
$$\prod_{1 \leq j < k \leq n} \frac{1-e^{i(\mp \theta_j \mp \theta_k)}}{i(\pm \theta_j \pm \theta_k)} = \prod_{1 \leq j < k \leq n} \frac{16 \sin^2 \frac{\theta_j-\theta_k}{2} \sin^2 \frac{\theta_j+\theta_k}{2}}{(\theta_j^2-\theta_k^2)^2}.$$
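These formulas are easy to sanity-check numerically. Here is a quick sketch of my own (not part of the original argument), using NumPy/SciPy for $n = 2$ with arbitrary test angles $\theta_1 = 0.7$, $\theta_2 = 0.3$: build the Jacobian of $\exp$ by central differences in the basis $E_{jk} = e_j e_k^T - e_k e_j^T$, then compare its eigenvalues and determinant with the closed forms above.

```python
import numpy as np
from scipy.linalg import expm

# Numerical sanity check for n = 2 (4x4 matrices). The angles, step size,
# and basis choice are my own arbitrary test setup, not from the post.
t1, t2 = 0.7, 0.3
B = np.zeros((4, 4))
B[0, 1], B[1, 0] = -t1, t1      # block with eigenvalues +/- i t1
B[2, 3], B[3, 2] = -t2, t2      # block with eigenvalues +/- i t2

# Basis E_{jk} = e_j e_k^T - e_k e_j^T (j < k) of the skew-symmetric matrices
pairs = [(j, k) for j in range(4) for k in range(j + 1, 4)]

h = 1e-5
expm_minus_B = expm(-B)
J = np.zeros((6, 6))
for col, (j, k) in enumerate(pairs):
    E = np.zeros((4, 4))
    E[j, k], E[k, j] = 1.0, -1.0
    # column = coordinates of e^{-B} d/dt e^{B+tE}, by central difference
    D = expm_minus_B @ (expm(B + h * E) - expm(B - h * E)) / (2 * h)
    J[:, col] = [D[a, b] for (a, b) in pairs]

# Predicted eigenvalues: 1 (twice) and (1 - e^{-ip})/(ip) for p = +/-t1 +/- t2
expected = np.sort_complex(np.array(
    [1.0, 1.0] + [(1 - np.exp(-1j * p)) / (1j * p)
                  for p in (t1 - t2, t2 - t1, t1 + t2, -t1 - t2)]))
eigs = np.sort_complex(np.linalg.eigvals(J).astype(complex))
print(np.allclose(eigs, expected, atol=1e-6))

# Predicted determinant from the closed-form product
closed_form = (16 * np.sin((t1 - t2) / 2) ** 2 * np.sin((t1 + t2) / 2) ** 2
               / (t1 ** 2 - t2 ** 2) ** 2)
print(abs(np.linalg.det(J) - closed_form))   # tiny (finite-difference error)
```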
Take whatever sort of bound you have on $B$ and turn it into a lower bound on the above quantity. Notice that, if $\theta_j + \theta_k$ gets as large as $2 \pi$, the above quantity is zero, so there is no nontrivial lower bound in that case.
A practical note on minimizing the above quantity: the log of the above is the sum of many terms of the form $f(\phi) := \log \sin(\phi) - \log \phi$, where $\phi$ is a linear function. By the second-derivative test, $f$ is concave. So the sum of many terms of the form $f(\mbox{linear function})$ will be concave. This means that, on any convex region, the minimum will occur somewhere on the boundary.
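To see the concavity concretely: $f''(\phi) = 1/\phi^2 - 1/\sin^2 \phi$, which is negative on $(0, \pi)$ since $0 < \sin \phi < \phi$ there. A one-line numerical check of my own:

```python
import numpy as np

# f(phi) = log sin(phi) - log(phi) has f''(phi) = 1/phi^2 - 1/sin(phi)^2.
# Since 0 < sin(phi) < phi on (0, pi), f'' < 0 there, so f is concave.
phi = np.linspace(0.01, np.pi - 0.01, 1000)
f_second = 1.0 / phi ** 2 - 1.0 / np.sin(phi) ** 2
print(f_second.max() < 0)   # True: f'' is negative on the whole grid
```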
Notation: We write $\mathfrak{so}$ for the vector space of skew-symmetric matrices. We fix $B$ and $\theta_j$ as above.
Explanation: Let me first point out why the question makes sense. The orthogonal matrices are a manifold, not a vector space, so one might be tempted to wonder whether it even makes sense to speak of a Jacobian, let alone of the eigenvalues of the Jacobian matrix.
There are two ways to fix this, a naive way and a sophisticated way, and they both give the same answer. The naive way is to point out that the orthogonal matrices are contained in the $2n \times 2n$ matrices. So we certainly have a $(2n)^2 \times \binom{2n}{2}$ matrix, giving the Jacobian matrix of the exponential map as a map from skew-symmetric matrices to all matrices. This matrix is not square; its image is the tangent plane at $e^B$ to the space of orthogonal matrices. Explicitly, that tangent plane is $e^B \mathfrak{so}$. We can rotate that tangent plane by the orthogonal matrix $e^{-B}$, giving us a map from $\mathfrak{so}$ to itself; it is now sensible to discuss the eigenvalues of that map.
The sophisticated way is to say that the Jacobian matrix is a map from $\mathfrak{so}$ to the tangent space of $SO$ at $e^B$. But that tangent space is canonically identified with the tangent space of $SO$ at the identity, and the latter tangent space is $\mathfrak{so}$.
Either way, we are being asked to consider the following map from $\mathfrak{so}$ to itself:
$$E \mapsto e^{-B} \lim_{t \to 0} (e^{B+tE} - e^B)/t. \quad (*)$$
Let $A$ be the map $X \mapsto [B,X]$ from $\mathfrak{so}$ to itself. By the Baker-Campbell-Hausdorff formula, $(*)$ is
$$\frac{1-e^{-A}}{A} E$$
where $(1-e^{-A})/A$ must be understood as the power series $1-A/2+A^2/6-\cdots$. If written out as matrices, $A$ would be a $\binom{2n}{2} \times \binom{2n}{2}$ matrix and $E$ would be a vector of length $\binom{2n}{2}$.
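This identity can be checked numerically; the following is my own sketch (arbitrary test angles, random direction $E$, and truncation order), comparing a central-difference value of $(*)$ with the truncated series $\sum_{k \geq 0} (-1)^k A^k/(k+1)!$ applied to $E$.

```python
import numpy as np
from scipy.linalg import expm

# Compare (*) computed by central differences with the power series
# (1 - e^{-A})/A = sum_{k>=0} (-1)^k A^k / (k+1)!  applied to E, A = ad_B.
# Angles, random seed, and truncation order are arbitrary test choices.
t1, t2 = 0.7, 0.3
B = np.zeros((4, 4))
B[0, 1], B[1, 0] = -t1, t1
B[2, 3], B[3, 2] = -t2, t2

rng = np.random.default_rng(0)
M = rng.standard_normal((4, 4))
E = M - M.T                          # a random skew-symmetric direction

h = 1e-5
lhs = expm(-B) @ (expm(B + h * E) - expm(B - h * E)) / (2 * h)

rhs = np.zeros((4, 4))
P = E.copy()                         # P = A^k E = ad_B^k(E)
factorial = 1.0
for k in range(25):
    factorial *= (k + 1)             # (k+1)!
    rhs += ((-1) ** k / factorial) * P
    P = B @ P - P @ B                # apply A = ad_B once more
print(np.max(np.abs(lhs - rhs)))     # tiny (finite-difference error)
```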
Now, if the eigenvalues of $B$ are $\pm i \theta_j$, as above, then the eigenvalues of $A$ are $0$ with multiplicity $n$ and $i(\pm \theta_j \pm \theta_k)$. (Because the root system of type $D_n$ is $\{ \pm e_j \pm e_k \}$, or because it is an easy computation.) If $\alpha_1$, $\alpha_2$, ..., $\alpha_N$ are the eigenvalues of $A$, then the eigenvalues of $(1-e^{-A})/A$ are $(1-e^{-\alpha_j})/\alpha_j$, where $(1-e^{-0})/0$ is interpreted as $1$.
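The eigenvalue claim for $A$ is also easy to verify directly (again my own sketch, with arbitrary test angles): write $A = \mathrm{ad}_B$ as a $6 \times 6$ matrix in the basis $E_{jk}$ and compare its spectrum with $\{0, 0, \pm i(\theta_1 + \theta_2), \pm i(\theta_1 - \theta_2)\}$ for $n = 2$.

```python
import numpy as np

# Eigenvalues of A = ad_B on so(4): expect 0 (multiplicity 2) and
# +/- i(t1 + t2), +/- i(t1 - t2). Angles are arbitrary test values.
t1, t2 = 0.7, 0.3
B = np.zeros((4, 4))
B[0, 1], B[1, 0] = -t1, t1
B[2, 3], B[3, 2] = -t2, t2

pairs = [(j, k) for j in range(4) for k in range(j + 1, 4)]
A = np.zeros((6, 6))
for col, (j, k) in enumerate(pairs):
    E = np.zeros((4, 4))
    E[j, k], E[k, j] = 1.0, -1.0
    C = B @ E - E @ B                # [B, E_{jk}]
    A[:, col] = [C[a, b] for (a, b) in pairs]

eig = np.linalg.eigvals(A)
expected_imag = np.sort([0.0, 0.0, t1 + t2, -(t1 + t2), t1 - t2, -(t1 - t2)])
print(np.allclose(np.sort(eig.imag), expected_imag), np.allclose(eig.real, 0))
```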
Putting this all together, we get the above formulas.