Wednesday, February 8, 2012

Thoughts $O(n)$ Matrices

My sister recently came home from visiting her boyfriend's family in India, and she brought back some of the best tea Hyderabad had to offer.  Needless to say, I approve of their courtship.
(Image: the tea in question.)  Best Assam ever.
While I work on more fancy math shenanigans, let's look at my favorite groups of matrices, the orthogonal and unitary groups.

The orthogonal group of degree $n$, or $O(n)$, is a group of $n \times n$ matrices with entries in $\mathbb{R}$.  You can define it either as the set of matrices $A$ such that $AA^T = A^TA = I$, or as the set of matrices whose rows (or columns) form an orthonormal basis of $\mathbb{R}^n$ (orthonormal means all the vectors have length $1$ and are perpendicular to each other).  Naturally, the first step comes in proving that these definitions are equivalent.
If the rows of $A$ are an orthonormal basis for $\mathbb{R}^n$, consider $AA^T = C$.  Denote the 'vector' in the $k$th row of $A$ as $\vec{a_k}$ and let $a_{ij}, a^T_{ij},$ and $c_{ij}$ denote the entries in the $i$th row and $j$th column of $A, A^T,$ and $C$ respectively.  Note that
$c_{ij} = \displaystyle \sum_{k=1}^n a_{ik}a^T_{kj} = \displaystyle \sum_{k=1}^n a_{ik}a_{jk} = \vec{a_i} \cdot \vec{a_j}$
That is, $c_{ij}$ is just the dot product of $\vec{a_i}$ and $\vec{a_j}$: 
$c_{ij} = \left\{ \begin{array}{ll} 1 & \mbox{when } i=j \mbox{ because } |\vec{a_i}| = 1 \\ 0 & \mbox{when } i \neq j \mbox{ because } \vec{a_i} \mbox{ and } \vec{a_j} \mbox{ are perpendicular} \end{array} \right. $ 
Therefore $C = I$ and $A^T = A^{-1}$.  Since the statements we used about the dot product are equivalent to the rows of $A$ being an orthonormal basis, you can prove that the rows (or columns) of $A$ are orthonormal from the assumption that $A^T = A^{-1}$, just by running the argument backwards through $AA^T = I$ (or $A^TA = I$ for the columns).
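As a quick numerical sanity check (not in the original argument, just an illustration), here is a $2 \times 2$ rotation matrix, whose rows are an orthonormal basis of $\mathbb{R}^2$, and a verification that $AA^T = I$:

```python
import numpy as np

# A 2x2 rotation matrix: its rows are an orthonormal basis of R^2.
theta = 0.7  # an arbitrary angle
A = np.array([[np.cos(theta), -np.sin(theta)],
              [np.sin(theta),  np.cos(theta)]])

# The rows have length 1 and are perpendicular...
print(np.dot(A[0], A[0]))  # ~1
print(np.dot(A[0], A[1]))  # ~0

# ...which, by the computation above, is exactly the statement A A^T = I.
print(np.allclose(A @ A.T, np.eye(2)))
```

Each entry of $AA^T$ is a dot product of two rows of $A$, so the orthonormality of the rows and the equation $AA^T = I$ really are the same check.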
Now the orthogonal group clearly contains the identity matrix, and it contains inverses for all its elements: the inverse of an orthogonal $A$ is $A^T$, which is itself orthogonal because its rows are the (orthonormal) columns of $A$.  The proof of closure (the product of two orthogonal matrices is also orthogonal) follows pretty naturally from the fact that converting two vectors from one orthonormal basis to another does not change the value of the dot product on those vectors.

Working through the details of that argument is left as an exercise for the reader.  I'm serious, do it if you haven't before.  It's good for your brain.
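If you want to check your answer numerically, here is a small sketch (using hypothetical example values, not part of the exercise's proof) of the two facts involved: orthogonal matrices preserve dot products, and their products are again orthogonal.

```python
import numpy as np

def rotation(theta):
    """A 2x2 rotation matrix, an element of O(2)."""
    return np.array([[np.cos(theta), -np.sin(theta)],
                     [np.sin(theta),  np.cos(theta)]])

A = rotation(0.3)
B = rotation(1.1)

# Orthogonal matrices preserve the dot product (hence lengths and angles):
v, w = np.array([1.0, 2.0]), np.array([-3.0, 0.5])
print(np.isclose((A @ v) @ (A @ w), v @ w))

# Closure: the product of two orthogonal matrices is orthogonal.
C = A @ B
print(np.allclose(C @ C.T, np.eye(2)))
```

The numerical check is no substitute for the proof, but it is a good way to catch a wrong argument before you convince yourself of it.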

You can also talk about the set of matrices over $\mathbb{C}$ that satisfy $AA^T = I$.  Of course, the dot product of complex vectors involves taking the complex conjugate of the second vector...

$\vec{v} \cdot \vec{w} = \displaystyle \sum_{i=1}^n v_i\overline{w_i}$

...so the more 'natural' group of matrices to study over $\mathbb{C}$ is the one defined by the relation $A\overline{A^T} = I$.
This is called the unitary group of degree $n$, and is denoted $U(n)$.  Since all our proofs so far have only used properties of the dot product, they're as applicable to $U(n)$ as they were to $O(n)$.
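A small illustration (my example, not the post's): a diagonal matrix of unit-modulus complex numbers is one of the simplest elements of $U(2)$, and the conjugate transpose plays the role that $A^T$ did over the reals.

```python
import numpy as np

# A simple element of U(2): diagonal entries of modulus 1.
U = np.diag([np.exp(1j * 0.4), np.exp(-1j * 1.3)])

# The defining relation: U times its conjugate transpose is the identity.
print(np.allclose(U @ U.conj().T, np.eye(2)))

# Unitary matrices preserve the complex dot product
# (numpy's vdot conjugates its first argument).
v = np.array([1 + 2j, -1j])
w = np.array([0.5 + 0j, 2 - 1j])
print(np.isclose(np.vdot(v, w), np.vdot(U @ v, U @ w)))
```

Since the proofs above only used properties shared by the real and complex dot products, the same orthonormal-rows characterization holds here with conjugation inserted.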

Every element of these groups corresponds in some way to a rotation or reflection of the canonical basis of $\mathbb{R}^n$ or $\mathbb{C}^n$.  Since these are the 'natural' symmetries of the space we live in (and the direct generalizations thereof), these groups are a huge deal in the study of manifolds, algebraic topology, differential geometry, and probably a couple other branches of math I don't know about yet.  Both groups are also Lie groups.

Expect a post about the neat topology of $O(n)$ sometime soon.
