Isometries in Vector Spaces

Subjects: Metric and Normed Spaces, Linear Algebra
Links: Normal and Hermitian Operators, Adjoint Operators and Matrices

Def: Let V be a finite dimensional inner product space over a field F, and T∈L(V). If ‖Tv‖ = ‖v‖ for all v∈V, then T is an isometry. We call T a unitary operator if F = C and an orthogonal operator if F = R. In an infinite dimensional inner product space, an operator that preserves norms and is also surjective is called an isometry (norm preservation alone does not force surjectivity there).
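A numerical sketch of the definition (assumption: NumPy is used purely for illustration): a 2×2 rotation matrix preserves the norm of every vector, so it is an isometry, and since F = R it is an orthogonal operator.

```python
import numpy as np

theta = 0.7
T = np.array([[np.cos(theta), -np.sin(theta)],
              [np.sin(theta),  np.cos(theta)]])

rng = np.random.default_rng(0)
for _ in range(5):
    v = rng.standard_normal(2)
    # ||Tv|| should equal ||v|| up to floating point error
    assert np.isclose(np.linalg.norm(T @ v), np.linalg.norm(v))
```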

Th: Let T be a linear operator on a finite dimensional inner product space V. Then the following statements are equivalent:

1. T is an isometry, i.e. ‖Tv‖ = ‖v‖ for all v∈V;
2. ⟨Tx, Ty⟩ = ⟨x, y⟩ for all x, y∈V;
3. T*T = TT* = I;
4. T maps some (equivalently, every) orthonormal basis of V to an orthonormal basis of V.

Cor: Let T be a linear operator on a finite dimensional real inner product space V. Then V has an orthonormal basis of eigenvectors of T with corresponding eigenvalues of absolute value 1 iff T is orthogonal and self-adjoint.

Cor: Let T be a linear operator on a finite dimensional complex inner product space V. Then V has an orthonormal basis of eigenvectors of T with corresponding eigenvalues of absolute value 1 iff T is unitary.

Partial Isometries

Def: Let V be a finite dimensional inner product space. A linear operator U on V is called a partial isometry if there exists a subspace W of V such that ‖U(w)‖ = ‖w‖ for all w∈W, and U(v) = 0 for all v∈W⊥. This does not imply that W is U-invariant.
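A minimal sketch of this definition (assumption: NumPy illustration). The operator U below maps e1 to e2 and kills e2, so with W = span(e1) it preserves norms on W and vanishes on W⊥, yet U(W) = span(e2) is not contained in W, showing that W need not be U-invariant.

```python
import numpy as np

# U e1 = e2, U e2 = 0: a partial isometry with W = span(e1)
U = np.array([[0.0, 0.0],
              [1.0, 0.0]])

w = np.array([3.0, 0.0])   # w in W
v = np.array([0.0, 2.0])   # v in W-perp

assert np.isclose(np.linalg.norm(U @ w), np.linalg.norm(w))  # norm preserved on W
assert np.allclose(U @ v, 0.0)                               # zero on W-perp
assert not np.isclose((U @ w)[1], 0.0)                       # U(w) lies outside W
```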

Def: A real square matrix A is called an orthogonal matrix if AᵀA = AAᵀ = I, and a complex square matrix A is called unitary if A*A = AA* = I.

The set of all orthogonal transformations/matrices is called the Orthogonal Group, and the set of all unitary transformations/matrices is called the Unitary Group.

Def: A and B are unitarily/orthogonally equivalent iff there exists a unitary/orthogonal matrix P such that A = P*BP. This forms an equivalence relation on Mn(C)/Mn(R).

Th: Let A be a complex n×n matrix. Then A is normal iff A is unitarily equivalent to a diagonal matrix.

Th: Let A be a real n×n matrix. Then A is symmetric iff A is orthogonally equivalent to a diagonal matrix.
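A sketch of the real case above (assumption: NumPy's `eigh` routine for symmetric matrices). For a symmetric A, `eigh` returns the eigenvalues together with an orthogonal matrix P of eigenvectors, so PᵀAP is diagonal, i.e. A is orthogonally equivalent to a diagonal matrix.

```python
import numpy as np

A = np.array([[2.0, 1.0],
              [1.0, 2.0]])     # symmetric

eigvals, P = np.linalg.eigh(A)

assert np.allclose(P.T @ P, np.eye(2))        # P is orthogonal
D = P.T @ A @ P
assert np.allclose(D, np.diag(eigvals))       # P^T A P is diagonal
```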

Th: If a matrix is both unitary and upper triangular, then it is diagonal.

Schur’s Theorem Again

Let A∈Mn(F) be a matrix whose characteristic polynomial splits over F. Then A is unitarily (if F = C) or orthogonally (if F = R) equivalent to an upper triangular matrix.

Cor: Suppose V is a complex inner product space and T∈L(V). Then the following are equivalent:

QR Factorization

Th: Let A∈Mn×m(F) have linearly independent columns. Then A can be factorized as A = QR, where Q∈Mn×m(F) has orthonormal columns and R∈Mm(F) is an upper triangular matrix with positive diagonal entries.

Writing the ith column of A as ai, we apply the Gram-Schmidt orthonormalization process to a1, …, am; the resulting orthonormal vectors q1, …, qm form the columns of Q.

Similarly, given these orthonormal vectors we can compute the matrix R as Rij = ⟨qi, aj⟩ for 1 ≤ i ≤ j ≤ m, and Rij = 0 otherwise.

If A∈Mn(C) (respectively Mn(R)), then Q will be a unitary (respectively orthogonal) matrix.
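The construction above can be sketched directly in code (real case, assumption: NumPy; `gram_schmidt_qr` is a name chosen here for illustration). The q_i come from Gram-Schmidt and Rij = ⟨qi, aj⟩ exactly as described.

```python
import numpy as np

def gram_schmidt_qr(A):
    """QR factorization via classical Gram-Schmidt (real case)."""
    n, m = A.shape
    Q = np.zeros((n, m))
    R = np.zeros((m, m))
    for j in range(m):
        v = A[:, j].copy()
        for i in range(j):
            R[i, j] = Q[:, i] @ A[:, j]   # R_ij = <q_i, a_j>
            v -= R[i, j] * Q[:, i]        # subtract projections
        R[j, j] = np.linalg.norm(v)       # positive diagonal entry
        Q[:, j] = v / R[j, j]             # normalize
    return Q, R

A = np.array([[1.0, 1.0],
              [1.0, 0.0],
              [0.0, 1.0]])
Q, R = gram_schmidt_qr(A)
assert np.allclose(Q @ R, A)              # A = QR
assert np.allclose(Q.T @ Q, np.eye(2))    # orthonormal columns
assert np.all(np.diag(R) > 0)             # positive diagonal
```

For the complex case one would replace the inner product with its conjugate version; classical Gram-Schmidt can also lose orthogonality in floating point, which is why library routines use Householder reflections instead.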

Given a linear system Ax = b where A is invertible, we can decompose A = QR and solve the system QRx = b, or equivalently Rx = Q*b; this can be solved easily by back substitution since R is upper triangular.
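A sketch of this solve (assumption: NumPy's `qr` in its default reduced mode): form Q*b (a transpose in the real case) and then run back substitution on the upper triangular R.

```python
import numpy as np

A = np.array([[4.0, 3.0],
              [6.0, 3.0]])
b = np.array([10.0, 12.0])

Q, R = np.linalg.qr(A)
y = Q.T @ b                        # Q* b (real case: transpose)

m = len(y)
x = np.zeros(m)
for i in range(m - 1, -1, -1):     # back substitution, bottom row up
    x[i] = (y[i] - R[i, i+1:] @ x[i+1:]) / R[i, i]

assert np.allclose(A @ x, b)       # x solves the original system
```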

Rigid Motions

Def: Let V be a real inner product space. A function f: V→V is called a rigid motion if for all x, y∈V

‖f(x) − f(y)‖ = ‖x − y‖

Th: Let f: V→V be a rigid motion on a finite dimensional real inner product space V. Then there exists a unique orthogonal operator T on V and a unique translation g on V such that f = g∘T.

Th: Let f: V→V be a rigid motion. Then f is invertible and f⁻¹ is also a rigid motion.

Cor: Let f: V→V be a rigid motion on a finite dimensional real inner product space V. Then there exists a unique orthogonal operator T on V and a unique translation g on V such that f = T∘g.

Orthogonal Operators on R2

Th: Let T be an orthogonal operator on R2. Then exactly one of the following conditions is satisfied:

1. T is a rotation about the origin;
2. T is a reflection about a line through the origin.

Cor: Any rigid motion on R2 is either a rotation followed by a translation or a reflection about a line through the origin followed by a translation.
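The dichotomy above can be checked numerically (assumption: NumPy illustration): both matrices below are orthogonal, but the rotation has determinant +1 while the reflection about a line through the origin has determinant −1.

```python
import numpy as np

theta = np.pi / 3
rotation = np.array([[np.cos(theta), -np.sin(theta)],
                     [np.sin(theta),  np.cos(theta)]])
reflection = np.array([[np.cos(theta),  np.sin(theta)],
                       [np.sin(theta), -np.cos(theta)]])

for T, det in [(rotation, 1.0), (reflection, -1.0)]:
    assert np.allclose(T.T @ T, np.eye(2))      # T is orthogonal
    assert np.isclose(np.linalg.det(T), det)    # det distinguishes the two cases
```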

Householder Operators

Def: Let V be a finite dimensional complex/real inner product space, and let u be a unit vector in V. Define the Householder operator Hu: V→V by Hu(x) = x − 2⟨x, u⟩u for all x∈V.

This type of operator can be used to compute the QR factorization of an n×m matrix when n ≥ m.
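A minimal sketch of the Householder operator in the real case (assumption: NumPy; `householder` is a name chosen here). Hu is reflection through the hyperplane u⊥: it is orthogonal, sends u to −u, and fixes every vector orthogonal to u.

```python
import numpy as np

def householder(u):
    """Matrix of H_u(x) = x - 2<x,u>u for a (normalized) real vector u."""
    u = u / np.linalg.norm(u)              # ensure u is a unit vector
    return np.eye(len(u)) - 2.0 * np.outer(u, u)

u = np.array([1.0, 2.0, 2.0])
H = householder(u)
uhat = u / np.linalg.norm(u)
v = np.array([2.0, -1.0, 0.0])             # orthogonal to u

assert np.allclose(H.T @ H, np.eye(3))     # H_u is an orthogonal operator
assert np.allclose(H @ uhat, -uhat)        # u is sent to -u
assert np.allclose(H @ v, v)               # u-perp is fixed pointwise
```

In Householder QR, one chooses u proportional to a − ‖a‖e1 for a column a, so that Hu maps a onto a multiple of e1, zeroing the entries below the diagonal one column at a time.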