Bilinear Forms

Subjects: Linear Algebra, Clifford Algebra
Links: Dual Vector Spaces, Tensor Product of Modules

Def: Let $V$ be a vector space over a field $F$. A function $H : V \times V \to F$ is called a bilinear form on $V$ if $H$ is linear in each variable when the other variable is held fixed. That is, for all $x, y, z \in V$ and all scalars $a \in F$, $$H(ax + z, y) = aH(x, y) + H(z, y) \qquad H(x, ay + z) = aH(x, y) + H(x, z)$$

The set of all bilinear forms on $V$ is denoted by $B(V)$ or $T^2(V)$, not to be confused with $T^2V = V^{\otimes 2}$.

Def: Let $V$ be a vector space, let $H_1$ and $H_2$ be bilinear forms on $V$, and let $\lambda$ be a scalar. We define the sum $H_1 + H_2$ and the scalar product $\lambda H_1$ by the equations

$$(H_1 + H_2)(x, y) = H_1(x, y) + H_2(x, y)$$

and $$(\lambda H_1)(x, y) = \lambda( H_1(x, y)) $$
Th: For any vector space $V$, the sum of two bilinear forms and the product of a scalar and a bilinear form on $V$ are again bilinear forms on $V$. Furthermore, $B(V)$ is a vector space with respect to these operations.

Def: Let $\beta = \{v_i\}_{i=1}^n$ be an ordered basis for an $n$-dimensional vector space $V$, and let $H \in B(V)$. We can associate with $H$ an $n \times n$ matrix $A$ whose entry in row $i$ and column $j$ is defined by $$A_{ij} = H(v_i, v_j)$$
This matrix $A$ is called the matrix representation of $H$ with respect to the ordered basis $\beta$ and is denoted $\psi_\beta(H)$.
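As a small sketch of this definition (assuming real scalars and numpy; `matrix_of_form` is a hypothetical helper name, not from the source):

```python
import numpy as np

def matrix_of_form(H, basis):
    """Build psi_beta(H): the matrix with entries A_ij = H(v_i, v_j)."""
    n = len(basis)
    A = np.zeros((n, n))
    for i, vi in enumerate(basis):
        for j, vj in enumerate(basis):
            A[i, j] = H(vi, vj)
    return A

# Example: the dot product on R^2 in the ordered basis {(1, 1), (1, -1)}
H = lambda x, y: float(np.dot(x, y))
basis = [np.array([1.0, 1.0]), np.array([1.0, -1.0])]
A = matrix_of_form(H, basis)
# A = [[2, 0], [0, 2]]: the basis vectors are orthogonal, each of squared length 2
```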

Th: For any $n$-dimensional vector space $V$ over $F$ and any ordered basis $\beta$ for $V$, the map $\psi_\beta : B(V) \to M_n(F)$ is an isomorphism.

Cor: For any $n$-dimensional vector space $V$, $B(V)$ has dimension $n^2$.

Cor: Let $V$ be an $n$-dimensional vector space over $F$ with ordered basis $\beta$. If $H \in B(V)$ and $A \in M_n(F)$, then $\psi_\beta(H) = A$ iff $H(x, y) = [\phi_\beta(x)]^\top A\, [\phi_\beta(y)]$, where $\phi_\beta$ is the isomorphism of $V$ onto $F^n$ with respect to the ordered basis $\beta$.

Cor: Let $F$ be a field, $n \in \mathbb{N}^+$, and $\beta$ be the standard basis for $F^n$. Then for any $H \in B(F^n)$, there exists a unique matrix $A \in M_n(F)$, namely $A = \psi_\beta(H)$, such that $$H(x,y) = x^\top Ay$$
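A quick numerical check of this corollary (a sketch assuming real entries and numpy): given $H(x, y) = x^\top A y$, evaluating $H$ on the standard basis recovers $A$ entrywise, since $H(e_i, e_j) = A_{ij}$.

```python
import numpy as np

rng = np.random.default_rng(0)
A = rng.integers(-3, 4, size=(3, 3)).astype(float)
H = lambda x, y: x @ A @ y          # H(x, y) = x^T A y

# Recover A from H on the standard basis: A_ij = H(e_i, e_j)
e = np.eye(3)
recovered = np.array([[H(e[i], e[j]) for j in range(3)] for i in range(3)])
assert np.allclose(recovered, A)
```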
Def: Let $A, B \in M_n(F)$. Then $B$ is said to be congruent to $A$ if there exists an invertible matrix $Q \in M_n(F)$ such that $B = Q^\top A Q$.

This is an equivalence relation on $M_n(F)$.

Th: Let $V$ be a finite dimensional vector space with ordered bases $\beta = \{v_i\}_{i=1}^n$ and $\gamma = \{w_i\}_{i=1}^n$, and let $Q$ be the change of coordinate matrix changing $\gamma$-coordinates into $\beta$-coordinates. Then for any $H \in B(V)$, we have $\psi_\gamma(H) = Q^\top \psi_\beta(H) Q$. Thus $\psi_\gamma(H)$ and $\psi_\beta(H)$ are congruent.
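This change-of-basis formula can be verified numerically (a sketch assuming $V = \mathbb{R}^2$ with $\beta$ the standard basis, so the columns of $Q$ are the $\gamma$ vectors in $\beta$-coordinates):

```python
import numpy as np

A = np.array([[1., 2.], [0., 3.]])   # psi_beta(H), beta the standard basis
H = lambda x, y: x @ A @ y

# gamma = {w1, w2}; the columns of Q are the gamma vectors in beta-coordinates
Q = np.array([[1., 1.], [1., -1.]])
w = [Q[:, 0], Q[:, 1]]

# psi_gamma(H)_ij = H(w_i, w_j), which should equal (Q^T A Q)_ij
psi_gamma = np.array([[H(w[i], w[j]) for j in range(2)] for i in range(2)])
assert np.allclose(psi_gamma, Q.T @ A @ Q)
```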

Cor: Let $V$ be an $n$-dimensional vector space with ordered basis $\beta$, and let $H \in B(V)$. For any $n \times n$ matrix $B$, if $B$ is congruent to $\psi_\beta(H)$, then there exists an ordered basis $\gamma$ for $V$ such that $\psi_\gamma(H) = B$. In this case, if $Q$ is an invertible matrix such that $B = Q^\top \psi_\beta(H) Q$, then $Q$ changes $\gamma$-coordinates into $\beta$-coordinates.

We can think of elements of $B(V)$ as linear functions from $V \otimes V$ to $F$. Thus $B(V)$ is isomorphic to $(V \otimes V)^*$. In the case that $V$ is finite dimensional, there is a canonical isomorphism between $(V \otimes V)^*$ and $V^* \otimes V^*$.

Symmetric Bilinear Forms

Def: A bilinear form $H$ on a vector space $V$ is symmetric if $H(x, y) = H(y, x)$ for all $x, y \in V$.

Th: Let $H$ be a bilinear form on a finite dimensional vector space $V$, and let $\beta$ be an ordered basis for $V$. Then $H$ is symmetric iff $\psi_\beta(H)$ is symmetric.

Def: A bilinear form $H$ on a finite dimensional vector space $V$ is called diagonalizable if there is an ordered basis $\beta$ for $V$ such that $\psi_\beta(H)$ is a diagonal matrix.

Cor: Let H be a diagonalizable bilinear form on a finite dimensional vector space V. Then H is symmetric.

Lemma: Let $H$ be a nonzero symmetric bilinear form on a vector space $V$ over a field $F$ not of characteristic 2. Then there is a vector $x \in V$ such that $H(x, x) \neq 0$.

Th: Let V be a finite dimensional vector space over a field F not of characteristic 2. Then every symmetric bilinear form on V is diagonalizable.

Cor: Let $F$ be a field that is not of characteristic 2. If $A \in M_n(F)$ is a symmetric matrix, then $A$ is congruent to a diagonal matrix.

Def: A bilinear form $H$ such that $H(x, y) = -H(y, x)$ for all $x, y \in V$ is called skew-symmetric or antisymmetric.

Th: Let $V$ be a vector space over a field $F$ not of characteristic 2. Then every bilinear form decomposes into the sum of a symmetric part and a skew-symmetric part; in matrix form, $$ B^+ = \frac{1}{2}(B + B^\top) \qquad B^- = \frac{1}{2}(B - B^\top) $$
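This decomposition is immediate to check numerically (a sketch assuming real matrices and numpy):

```python
import numpy as np

B = np.array([[1., 2.], [4., 3.]])
B_plus  = 0.5 * (B + B.T)    # symmetric part
B_minus = 0.5 * (B - B.T)    # skew-symmetric part

assert np.allclose(B_plus, B_plus.T)      # B^+ is symmetric
assert np.allclose(B_minus, -B_minus.T)   # B^- is skew-symmetric
assert np.allclose(B_plus + B_minus, B)   # they sum back to B
```

Note the factor $\frac{1}{2}$ is exactly why characteristic 2 must be excluded: there $2$ is not invertible.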
Def: A vector space endowed with a non-degenerate symmetric bilinear form is said to be a quadratic space. Similarly, a vector space endowed with a non-degenerate antisymmetric bilinear form is called a symplectic space.

Def: Let H be a non-degenerate

Diagonalizing Symmetric Matrices

Let $A$ be a symmetric $n \times n$ matrix with entries from a field $F$ not of characteristic 2. Then there are matrices $Q, D \in M_n(F)$, with $Q$ invertible and $D$ diagonal, such that $Q^\top A Q = D$.

If $E$ is an elementary matrix, then $AE$ can be obtained by performing an elementary column operation on $A$, and $E^\top A$ can be obtained by performing the same operation on the rows of $A$ rather than the columns. Thus $E^\top A E$ can be obtained by first applying the operation to the columns of $A$ and then to the rows of $AE$. Now suppose $Q$ is an invertible matrix and $D$ a diagonal matrix such that $Q^\top A Q = D$. Writing $Q$ as a product of elementary matrices, say $Q = E_1 E_2 \cdots E_p$, we get $$ D = Q^\top AQ = E_p^\top \cdots E_2^\top E_1^\top A E_1 E_2 \cdots E_p $$
giving a diagonalization of $A$ using only elementary row and column operations.
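This procedure can be sketched in code (assuming real entries and numpy; `congruence_diagonalize` is a hypothetical name). Each elementary column operation is mirrored by the matching row operation, and $Q$ accumulates the column operations:

```python
import numpy as np

def congruence_diagonalize(A, tol=1e-12):
    """Return (Q, D) with Q invertible and Q^T A Q = D diagonal, for
    symmetric A, via simultaneous elementary row and column operations."""
    A = np.array(A, dtype=float)
    n = A.shape[0]
    Q = np.eye(n)                                   # accumulates column ops

    for k in range(n):
        if abs(A[k, k]) < tol:                      # need a nonzero pivot
            for j in range(k + 1, n):
                if abs(A[j, j]) > tol:              # swap rows/cols k and j
                    A[[k, j]] = A[[j, k]]; A[:, [k, j]] = A[:, [j, k]]
                    Q[:, [k, j]] = Q[:, [j, k]]
                    break
            else:
                for j in range(k + 1, n):
                    if abs(A[k, j]) > tol:          # add row/col j to row/col k
                        A[k, :] += A[j, :]; A[:, k] += A[:, j]
                        Q[:, k] += Q[:, j]
                        break
        if abs(A[k, k]) < tol:
            continue                                # row/column is entirely zero
        for j in range(k + 1, n):                   # clear row k and column k
            c = A[k, j] / A[k, k]
            A[j, :] -= c * A[k, :]; A[:, j] -= c * A[:, k]
            Q[:, j] -= c * Q[:, k]
    return Q, A

A0 = np.array([[0., 1.], [1., 0.]])                 # zero diagonal: needs the add step
Q, D = congruence_diagonalize(A0)
assert np.allclose(Q.T @ A0 @ Q, D)
assert np.allclose(D, np.diag(np.diag(D)))          # D is diagonal
```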

Sylvester’s Law of Inertia

Let H be a symmetric bilinear form on a finite dimensional real vector space V. Then the number of positive diagonal entries and the number of negative diagonal entries in any diagonal matrix representation of H are each independent of the diagonal representation.

Def: The number of positive diagonal entries in a diagonal representation of a symmetric bilinear form on a real vector space is called the index of the form. The difference between the number of positive and the number of negative diagonal entries in a diagonal representation of a symmetric bilinear form is called the signature of the form. The three terms rank, index and signature are called the invariants of the bilinear form because they are invariant with respect to matrix representations. These same terms apply to the associated quadratic form. Notice that the values of any two of these invariants determine the value of the third.

Sylvester’s Law of Inertia for Matrices

Let $A$ be a real symmetric matrix. Then the number of positive diagonal entries and the number of negative diagonal entries in any diagonal matrix congruent to $A$ are independent of the choice of the diagonal matrix.

Def: Let $A$ be a real symmetric matrix, and let $D$ be a diagonal matrix that is congruent to $A$. The number of positive diagonal entries of $D$ is called the index of $A$. The difference between the number of positive diagonal entries and the number of negative diagonal entries of $D$ is called the signature of $A$. As before, the rank, index, and signature of a matrix are called the invariants of the matrix, and the values of any two of these invariants determine the value of the third.
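A convenient way to compute these invariants (a sketch, assuming numpy; `invariants` is a hypothetical name): by the spectral theorem a real symmetric $A$ satisfies $Q^\top A Q = D$ with $Q$ orthogonal, which is in particular a congruence, so by Sylvester's law the signs of the eigenvalues give a valid diagonal congruent matrix.

```python
import numpy as np

def invariants(A, tol=1e-10):
    """(rank, index, signature) of a real symmetric matrix A."""
    eig = np.linalg.eigvalsh(A)         # eigenvalues of symmetric A
    p = int(np.sum(eig > tol))          # index: positive diagonal entries
    q = int(np.sum(eig < -tol))         # negative diagonal entries
    return p + q, p, p - q              # rank, index, signature

A = np.array([[0., 1.], [1., 0.]])      # eigenvalues 1 and -1
assert invariants(A) == (2, 1, 0)
```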

Cor: Two real symmetric n×n matrices are congruent iff they have the same invariants.

Cor: A real symmetric $n \times n$ matrix $A$ has index $p$ and rank $r$ iff $A$ is congruent to $J_{pr}$, where $J_{pr}$ is defined as $$ J_{pr} := \begin{pmatrix} I_p & O & O \\ O & -I_{r-p} & O \\ O & O & O \end{pmatrix} $$
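A small sketch of this canonical form (assuming numpy; `J` is a hypothetical helper): building $J_{pr}$ and exhibiting an explicit congruence for a diagonal example, where scaling each nonzero diagonal entry $d$ by $1/\sqrt{|d|}$ normalizes it to $\pm 1$.

```python
import numpy as np

def J(p, r, n):
    """The canonical form diag(I_p, -I_{r-p}, O) of size n x n (p <= r <= n)."""
    d = [1.0] * p + [-1.0] * (r - p) + [0.0] * (n - r)
    return np.diag(d)

# diag(4, -9, 0) has index 1 and rank 2; scaling by Q = diag(1/2, 1/3, 1)
# is a congruence taking it to J(1, 2, 3) = diag(1, -1, 0).
A = np.diag([4., -9., 0.])
Q = np.diag([0.5, 1/3, 1.])
assert np.allclose(Q.T @ A @ Q, J(1, 2, 3))
```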