Def: Let $V$ be a vector space over a field $F$. A function $H : V \times V \to F$ is called a bilinear form on $V$ if $H$ is linear in each variable when the other variable is held fixed. That is,
$H(ax_1 + x_2, y) = aH(x_1, y) + H(x_2, y)$ for all $x_1, x_2, y \in V$ and $a \in F$.
$H(x, ay_1 + y_2) = aH(x, y_1) + H(x, y_2)$ for all $x, y_1, y_2 \in V$ and $a \in F$.
The set of all bilinear forms on $V$ is denoted by $\mathcal{B}(V)$ or $\mathrm{Bil}(V)$.
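A concrete instance of the definition (a sketch; the matrix $A$ and the test vectors below are my own illustrative choices, not from the notes): any matrix $A$ gives a bilinear form $H(x, y) = x^\top A y$ on $\mathbb{R}^2$, and the two linearity conditions can be checked directly.

```python
# Check bilinearity of H(x, y) = x^T A y on R^2.
# The matrix A and the test vectors are illustrative choices.

A = [[1, 2],
     [3, -1]]

def H(x, y):
    """Bilinear form H(x, y) = x^T A y."""
    return sum(x[i] * A[i][j] * y[j] for i in range(2) for j in range(2))

def add(u, v):
    return [u[i] + v[i] for i in range(len(u))]

def scale(a, u):
    return [a * ui for ui in u]

x1, x2, y = [1, 2], [3, 4], [5, 6]
a = 7

# Linearity in the first variable (second held fixed):
assert H(add(scale(a, x1), x2), y) == a * H(x1, y) + H(x2, y)
# Linearity in the second variable (first held fixed):
assert H(y, add(scale(a, x1), x2)) == a * H(y, x1) + H(y, x2)
print("H is linear in each variable separately")
```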
Properties: for any bilinear form $H$ on $V$:
For any $x \in V$, the functions $L_x, R_x : V \to F$ are defined by$$ L_x(y) = H(x,y) \quad \text{ and }\quad R_x(y)= H(y,x) $$Then $L_x, R_x \in V^*$; that is, each is a linear functional on $V$.
$H(x, 0) = H(0, x) = 0$ for all $x \in V$.
If ${}^\top H : V \times V \to F$ is defined by ${}^\top H(x, y) = H(y, x)$, then ${}^\top H$ is a bilinear form; it is called the transpose of $H$.
Def: Let $V$ be a vector space, let $H_1$ and $H_2$ be bilinear forms on $V$, and let $\lambda$ be a scalar. We define the sum $H_1 + H_2$ and the scalar product $\lambda H_1$ by the equations
$$(H_1 + H_2)(x, y) = H_1(x, y) + H_2(x, y)$$ and $$(\lambda H_1)(x, y) = \lambda( H_1(x, y)) $$ Th: For any vector space $V$, the sum of two bilinear forms and the product of a scalar and a bilinear form on $V$ are again bilinear forms on $V$. Furthermore, $\mathcal{B}(V)$ is a vector space with respect to these operations.
Def: Let $\beta = \{v_1, v_2, \ldots, v_n\}$ be an ordered basis for an $n$-dimensional vector space $V$, and let $H \in \mathcal{B}(V)$. We can associate with $H$ an $n \times n$ matrix $A$ whose entry in row $i$ and column $j$ is defined by $$ A_{ij} = H(v_i, v_j) $$
This matrix $A$ is called the matrix representation of $H$ with respect to the ordered basis $\beta$ and is denoted $\psi_\beta(H)$.
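The entries $A_{ij} = H(v_i, v_j)$ can be tabulated mechanically. A minimal sketch (the form $H$ and the basis $\beta$ below are illustrative choices of mine):

```python
# Matrix representation psi_beta(H): entry (i, j) is H(v_i, v_j).
# The form H and the ordered basis beta are illustrative choices.

def H(x, y):
    # H(x, y) = x1*y1 + 2*x1*y2 + 3*x2*y1 on R^2 (not symmetric)
    return x[0] * y[0] + 2 * x[0] * y[1] + 3 * x[1] * y[0]

beta = [[1, 0], [1, 1]]   # ordered basis {v1, v2} for R^2

psi = [[H(vi, vj) for vj in beta] for vi in beta]
print(psi)
```

Note that the same form has a different matrix with respect to a different ordered basis; this is exactly the congruence relation discussed below.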
Th: For any $n$-dimensional vector space $V$ over $F$ and any ordered basis $\beta$ for $V$, $\psi_\beta : \mathcal{B}(V) \to M_{n \times n}(F)$ is an isomorphism.
Cor: For any $n$-dimensional vector space $V$, $\mathcal{B}(V)$ has dimension $n^2$.
Cor: Let $V$ be an $n$-dimensional vector space over $F$ with ordered basis $\beta$. If $H \in \mathcal{B}(V)$ and $A \in M_{n \times n}(F)$, then $\psi_\beta(H) = A$ iff $H(x, y) = [x]_\beta^\top A \, [y]_\beta$ for all $x, y \in V$, where $\psi_\beta$ is the isomorphism of $\mathcal{B}(V)$ onto $M_{n \times n}(F)$ with respect to the ordered basis $\beta$.
Cor: Let $F$ be a field, let $n$ be a positive integer, and let $\beta$ be the standard ordered basis for $F^n$. Then for any $H \in \mathcal{B}(F^n)$ there exists a unique matrix $A \in M_{n \times n}(F)$, namely $A = \psi_\beta(H)$, such that $$H(x,y) = x^\top Ay$$ Def: Let $A, B \in M_{n \times n}(F)$. Then $B$ is said to be congruent to $A$ if there exists an invertible matrix $Q \in M_{n \times n}(F)$ such that $B = Q^\top A Q$.
Congruence is an equivalence relation on $M_{n \times n}(F)$.
Th: Let $V$ be a finite-dimensional vector space with ordered bases $\beta$ and $\gamma$, and let $Q$ be the change of coordinate matrix changing $\gamma$-coordinates into $\beta$-coordinates. Then for any $H \in \mathcal{B}(V)$, we have $\psi_\gamma(H) = Q^\top \psi_\beta(H) Q$. Thus $\psi_\gamma(H)$ and $\psi_\beta(H)$ are congruent.
Cor: Let $V$ be an $n$-dimensional vector space with ordered basis $\beta$, and let $H \in \mathcal{B}(V)$. For any $n \times n$ matrix $B$, if $B$ is congruent to $\psi_\beta(H)$, then there exists an ordered basis $\gamma$ for $V$ such that $\psi_\gamma(H) = B$. In this case, if $Q$ is a matrix such that $B = Q^\top \psi_\beta(H) Q$, then $Q$ changes $\gamma$-coordinates into $\beta$-coordinates.
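The change-of-basis formula can be verified numerically. A sketch (the form, the basis $\gamma$, and the helper functions are my own illustrative choices): compute $\psi_\gamma(H)$ directly from the definition, and compare with $Q^\top \psi_\beta(H) Q$, where the columns of $Q$ are the $\gamma$ vectors expressed in the standard basis $\beta$.

```python
# Verify psi_gamma(H) = Q^T psi_beta(H) Q, where Q changes
# gamma-coordinates into beta-coordinates (beta = standard basis here).
# The matrix A and the basis gamma are illustrative choices.

def matmul(X, Y):
    n, m, p = len(X), len(Y), len(Y[0])
    return [[sum(X[i][k] * Y[k][j] for k in range(m)) for j in range(p)]
            for i in range(n)]

def transpose(X):
    return [list(row) for row in zip(*X)]

def H(x, y):                      # H(x, y) = x^T A y with A below
    return sum(x[i] * A[i][j] * y[j] for i in range(2) for j in range(2))

A = [[1, 2], [2, 5]]              # psi_beta(H) for the standard basis beta
gamma = [[1, 1], [0, 1]]          # new ordered basis {w1, w2}
Q = transpose(gamma)              # columns of Q are the gamma vectors

direct = [[H(wi, wj) for wj in gamma] for wi in gamma]
via_congruence = matmul(transpose(Q), matmul(A, Q))
assert direct == via_congruence
print(direct)
```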
We can think of elements of $\mathcal{B}(V)$ as linear maps from $V$ to $V^*$, by sending $H$ to the map $x \mapsto L_x = H(x, \cdot)$. Thus $\mathcal{B}(V)$ is isomorphic to $\mathcal{L}(V, V^*)$. In the case that $V$ is finite-dimensional, there is moreover a canonical isomorphism between $V$ and $V^{**}$.
Symmetric Bilinear Forms
Def: A bilinear form $H$ on a vector space $V$ is symmetric if $H(x, y) = H(y, x)$ for all $x, y \in V$.
Th: Let $H$ be a bilinear form on a finite-dimensional vector space $V$, and let $\beta$ be an ordered basis for $V$. Then $H$ is symmetric iff $\psi_\beta(H)$ is symmetric.
Def: A bilinear form $H$ on a finite-dimensional vector space $V$ is called diagonalizable if there is an ordered basis $\beta$ for $V$ such that $\psi_\beta(H)$ is a diagonal matrix.
Cor: Let $H$ be a diagonalizable bilinear form on a finite-dimensional vector space $V$. Then $H$ is symmetric.
Lemma: Let $H$ be a nonzero symmetric bilinear form on a vector space $V$ over a field $F$ not of characteristic $2$. Then there is a vector $x \in V$ such that $H(x, x) \neq 0$.
Th: Let $V$ be a finite-dimensional vector space over a field $F$ not of characteristic $2$. Then every symmetric bilinear form on $V$ is diagonalizable.
Cor: Let $F$ be a field that is not of characteristic $2$. If $A \in M_{n \times n}(F)$ is a symmetric matrix, then $A$ is congruent to a diagonal matrix.
Def: A bilinear form $H$ such that $H(x, y) = -H(y, x)$ for all $x, y \in V$ is called skew-symmetric or antisymmetric.
Th: Let $V$ be a vector space over a field $F$ not of characteristic $2$. Then every bilinear form $B$ on $V$ decomposes as $B = B^+ + B^-$ into a symmetric part $B^+$ and a skew-symmetric part $B^-$, where $$ B^+ = \frac{1}{2}(B + {^\top B}) \qquad B^- = \frac{1}{2}(B-{^\top B}) $$ Def: A vector space endowed with a non-degenerate symmetric bilinear form is said to be a quadratic space. Similarly, a vector space endowed with a non-degenerate antisymmetric bilinear form is called a symplectic space.
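In matrix terms the decomposition reads $B^\pm = \frac{1}{2}(B \pm B^\top)$. A minimal sketch (the matrix $B$ is an illustrative choice; `Fraction` keeps the division by $2$ exact):

```python
from fractions import Fraction

# Decompose a bilinear form (given by its matrix B) into symmetric and
# skew-symmetric parts: B+ = (B + B^T)/2, B- = (B - B^T)/2.
# The matrix B is an illustrative choice.

B = [[Fraction(x) for x in row] for row in [[1, 4], [2, 3]]]
n = len(B)

half = Fraction(1, 2)
B_plus  = [[half * (B[i][j] + B[j][i]) for j in range(n)] for i in range(n)]
B_minus = [[half * (B[i][j] - B[j][i]) for j in range(n)] for i in range(n)]

# B+ is symmetric, B- is skew-symmetric, and B = B+ + B-.
assert all(B_plus[i][j] == B_plus[j][i] for i in range(n) for j in range(n))
assert all(B_minus[i][j] == -B_minus[j][i] for i in range(n) for j in range(n))
assert all(B[i][j] == B_plus[i][j] + B_minus[i][j]
           for i in range(n) for j in range(n))
print(B_plus, B_minus)
```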
Def: A bilinear form $H$ on $V$ is non-degenerate if for every nonzero $x \in V$ there exists $y \in V$ such that $H(x, y) \neq 0$.
Diagonalizing Symmetric Matrices
Let $A$ be an $n \times n$ symmetric matrix with entries from a field $F$ not of characteristic $2$. Then there are matrices $Q, D \in M_{n \times n}(F)$ such that $Q$ is invertible, $D$ is diagonal, and $D = Q^\top A Q$.
If $E$ is an elementary matrix, $AE$ can be obtained by performing an elementary column operation on $A$, and $E^\top A$ can be obtained by performing the same operation on the rows of $A$ rather than the columns. Thus $E^\top A E$ can be obtained by first applying the operation to the columns and then to the rows of $A$. Suppose that $Q$ is an invertible matrix and $D$ a diagonal matrix such that $D = Q^\top A Q$. Since $Q$ is a product of elementary matrices, say $Q = E_1 E_2 \cdots E_p$, we get that $$ D =Q^\top AQ = E_p ^\top \cdots E_2^\top E_1^\top AE_1 E_2\cdots E_p $$
This yields a method for obtaining a diagonalization of $A$ using only elementary operations: apply elementary column operations to $A$, each immediately followed by the corresponding row operation, until a diagonal matrix is reached.
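This procedure can be sketched in code. The following is a minimal implementation over the rationals (the input matrix is an illustrative choice): each column operation is immediately paired with the matching row operation, so every intermediate matrix stays congruent to $A$. The zero-pivot fix-up (adding a row/column into position $k$) relies on the characteristic not being $2$.

```python
from fractions import Fraction

# Congruence-diagonalize a symmetric matrix by paired row/column operations:
# whenever a column operation A -> A E is applied, the matching row
# operation A -> E^T A is applied too, so the result E^T A E stays
# congruent to A. A sketch over Q; the input matrix is illustrative.

def congruence_diagonalize(A):
    A = [[Fraction(x) for x in row] for row in A]
    n = len(A)
    for k in range(n):
        if A[k][k] == 0:
            # Prefer swapping a nonzero later diagonal entry into place
            # (a paired row/column swap).
            j = next((j for j in range(k + 1, n) if A[j][j] != 0), None)
            if j is not None:
                for i in range(n): A[i][k], A[i][j] = A[i][j], A[i][k]
                A[k], A[j] = A[j], A[k]
            else:
                # char != 2 trick: add row/column j (with A[j][k] != 0)
                # into position k, making the pivot 2*A[j][k] != 0.
                j = next((j for j in range(k + 1, n) if A[j][k] != 0), None)
                if j is None:
                    continue                            # row/column k is zero
                for i in range(n): A[i][k] += A[i][j]   # column operation
                for i in range(n): A[k][i] += A[j][i]   # matching row op
        for j in range(k + 1, n):
            c = A[k][j] / A[k][k]
            for i in range(n): A[i][j] -= c * A[i][k]   # clear A[k][j]
            for i in range(n): A[j][i] -= c * A[k][i]   # matching row op
    return A

D = congruence_diagonalize([[0, 1], [1, 0]])
assert all(D[i][j] == 0 for i in range(2) for j in range(2) if i != j)
print(D)
```

For the input above the zero pivot forces the addition trick, after which one paired elimination step produces a diagonal matrix congruent to the original.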
Sylvester’s Law of Inertia
Let $H$ be a symmetric bilinear form on a finite-dimensional real vector space $V$. Then the number of positive diagonal entries and the number of negative diagonal entries in any diagonal matrix representation of $H$ are each independent of the diagonal representation.
Def: The number of positive diagonal entries in a diagonal representation of a symmetric bilinear form on a real vector space is called the index of the form. The difference between the number of positive and the number of negative diagonal entries in a diagonal representation of a symmetric bilinear form is called the signature of the form. The three terms rank, index and signature are called the invariants of the bilinear form because they are invariant with respect to matrix representations. These same terms apply to the associated quadratic form. Notice that the values of any two of these invariants determine the value of the third.
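The invariants are read off directly from the diagonal entries of any diagonal representation. A small sketch (the diagonal below is an illustrative choice):

```python
# Compute the three invariants of a real symmetric form from the
# diagonal entries of any diagonal representation.
# The diagonal entries below are an illustrative choice.

diag = [3, -2, 5, 0, -1]

p = sum(1 for d in diag if d > 0)      # index: number of positive entries
q = sum(1 for d in diag if d < 0)      # number of negative entries
rank = p + q                           # zero entries do not contribute
signature = p - q

# Any two invariants determine the third, e.g. signature = 2*p - rank.
assert signature == 2 * p - rank
print(rank, p, signature)
```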
Sylvester’s Law of Inertia for Matrices
Let $A$ be a real symmetric matrix. Then the number of positive diagonal entries and the number of negative diagonal entries in any diagonal matrix congruent to $A$ is independent of the choice of the diagonal matrix.
Def: Let $A$ be a real symmetric matrix, and let $D$ be a diagonal matrix that is congruent to $A$. The number of positive diagonal entries of $D$ is called the index of $A$. The difference between the number of positive diagonal entries and the number of negative diagonal entries of $D$ is called the signature of $A$. As before, the rank, index, and signature of a matrix are called the invariants of the matrix, and the values of any two of these invariants determine the value of the third.
Cor: Two real symmetric matrices are congruent iff they have the same invariants.
Cor: A real symmetric $n \times n$ matrix $A$ has index $p$ and rank $r$ iff $A$ is congruent to $J_{pr}$, where $J_{pr}$ is defined as $$ J_{pr} := \begin{pmatrix} I_p & O & O \\ O & -I_{r-p} & O \\ O & O & O \end{pmatrix} $$
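The canonical matrix $J_{pr}$ is easy to construct explicitly, and its index and rank are visible on the diagonal. A sketch (the sizes $p = 2$, $r = 3$, $n = 4$ are illustrative choices):

```python
# Build the canonical matrix J_{p,r}: an identity block I_p, then
# -I_{r-p}, then zeros, padded to n x n. Sizes are illustrative.

def J(p, r, n):
    M = [[0] * n for _ in range(n)]
    for i in range(p):
        M[i][i] = 1
    for i in range(p, r):
        M[i][i] = -1
    return M

M = J(2, 3, 4)
diag = [M[i][i] for i in range(4)]
assert sum(1 for d in diag if d > 0) == 2        # index p
assert sum(1 for d in diag if d != 0) == 3       # rank r
print(M)
```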