Let $\mathbb F$ be a field of characteristic $0$. For the rest of the note we drop the codomain $\mathbb F$ from the notation, since it is a bit cumbersome.
Def: A covariant $k$-tensor $\alpha$ on $V$ is said to be alternating if its value changes sign whenever two arguments are interchanged, or equivalently if any permutation $\sigma$ of the arguments changes its value by the factor $\text{sgn }\sigma$. Alternating covariant $k$-tensors are also called exterior forms, multicovectors or $k$-covectors. The space of all $k$-covectors on $V$ is denoted by $\bigwedge^{\!k}(V^*)$. All $0$-tensors and $1$-tensors are alternating.
Lemma: Let $\alpha$ be a covariant $k$-tensor on a finite-dimensional vector space $V$. The following are equivalent.
- $\alpha$ is alternating.
- $\alpha(v_1,\dots,v_k) = 0$ whenever the $k$-tuple $(v_1,\dots,v_k)$ is linearly dependent.
- $\alpha$ gives the value zero whenever two of its arguments are equal: $$\alpha(v_1,\dots, w,\dots, w, \dots, v_k) = 0.$$
Def: We define a projection $\text{Alt}: T^k(V^*) \to \bigwedge^{\!k}(V^*)$, called alternation, as follows: $$\text{Alt } \alpha = \frac1{k!} \sum_{\sigma \in S_k} \text{sgn }\sigma \,(\sigma \alpha),$$where $S_k$ is the symmetric group on $k$ elements. More explicitly, this means $$(\text{Alt } \alpha)(v_1,\dots, v_k) = \frac1{k!} \sum_{\sigma \in S_k} \text{sgn }\sigma\, \alpha(v_{\sigma(1)},\dots, v_{\sigma(k)}). $$ Properties of Alternation: Let $\alpha$ be a covariant $k$-tensor on a finite-dimensional $\mathbb F$-vector space $V$.
- $\text{Alt }\alpha$ is alternating.
- $\text{Alt }\alpha = \alpha$ iff $\alpha$ is alternating.
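The alternation formula can be checked numerically. Below is a minimal sketch (the helper names `perm_sign` and `alt` are my own, not from the note) that represents a covariant $k$-tensor on $\mathbb R^n$ as a Python function of $k$ component tuples:

```python
import itertools
from math import factorial

def perm_sign(perm):
    """Sign of a permutation given as a tuple of 0-based indices (inversion count)."""
    s = 1
    for i in range(len(perm)):
        for j in range(i + 1, len(perm)):
            if perm[i] > perm[j]:
                s = -s
    return s

def alt(alpha, k):
    """(Alt a)(v_1,...,v_k) = (1/k!) * sum over sigma of sgn(sigma) * a(v_sigma(1),...,v_sigma(k))."""
    def result(*vs):
        return sum(
            perm_sign(p) * alpha(*(vs[i] for i in p))
            for p in itertools.permutations(range(k))
        ) / factorial(k)
    return result

# A non-alternating 2-tensor on R^2, and its alternation:
a = lambda v, w: v[0] * w[1]
A = alt(a, 2)
v, w = (1.0, 2.0), (3.0, 5.0)
print(A(v, w) + A(w, v))  # 0.0 -- Alt(a) changes sign under a swap
```

Applying `alt` to a tensor that is already alternating (e.g. the $2\times 2$ determinant) returns it unchanged, matching the second property above.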
Elementary Alternating Tensors
Def: Given a positive integer $k$, an ordered $k$-tuple $I = (i_1,\dots,i_k)$ of positive integers is called a multi-index of length $k$. If $I$ is such a multi-index and $\sigma \in S_k$, we write $I_\sigma$ for the following multi-index: $$I_\sigma := (i_{\sigma(1)},\dots, i_{\sigma(k)}). $$Note that $I_{\sigma\tau} = (I_\sigma)_\tau$ for $\sigma, \tau \in S_k$.
Def: Let $V$ be a finite-dimensional $\mathbb F$-vector space, and suppose $(\varepsilon^1,\dots,\varepsilon^n)$ is any basis for $V^*$. We now define a collection of $k$-covectors on $V$ that generalise the determinant function. For each multi-index $I = (i_1,\dots,i_k)$ of length $k$ such that $1 \le i_1,\dots,i_k \le n$, define a covariant $k$-tensor $\varepsilon^I$ by $$\varepsilon^I(v_1,\dots, v_k) = \det \begin{pmatrix}
\varepsilon^{i_1}(v_1)& \cdots & \varepsilon^{i_1}(v_k) \\
\vdots & \ddots & \vdots \\ \varepsilon^{i_k}(v_1) & \cdots & \varepsilon^{i_k}(v_k)
\end{pmatrix} = \det \begin{pmatrix}
v_1^{i_1}& \cdots & v_k^{i_1} \\
\vdots & \ddots & \vdots \\ v_1^{i_k} & \cdots & v_k^{i_k}
\end{pmatrix}.$$In other words, if $\mathbb V$ denotes the $n\times k$ matrix whose columns are the components of the vectors $v_1,\dots,v_k$ with respect to the basis $(E_i)$ dual to $(\varepsilon^i)$, then $\varepsilon^I(v_1,\dots,v_k)$ is the determinant of the $k\times k$ submatrix consisting of rows $i_1,\dots,i_k$ of $\mathbb V$. We call $\varepsilon^I$ an elementary alternating tensor or elementary $k$-covector.
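Concretely, on $\mathbb R^n$ with the standard basis, $\varepsilon^I$ is the determinant of the row-submatrix selected by $I$. A small sketch (0-based indices, and the names `det` and `eps` are my own):

```python
from itertools import permutations
from math import prod

def perm_sign(p):
    """Sign of a permutation of (0, ..., k-1), via the inversion count."""
    return (-1) ** sum(p[i] > p[j] for i in range(len(p)) for j in range(i + 1, len(p)))

def det(m):
    """Determinant via the Leibniz permutation expansion (fine for small k)."""
    k = len(m)
    return sum(perm_sign(p) * prod(m[r][p[r]] for r in range(k))
               for p in permutations(range(k)))

def eps(I):
    """Elementary k-covector e^I: take rows I of the matrix whose columns are the vectors."""
    return lambda *vs: det([[v[i] for v in vs] for i in I])

# On R^3 with 0-based multi-index I = (0, 1):
e01 = eps((0, 1))
print(e01((1, 2, 3), (4, 5, 6)))  # 1*5 - 2*4 = -3
```

With $k = n$ and $I = (0,\dots,n-1)$ this recovers the full determinant of the component matrix.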
In order to streamline computations with the elementary $k$-covectors, we can extend the Kronecker delta notation in the following way. If $I = (i_1,\dots,i_k)$ and $J = (j_1,\dots,j_k)$ are multi-indices of length $k$, we define$$\delta^I_J := \det \begin{pmatrix}
\delta_{j_1}^{i_1}& \cdots & \delta_{j_k}^{i_1} \\
\vdots & \ddots & \vdots \\ \delta_{j_1}^{i_k} & \cdots & \delta_{j_k}^{i_k}
\end{pmatrix} .$$ Obs: We see that $$\delta^I_J = \begin{cases}\text{sgn }\sigma & \text{if neither } I \text{ nor } J \text{ has a repeated index and } J = I_\sigma \text{ for some }\sigma\in S_k, \\
0 & \text{if }I \text{ or }J \text{ has a repeated index or } J \text{ is not a permutation of }I.
\end{cases} $$ Properties of Elementary $k$-covectors: Let $(E_i)$ be a basis for $V$, let $(\varepsilon^i)$ be the dual basis for $V^*$, and let $I = (i_1,\dots,i_k)$ be a multi-index.
- If $I$ has a repeated index, then $\varepsilon^I = 0$.
- If $J = I_\sigma$ for some $\sigma \in S_k$, then $\varepsilon^J = (\text{sgn }\sigma)\,\varepsilon^I$.
- The result of evaluating $\varepsilon^I$ on a sequence of basis vectors is $$\varepsilon^I(E_{j_1},\dots, E_{j_k}) = \delta^I_J. $$

Def: A multi-index $I = (i_1,\dots,i_k)$ is said to be increasing if $i_1 < i_2 < \dots < i_k$.
A Basis for $\bigwedge^{\!k}(V^*)$: Let $V$ be an $n$-dimensional $\mathbb F$-vector space. If $(\varepsilon^i)$ is any basis for $V^*$, then for each positive integer $k \le n$, the collection of $k$-covectors $$\mathcal E:= \{\varepsilon^I\mid I \text{ is an increasing multi-index of length }k\} $$is a basis for $\bigwedge^{\!k}(V^*)$. Therefore $$\dim {\textstyle \bigwedge}^{\!k}(V^*) = {n\choose k}. $$If $k > n$, then $\dim \bigwedge^{\!k}(V^*) = 0$.
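The count of increasing multi-indices matches the binomial coefficient directly; a quick check (the helper name is my own):

```python
from itertools import combinations
from math import comb

def increasing_multi_indices(n, k):
    """All increasing multi-indices of length k with entries in {1, ..., n}."""
    return list(combinations(range(1, n + 1), k))

# dim of the space of 2-covectors on a 4-dimensional space:
print(len(increasing_multi_indices(4, 2)) == comb(4, 2))  # True (both are 6)
```

For $k > n$ the list is empty, matching the statement that the space of $k$-covectors is then zero.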
Prop: Suppose $V$ is an $n$-dimensional $\mathbb F$-vector space and $\omega \in \bigwedge^{\!n}(V^*)$. If $T: V \to V$ is any linear map and $v_1,\dots,v_n$ are arbitrary vectors in $V$, then $$\omega(Tv_1,\dots, Tv_n) = (\det T)\, \omega(v_1,\dots, v_n). $$
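This proposition can be verified numerically for a top-degree form on $\mathbb R^2$; a sketch (the helper names `det` and `apply` are my own):

```python
from itertools import permutations
from math import prod

def det(m):
    """Determinant via the Leibniz permutation expansion."""
    k = len(m)
    def sign(p):
        return (-1) ** sum(p[i] > p[j] for i in range(k) for j in range(i + 1, k))
    return sum(sign(p) * prod(m[r][p[r]] for r in range(k)) for p in permutations(range(k)))

def apply(T, v):
    """Matrix-vector product: T acting on v."""
    return tuple(sum(T[i][j] * v[j] for j in range(len(v))) for i in range(len(T)))

# omega = the determinant 2-covector on R^2 (a top-degree form):
omega = lambda *vs: det([[v[i] for v in vs] for i in range(2)])

T = [[2, 1], [0, 3]]      # det T = 6
v1, v2 = (1, 0), (0, 1)
print(omega(apply(T, v1), apply(T, v2)))  # 6
print(det(T) * omega(v1, v2))             # 6
```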
The Wedge Product
Determinant Convention: Given $\omega \in \bigwedge^{\!k}(V^*)$ and $\eta \in \bigwedge^{\!l}(V^*)$, we define their wedge product or exterior product to be the following $(k+l)$-covector: $$\omega \wedge \eta := \frac{(k+l)!}{k!\,l!}\text{Alt }(\omega\otimes \eta).$$ Alt Convention: Given $\omega \in \bigwedge^{\!k}(V^*)$ and $\eta \in \bigwedge^{\!l}(V^*)$, we define their wedge product or exterior product to be the following $(k+l)$-covector: $$\omega \wedge \eta := \text{Alt }(\omega\otimes \eta).$$
Lemma: Let $V$ be an $n$-dimensional $\mathbb F$-vector space and let $(\varepsilon^i)$ be a basis for $V^*$. For any multi-indices $I = (i_1,\dots,i_k)$ and $J = (j_1,\dots,j_l)$, $$\varepsilon^I \wedge \varepsilon^J = \varepsilon^{IJ} $$under the determinant convention, where $IJ = (i_1,\dots,i_k,j_1,\dots,j_l)$ is obtained by concatenating $I$ and $J$. Under the Alt convention, $$\varepsilon^I \wedge \varepsilon^J =\frac{k!\,l!}{(k+l)!}\, \varepsilon^{IJ}. $$
Properties of the Wedge Product: Suppose $\omega$, $\omega'$, $\eta$, and $\eta'$ are multicovectors on a finite-dimensional $\mathbb F$-vector space $V$.
- If $(\varepsilon^i)$ is any basis for $V^*$ and $I = (i_1,\dots,i_k)$ is any multi-index, then $$\varepsilon^{i_1}\wedge\dots \wedge\varepsilon^{i_k} = \varepsilon^I. $$
- For any covectors $\omega^1,\dots,\omega^k$ and vectors $v_1,\dots,v_k$, $$\omega^1\wedge\dots \wedge \omega^k(v_1,\dots, v_k) = \det\bigl(\omega^j(v_i)\bigr) $$under the determinant convention. Under the Alt convention we instead have $$\omega^1\wedge\dots \wedge \omega^k(v_1,\dots, v_k) =\frac{1}{k!} \det\bigl(\omega^j(v_i)\bigr). $$

Def: A $k$-covector $\eta$ is decomposable if it can be expressed in the form $\eta = \omega^1\wedge\dots\wedge\omega^k$, where $\omega^1,\dots,\omega^k$ are covectors.
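The determinant formula for a wedge of covectors is easy to test numerically. A sketch under the determinant convention (covectors are represented as Python functions; the names `det` and `wedge` are my own):

```python
from itertools import permutations
from math import prod

def det(m):
    """Determinant via the Leibniz permutation expansion."""
    k = len(m)
    def sign(p):
        return (-1) ** sum(p[i] > p[j] for i in range(k) for j in range(i + 1, k))
    return sum(sign(p) * prod(m[r][p[r]] for r in range(k)) for p in permutations(range(k)))

def wedge(*omegas):
    """(w^1 ^ ... ^ w^k)(v_1,...,v_k) = det(w^j(v_i)), determinant convention."""
    return lambda *vs: det([[w(v) for v in vs] for w in omegas])

# dx and dy on R^2; their wedge is the signed area form:
dx = lambda v: v[0]
dy = lambda v: v[1]
area = wedge(dx, dy)
print(area((1, 0), (0, 1)))  # 1
```

Swapping the two vector arguments flips the sign of `area`, as an alternating tensor must.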
Def: For any $n$-dimensional $\mathbb F$-vector space $V$, define a vector space by $$\bigwedge(V^*) := \bigoplus_{k = 0}^n {\textstyle \bigwedge}^{\!k} (V^*).$$We see that $\bigwedge(V^*)$ is an $\mathbb F$-vector space with dimension $\sum_{k=0}^n \binom nk = 2^n$.
Obs: We see that $\bigl(\bigwedge(V^*), \wedge\bigr)$ is an anticommutative graded algebra, i.e. $\omega\wedge\eta = (-1)^{kl}\,\eta\wedge\omega$ for $\omega\in\bigwedge^{\!k}(V^*)$ and $\eta\in\bigwedge^{\!l}(V^*)$; it is called the exterior algebra or Grassmann algebra of $V$.
Interior Multiplication
Def: Let $V$ be a finite-dimensional $\mathbb F$-vector space. For each $v \in V$, we define a linear map $i_v: \bigwedge^{\!k}(V^*) \to \bigwedge^{\!k-1}(V^*)$, called interior multiplication by $v$, as follows: $$i_v\omega(w_1,\dots, w_{k-1}) := \omega(v, w_1,\dots, w_{k-1}). $$In other words, $i_v\omega$ is obtained from $\omega$ by inserting $v$ into the first slot. By convention, we interpret $i_v\omega$ to be zero when $\omega$ is a $0$-covector. Another common notation is $$v\;\lrcorner\;\omega := i_v \omega .$$This is often read as '$v$ into $\omega$'.
Lemma: Let $V$ be a finite-dimensional $\mathbb F$-vector space and $v \in V$.
- $i_v \circ i_v = 0$.
- If $\omega \in \bigwedge^{\!k}(V^*)$ and $\eta \in \bigwedge^{\!l}(V^*)$, then $$i_v(\omega\wedge\eta) = (i_v \omega)\wedge \eta + (-1)^k\,\omega\wedge(i_v \eta). $$
Obs: When the wedge product is defined using the Alt convention, interior multiplication of a vector with a $k$-form has to be defined with an extra factor of $k$: $$\bar i_v \omega(w_1,\dots, w_{k-1}) = k\,\omega(v, w_1,\dots, w_{k-1}). $$This definition ensures that interior multiplication still satisfies $$\bar i_v(\omega\wedge\eta) = (\bar i_v \omega)\wedge \eta + (-1)^k\,\omega\wedge(\bar i_v \eta). $$
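For two $1$-covectors under the determinant convention (where no extra factor is needed), the antiderivation identity can be checked by direct evaluation; a small numeric sketch with names of my own choosing:

```python
# omega and eta are 1-covectors on R^2; under the determinant convention
# (omega ^ eta)(v1, v2) = omega(v1)*eta(v2) - omega(v2)*eta(v1).
omega = lambda u: u[0]            # "dx"
eta = lambda u: 2 * u[0] + u[1]   # "2dx + dy"
wedge2 = lambda v1, v2: omega(v1) * eta(v2) - omega(v2) * eta(v1)

v = (1.0, 2.0)  # the vector being inserted
x = (3.0, 5.0)  # the remaining argument

# Left side: i_v(omega ^ eta)(x) = (omega ^ eta)(v, x)
lhs = wedge2(v, x)
# Right side: (i_v omega) ^ eta + (-1)^1 omega ^ (i_v eta), evaluated at x.
# Here i_v omega = omega(v) is a scalar, so (i_v omega) ^ eta = omega(v) * eta.
rhs = omega(v) * eta(x) - eta(v) * omega(x)
print(lhs == rhs)  # True
```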
Duality
Let $(V, \langle\cdot,\cdot\rangle)$ be a vector space equipped with an inner product.
For each natural number $k$, we can define an inner product on $\bigwedge^{\!k}(V^*)$. For covectors $\omega^1,\dots,\omega^k, \eta^1,\dots,\eta^k \in V^*$, set $$\langle \omega^1\wedge\dots \wedge\omega^k,\ \eta^1\wedge\dots\wedge\eta^k\rangle := \det\bigl(\langle(\omega^i)^\sharp, (\eta^j)^\sharp\rangle\bigr). $$In other words, we send the covectors back to the original vector space using the musical isomorphism $\sharp$, and then compute the inner product there.
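On $\mathbb R^n$ with the standard inner product, $\sharp$ identifies a covector with its component vector, so for decomposable $2$-covectors the formula reduces to a $2\times 2$ Gram determinant. A sketch (the helper names are my own):

```python
def dot(a, b):
    """Standard inner product on R^n."""
    return sum(x * y for x, y in zip(a, b))

def inner_2covectors(w1, w2, e1, e2):
    """<w1 ^ w2, e1 ^ e2> = det of the 2x2 Gram matrix <w_i#, e_j#>.
    Covectors are identified with their component tuples, so sharp is the identity."""
    return dot(w1, e1) * dot(w2, e2) - dot(w1, e2) * dot(w2, e1)

# dx ^ dy on R^3 has unit norm:
print(inner_2covectors((1, 0, 0), (0, 1, 0), (1, 0, 0), (0, 1, 0)))  # 1
```

With this inner product, the wedges of distinct increasing pairs of orthonormal covectors come out orthonormal, e.g. $\langle dx\wedge dy,\, dx\wedge dz\rangle = 0$.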