We define the expected value of a random vector composed of two random variables with finite expected values as the vector of the expected values: $$E[(X, Y)] = (E[X], E[Y])$$
Similarly for an $n$-dimensional random vector: $$E[(X_1, \dots, X_n)] = (E[X_1], \dots, E[X_n])$$
We can define the covariance of two random variables, which measures how much each changes with respect to the other. It is defined as $$\operatorname{Cov}(X, Y) = E[(X - E[X])(Y - E[Y])]$$
Doing some algebra we can see that $$\operatorname{Cov}(X, Y) = E[XY] - E[X]E[Y]$$ and that $$\operatorname{Cov}(X, X) = \operatorname{Var}(X)$$
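As a sanity check, the shortcut $\operatorname{Cov}(X, Y) = E[XY] - E[X]E[Y]$ can be verified against the definition on a small finite joint distribution (the pmf values below are illustrative, not from the notes):

```python
# Hypothetical joint pmf of (X, Y) on a small finite support,
# chosen only to illustrate Cov(X, Y) = E[XY] - E[X]E[Y].
pmf = {(0, 0): 0.2, (0, 1): 0.3, (1, 0): 0.1, (1, 1): 0.4}

E_X = sum(p * x for (x, y), p in pmf.items())
E_Y = sum(p * y for (x, y), p in pmf.items())
E_XY = sum(p * x * y for (x, y), p in pmf.items())

# Covariance from the definition E[(X - E[X])(Y - E[Y])] ...
cov_def = sum(p * (x - E_X) * (y - E_Y) for (x, y), p in pmf.items())
# ... and from the shortcut E[XY] - E[X]E[Y]; the two agree.
cov_shortcut = E_XY - E_X * E_Y

print(cov_def, cov_shortcut)  # both are approximately 0.05
```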
It has a few properties:
Let $a, b \in \Bbb R$; then
$$\operatorname{Cov}(X, Y) = \operatorname{Cov}(Y, X)$$
$$\operatorname{Cov}(aX + b, Y) = a \operatorname{Cov}(X, Y)$$
$$\operatorname{Cov}(X + Z, Y) = \operatorname{Cov}(X, Y) + \operatorname{Cov}(Z, Y)$$
These properties are actually enough to tell us that the covariance is a bilinear operator.
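Bilinearity, e.g. $\operatorname{Cov}(aX + bZ, Y) = a\operatorname{Cov}(X, Y) + b\operatorname{Cov}(Z, Y)$, can be checked exactly on an arbitrary small joint pmf for $(X, Y, Z)$ (values below are illustrative only):

```python
from fractions import Fraction

# Illustrative joint pmf for (X, Y, Z); exact rational arithmetic
# so the bilinearity identity holds with equality, not approximately.
pmf = {(0, 0, 1): Fraction(1, 4), (0, 1, 0): Fraction(1, 4),
       (1, 0, 0): Fraction(1, 4), (1, 1, 1): Fraction(1, 4)}

def E(f):
    """Expected value of f(x, y, z) under the joint pmf."""
    return sum(p * f(x, y, z) for (x, y, z), p in pmf.items())

def cov(f, g):
    """Cov(f, g) via the shortcut E[fg] - E[f]E[g]."""
    return E(lambda x, y, z: f(x, y, z) * g(x, y, z)) - E(f) * E(g)

X = lambda x, y, z: x
Y = lambda x, y, z: y
Z = lambda x, y, z: z

a, b = Fraction(2), Fraction(-3)
lhs = cov(lambda x, y, z: a * x + b * z, Y)
rhs = a * cov(X, Y) + b * cov(Z, Y)
print(lhs == rhs)  # True
```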
If $X$ and $Y$ are independent, then we know that $\operatorname{Cov}(X, Y) = 0$. The converse is not true in general: $\operatorname{Cov}(X, Y) = 0$ does not imply that $X$ and $Y$ are independent.
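The standard counterexample makes the failed converse concrete: take $X$ uniform on $\{-1, 0, 1\}$ and $Y = X^2$. Then $\operatorname{Cov}(X, Y) = 0$, yet $Y$ is a function of $X$:

```python
from fractions import Fraction

# Counterexample: X uniform on {-1, 0, 1}, Y = X^2.
# Cov(X, Y) = 0, but X and Y are clearly dependent.
support = [-1, 0, 1]
p = Fraction(1, 3)

E_X = sum(p * x for x in support)          # 0
E_Y = sum(p * x**2 for x in support)       # 2/3
E_XY = sum(p * x * x**2 for x in support)  # E[X^3] = 0

cov = E_XY - E_X * E_Y
print(cov)  # 0

# Dependence: P(X = 0, Y = 0) differs from P(X = 0) * P(Y = 0).
p_joint = p                    # Y = 0 exactly when X = 0
p_product = p * Fraction(1, 3)
print(p_joint, p_product)      # 1/3 vs 1/9
```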
We can prove that $$\operatorname{Var}(X + Y) = \operatorname{Var}(X) + \operatorname{Var}(Y) + 2\operatorname{Cov}(X, Y)$$
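The identity $\operatorname{Var}(X + Y) = \operatorname{Var}(X) + \operatorname{Var}(Y) + 2\operatorname{Cov}(X, Y)$ can be checked exactly on a small joint pmf (values chosen only for illustration):

```python
from fractions import Fraction

# Illustrative joint pmf of (X, Y); rational arithmetic keeps the
# check exact.
pmf = {(0, 0): Fraction(1, 4), (0, 2): Fraction(1, 4),
       (1, 1): Fraction(1, 4), (2, 0): Fraction(1, 4)}

def E(f):
    """Expected value of f(x, y) under the joint pmf."""
    return sum(p * f(x, y) for (x, y), p in pmf.items())

var_X = E(lambda x, y: x**2) - E(lambda x, y: x)**2
var_Y = E(lambda x, y: y**2) - E(lambda x, y: y)**2
cov = E(lambda x, y: x * y) - E(lambda x, y: x) * E(lambda x, y: y)
var_sum = E(lambda x, y: (x + y)**2) - E(lambda x, y: x + y)**2

# Var(X + Y) = Var(X) + Var(Y) + 2 Cov(X, Y)
print(var_sum == var_X + var_Y + 2 * cov)  # True
```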
With this we can define the variance of a random vector composed of $n$ random variables with finite variance. It is the square matrix $$\operatorname{Var}(X) = \big(\operatorname{Cov}(X_i, X_j)\big)_{i, j = 1}^{n}$$
This matrix is symmetric, since $\operatorname{Cov}(X_i, X_j) = \operatorname{Cov}(X_j, X_i)$.
We can get the identity, let $a \in \Bbb R^n$: $$\operatorname{Var}(a^T X) = a^T \operatorname{Var}(X)\, a$$
The matrix is symmetric and positive semidefinite: for every $a \in \Bbb R^n$ we have $a^T \operatorname{Var}(X)\, a = \operatorname{Var}(a^T X) \ge 0$. It is positive definite unless some nontrivial linear combination of the components is almost surely constant.
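Both properties can be observed on a sample covariance matrix; the sketch below uses simulated data (the mixing matrix is arbitrary, purely for illustration):

```python
import numpy as np

rng = np.random.default_rng(0)
# Simulated data: 1000 observations of a 3-component vector with
# correlated components (the mixing matrix is illustrative only).
samples = rng.standard_normal((1000, 3)) @ np.array(
    [[1.0, 0.5, 0.0], [0.0, 1.0, 0.3], [0.0, 0.0, 1.0]]
)

# Sample covariance matrix: rows are observations, columns components.
sigma = np.cov(samples, rowvar=False)

# Symmetry: Sigma[i, j] = Cov(X_i, X_j) = Cov(X_j, X_i) = Sigma[j, i].
print(np.allclose(sigma, sigma.T))  # True

# Positive semidefiniteness: a^T Sigma a = Var(a^T X) >= 0 for all a,
# equivalently all eigenvalues are nonnegative.
eigenvalues = np.linalg.eigvalsh(sigma)
print(np.all(eigenvalues >= -1e-12))  # True
```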
Expected Value of functions of a Random Vector
Let $(X, Y)$ be a random vector and $\varphi : \Bbb R^2 \to \Bbb R$ be a Borel measurable function such that the random variable $\varphi(X, Y)$ has a finite expected value. Then $$E[\varphi(X, Y) ] = \int_{\Bbb R^2} \varphi(x, y) dF_{X, Y}(x, y)$$
From this we can actually get that:
Let $X$ and $Y$ have finite expected values; then $$E[X + Y] = E[X] + E[Y]$$
Let $X$ and $Y$ be independent random variables, and let $g, h : \Bbb R \to \Bbb R$ be Borel measurable functions such that $g(X)$ and $h(Y)$ have finite expected values; then $$E[g(X)\, h(Y)] = E[g(X)]\, E[h(Y)]$$
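The product rule for independent variables can be checked exactly with explicit marginals (the joint pmf factors by independence; the marginals and the functions $g, h$ below are arbitrary illustrations):

```python
from fractions import Fraction

# Independent X and Y with illustrative marginals; the joint pmf is
# the product of the marginals precisely because of independence.
px = {0: Fraction(1, 2), 1: Fraction(1, 2)}
py = {1: Fraction(1, 3), 2: Fraction(2, 3)}

def g(x):  # any Borel measurable function works; these are arbitrary
    return x**2 + 1

def h(y):
    return 3 * y

E_gh = sum(px[x] * py[y] * g(x) * h(y) for x in px for y in py)
E_g = sum(px[x] * g(x) for x in px)
E_h = sum(py[y] * h(y) for y in py)

# E[g(X) h(Y)] = E[g(X)] E[h(Y)] for independent X, Y
print(E_gh == E_g * E_h)  # True
```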