Linear Vector Spaces: Euclidean Vector Spaces
In these pages, a Euclidean vector space refers to an $n$-dimensional linear vector space $\mathbb{R}^n$ equipped with the Euclidean norm, the Euclidean metric, and the Euclidean dot product functions. These functions allow the definition of orthonormal basis sets, orthogonal projections, and (in $\mathbb{R}^3$) the cross product operation.
Orthonormal Basis
An orthonormal basis set is a basis set whose vectors satisfy two conditions: the vectors in the basis set are orthogonal to each other, and each vector has a unit norm.
Let $B=\{e_1,e_2,\ldots,e_n\}$ be an orthonormal basis set for $\mathbb{R}^n$. Since $B$ is a basis set, we know that $\forall x\in\mathbb{R}^n: x=\sum_{i=1}^n x_i e_i$. Since $B$ is an orthonormal basis set, $e_i\cdot e_j$ is equal to $1$ when $i=j$ and $0$ otherwise, so the components $x_i$ can be obtained using the dot product so that:
$$x_i=x\cdot e_i$$
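As a numerical illustration (a minimal NumPy sketch, not part of the original page; the rotated basis below is an invented example), the components $x_i=x\cdot e_i$ recover $x$ exactly:

```python
import numpy as np

# Hypothetical example: an orthonormal basis of R^3 obtained by rotating
# the standard basis 45 degrees about the third axis.
e1 = np.array([1.0, 1.0, 0.0]) / np.sqrt(2.0)
e2 = np.array([-1.0, 1.0, 0.0]) / np.sqrt(2.0)
e3 = np.array([0.0, 0.0, 1.0])

x = np.array([3.0, -2.0, 5.0])

# Components obtained via dot products: x_i = x . e_i
components = [x @ e for e in (e1, e2, e3)]

# Reassembling the linear combination recovers the original vector
x_rebuilt = sum(c * e for c, e in zip(components, (e1, e2, e3)))
```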
Orthogonal Projection
The dot product structure allows the definition of orthogonal projections. Given a nonzero vector $u\in\mathbb{R}^n$, the orthogonal projection allows the unique additive decomposition of any vector $x\in\mathbb{R}^n$ into two vectors $w_1$ and $w_2$ where $w_1$ is in the direction of $u$ and $w_2$ is orthogonal to $u$. The vector $w_1$ is called the orthogonal projection of $x$ onto $u$. It can be easily shown that $w_1$ and $w_2$ are equal to:
$$w_1=\frac{x\cdot u}{\|u\|^2}u,\qquad w_2=x-w_1$$
Proof
Let $x=w_1+w_2$. Since $w_1$ is in the direction of $u$, then $\exists\alpha\in\mathbb{R}$ such that $w_1=\alpha u$. Since $w_2$ is orthogonal to $u$ we have:
$$u\cdot x=u\cdot(\alpha u+w_2)=\alpha\|u\|^2\implies\alpha=\frac{x\cdot u}{\|u\|^2}$$
Therefore, $w_1=\frac{x\cdot u}{\|u\|^2}u$ and $w_2=x-w_1$.
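The decomposition $w_1=\frac{x\cdot u}{\|u\|^2}u$, $w_2=x-w_1$ can be checked numerically; the vectors below are invented examples (a minimal NumPy sketch, not part of the original page):

```python
import numpy as np

u = np.array([2.0, 1.0, -1.0])
x = np.array([1.0, 3.0, 2.0])

w1 = (x @ u) / (u @ u) * u   # orthogonal projection of x onto u
w2 = x - w1                  # remainder, orthogonal to u

# The projection is not symmetric: projecting u onto x gives a
# different vector in general.
u_on_x = (u @ x) / (x @ x) * x
```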
In the following tool, enter the components of the vectors $u$ and $x$. The tool draws the vectors $u$ and $x$ in black and blue respectively. The vectors $w_1$ and $w_2$ are calculated as above and drawn using dotted black arrows. Notice that the orthogonal projection of $x$ onto $u$ is not equal to the orthogonal projection of $u$ onto $x$.
Cross Product in $\mathbb{R}^3$
The structure of the Euclidean vector space $\mathbb{R}^3$ allows the definition of the cross product operation. Up to a choice of orientation, this is a unique map that gives a vector perpendicular to any two linearly independent vectors.
The cross product is the operation $\times:\mathbb{R}^3\times\mathbb{R}^3\rightarrow\mathbb{R}^3$ satisfying the following properties:
1. Denoting $w=u\times v$, the resulting vector $w$ is orthogonal to the two vectors $u$ and $v$: $w\cdot u=w\cdot v=0$
2. The operation is skew-symmetric: $u\times v=-v\times u$
3. The operation is distributive over addition: $u\times(v+w)=u\times v+u\times w$
4. The operation is compatible with scalar multiplication: $(\alpha u)\times v=u\times(\alpha v)=\alpha(u\times v)$
5. $\|u\times v\|^2=\|u\|^2\|v\|^2-(u\cdot v)^2$
The last property ensures that the norm of the resulting vector is equal to the area of the parallelogram formed by the two vectors $u$ and $v$.
The cross product operation is defined above using its algebraic properties. Equivalently, the cross product can be defined as follows, given $u,v\in\mathbb{R}^3$:
$$u\times v=\|u\|\,\|v\|\sin\theta\, n$$
where $\theta$ is the geometric angle between $u$ and $v$ while $n$ is a unit vector orthogonal to both $u$ and $v$ in the direction given by the right-hand rule.
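The agreement between the algebraic and geometric definitions can be verified for a pair of example vectors (a minimal NumPy sketch with invented vectors, not part of the original page):

```python
import numpy as np

u = np.array([1.0, 2.0, 0.0])
v = np.array([-1.0, 1.0, 3.0])
w = np.cross(u, v)

# Geometric angle between u and v
cos_theta = (u @ v) / (np.linalg.norm(u) * np.linalg.norm(v))
theta = np.arccos(cos_theta)

# ||u x v|| should equal ||u|| ||v|| sin(theta)
norm_geometric = np.linalg.norm(u) * np.linalg.norm(v) * np.sin(theta)
```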
In the following, the algebraic properties of the cross product are used to show the traditional properties of the cross product given an orthonormal basis set $B=\{e_1,e_2,e_3\}$:
The cross product of the basis vectors
$$e_1\times e_2=\pm e_3,\qquad e_2\times e_3=\pm e_1,\qquad e_3\times e_1=\pm e_2$$
where the positive sign is used to indicate a right-handed orientation.
Proof
This is straightforward from properties 1 and 5 above. From property 1, since $e_1\times e_2$ is perpendicular to both $e_1$ and $e_2$, then $\exists\alpha\in\mathbb{R}$ such that $e_1\times e_2=\alpha e_3$.
To find the value of $\alpha$, we use the last property as follows:
$$\alpha^2=\|e_1\times e_2\|^2=\|e_1\|^2\|e_2\|^2-(e_1\cdot e_2)^2=1\implies\alpha=\pm 1$$
The same argument applies to the remaining basis vector products.
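For the right-handed standard basis of $\mathbb{R}^3$ the positive sign applies, which can be confirmed numerically (a minimal NumPy sketch; `np.cross` follows the right-handed convention):

```python
import numpy as np

e1, e2, e3 = np.eye(3)   # right-handed standard basis of R^3

# Property 5 fixes the norm: ||e1 x e2||^2 = 1 * 1 - 0^2 = 1
norm_sq = np.linalg.norm(np.cross(e1, e2)) ** 2
```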
The cross product of linearly dependent vectors
$$u\times v=0\quad\Longleftrightarrow\quad u \text{ and } v \text{ are linearly dependent}$$
indicating, in particular, that if $u$ and $v$ are linearly dependent, then their cross product is equal to zero.
Proof
We first assume that $u$ and $v$ are linearly dependent, i.e., either one of them is the zero vector or $v=\alpha u$ for some $\alpha\in\mathbb{R}$. From property 4 above, if $u$ or $v$ is equal to $0$, then the cross product is equal to the zero vector. From property 5, if $v=\alpha u$, then:
$$\|u\times v\|^2=\|u\|^2\|\alpha u\|^2-(u\cdot\alpha u)^2=\alpha^2\|u\|^4-\alpha^2\|u\|^4=0$$
For the opposite direction, we assume that $u\times v=0$, that $u$ and $v$ are non-zero vectors, and that $u$ and $v$ are not linearly dependent. Using the orthogonal projection defined above, $\exists\alpha\in\mathbb{R}$ and $w_2\neq 0$ such that $v=\alpha u+w_2$ and $w_2\cdot u=0$. Then:
$$0=\|u\times v\|^2=\|u\times(\alpha u+w_2)\|^2=\|u\times w_2\|^2=\|u\|^2\|w_2\|^2-(u\cdot w_2)^2=\|u\|^2\|w_2\|^2>0$$
which is a contradiction. Therefore, $u\times v=0$ with $u\neq 0$ and $v\neq 0$ implies that $u$ and $v$ are linearly dependent.
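A quick numerical check of the forward direction (a minimal NumPy sketch with an invented vector; $v$ is taken as a scalar multiple of $u$):

```python
import numpy as np

u = np.array([1.0, -2.0, 4.0])
v = 3.0 * u                      # v is linearly dependent on u

w = np.cross(u, v)

# Property 5 predicts ||u x v||^2 = ||u||^2 ||v||^2 - (u.v)^2 = 0 here
lhs = np.linalg.norm(w) ** 2
rhs = (u @ u) * (v @ v) - (u @ v) ** 2
```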
The explicit representation of the cross product
$$u\times v=(u_2v_3-u_3v_2)e_1+(u_3v_1-u_1v_3)e_2+(u_1v_2-u_2v_1)e_3$$
Proof
This is a direct consequence of properties 3 and 4 above and the cross product of the basis vectors result: expand $u\times v=\left(\sum_{i=1}^3 u_ie_i\right)\times\left(\sum_{j=1}^3 v_je_j\right)$ and use $e_i\times e_i=0$ together with the right-handed products $e_1\times e_2=e_3$, $e_2\times e_3=e_1$, and $e_3\times e_1=e_2$.
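The component formula can be written directly and compared against `np.cross` (a minimal sketch; the function name `cross_explicit` and the test vectors are invented):

```python
import numpy as np

def cross_explicit(u, v):
    """u x v = (u2 v3 - u3 v2) e1 + (u3 v1 - u1 v3) e2 + (u1 v2 - u2 v1) e3."""
    return np.array([
        u[1] * v[2] - u[2] * v[1],
        u[2] * v[0] - u[0] * v[2],
        u[0] * v[1] - u[1] * v[0],
    ])

u = np.array([2.0, -1.0, 3.0])
v = np.array([0.0, 4.0, 1.0])
```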
The triple product
The triple product of any three vectors $u,v,w\in\mathbb{R}^3$ satisfies:
$$u\cdot(v\times w)=v\cdot(w\times u)=w\cdot(u\times v)$$
Proof
This is a direct consequence of properties 1, 2, and 3 above as follows. By property 1, $(u+v)\cdot((u+v)\times w)=0$. Expanding using property 3 and noting that $u\cdot(u\times w)=v\cdot(v\times w)=0$ by property 1:
$$0=(u+v)\cdot((u+v)\times w)=u\cdot(v\times w)+v\cdot(u\times w)$$
Hence, using property 2, $u\cdot(v\times w)=-v\cdot(u\times w)=v\cdot(w\times u)$.
The remaining equalities can be proven similarly.
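The cyclic equalities, and their identification with a determinant in an orthonormal basis, can be checked numerically (a minimal NumPy sketch with invented vectors):

```python
import numpy as np

u = np.array([1.0, 2.0, -1.0])
v = np.array([3.0, 0.0, 2.0])
w = np.array([-2.0, 1.0, 4.0])

t1 = u @ np.cross(v, w)
t2 = v @ np.cross(w, u)
t3 = w @ np.cross(u, v)

# The triple product also equals det([u | v | w]) in an orthonormal basis
d = np.linalg.det(np.column_stack([u, v, w]))
```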
The triple product of linearly dependent vectors
The triple product of any three vectors $u,v,w\in\mathbb{R}^3$ satisfies:
$$u\cdot(v\times w)=0\quad\Longleftrightarrow\quad u, v, \text{ and } w \text{ are linearly dependent}$$
Proof
One direction is straightforward: if $u$, $v$, and $w$ are linearly dependent, then there is a non-trivial combination such that $\alpha u+\beta v+\gamma w=0$. Without loss of generality, we can assume that $\alpha\neq 0$; then $u=-\frac{\beta}{\alpha}v-\frac{\gamma}{\alpha}w$ and, from properties 1 and 3, we get:
$$u\cdot(v\times w)=-\frac{\beta}{\alpha}\,v\cdot(v\times w)-\frac{\gamma}{\alpha}\,w\cdot(v\times w)=0$$
In the opposite direction we will argue by contradiction. Assume that $u\cdot(v\times w)=0$ but that $u$, $v$, and $w$ are linearly independent; then they form a basis of $\mathbb{R}^3$. Since $u\cdot(v\times w)=0$, the vector $v\times w$ is orthogonal to $u$. In addition, from property 1, $v\times w$ is orthogonal to $v$ and $w$. Therefore, $v\times w$ is orthogonal to every vector in the space, including itself! This is a contradiction (why?). Therefore, $u$, $v$, and $w$ are linearly dependent.
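The forward direction can be illustrated numerically by constructing $u$ as a linear combination of $v$ and $w$ (a minimal NumPy sketch with invented vectors):

```python
import numpy as np

v = np.array([1.0, 0.0, 2.0])
w = np.array([0.0, 1.0, -1.0])
u = 2.0 * v - 3.0 * w    # u, v, w are linearly dependent by construction

triple = u @ np.cross(v, w)
```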
The triple product of orthonormal vectors
If $u,v,w\in\mathbb{R}^3$ are orthonormal then:
$$u\cdot(v\times w)=\pm 1$$
Proof
Similar to the proof for the cross product of the basis vectors.
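The sign of the triple product tracks the orientation of the orthonormal triad; a minimal NumPy check using the standard basis (an invented example, not from the original page):

```python
import numpy as np

e1, e2, e3 = np.eye(3)

t_right = e1 @ np.cross(e2, e3)   # right-handed triad
t_left = e2 @ np.cross(e1, e3)    # swapping two vectors flips the sign
```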
Exercise:
Show that $\|u\times v\|=\|u\|\,\|v\|\sin\theta$, where $\theta$ is the angle between $u$ and $v$.
The following tool calculates the cross product $w=u\times v$. The vectors $u$, $v$, and $w$ are drawn in blue, red, and black respectively. The plane joining $u$ and $v$ is highlighted. Notice that $w$ is orthogonal to that plane.