Linear Vector Spaces: Euclidean Vector Spaces
In these pages, a Euclidean vector space refers to an \(n\)-dimensional linear vector space \(\mathbb{R}^n\) equipped with the Euclidean norm, the Euclidean metric, and the Euclidean dot product functions. These functions allow the definition of orthonormal basis sets, orthogonal projections, and the cross product operation.
Orthonormal Basis
An orthonormal basis set is a basis set whose vectors satisfy two conditions: first, the vectors in the set are mutually orthogonal; second, each vector has a unit norm.
Let \(B=\{e_1,e_2,\cdots,e_n\}\) be an orthonormal basis set for \(\mathbb{R}^n\). Since \(B\) is a basis set, we know that \(\forall x\in\mathbb{R}^n:\exists x_1,x_2,\cdots,x_n\in\mathbb{R}\) such that \(x=x_1e_1+x_2e_2+\cdots+x_ne_n\). Since \(B\) is an orthonormal basis set, the components \(x_i\) can be obtained using the dot product so that:

\[ x=(x\cdot e_1)e_1+(x\cdot e_2)e_2+\cdots+(x\cdot e_n)e_n \]
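As a quick numeric illustration (a sketch using NumPy; the basis and the vector below are arbitrary example values), the components of a vector in an orthonormal basis can be recovered with dot products:

```python
import numpy as np

# A sample orthonormal basis for R^3: the standard basis rotated by 45 degrees
# about the e3 axis (any rotation of the standard basis remains orthonormal).
c, s = np.cos(np.pi / 4), np.sin(np.pi / 4)
e1 = np.array([c, s, 0.0])
e2 = np.array([-s, c, 0.0])
e3 = np.array([0.0, 0.0, 1.0])

x = np.array([1.0, 2.0, 3.0])

# x = (x.e1)e1 + (x.e2)e2 + (x.e3)e3
reconstructed = (x @ e1) * e1 + (x @ e2) * e2 + (x @ e3) * e3
```

The reconstruction matches \(x\) exactly because the basis vectors are mutually orthogonal with unit norm.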
Orthogonal Projection
The dot product structure allows the definition of orthogonal projections. Given a nonzero vector \(y\in\mathbb{R}^n\), the orthogonal projection allows the unique additive decomposition of any vector \(x\in\mathbb{R}^n\) into two vectors \(a\) and \(b\), where \(a\) is in the direction of \(y\) and \(b\) is orthogonal to \(y\). The vector \(a\) is called the orthogonal projection of \(x\) onto \(y\). It can be easily shown that \(a\) and \(b\) are equal to:

\[ a=\left({x\cdot y \over \|y\|^2}\right)y\hspace{10mm} b=x-a \]
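A minimal sketch of this decomposition in NumPy (the vectors `x` and `y` below are arbitrary example values):

```python
import numpy as np

def orthogonal_projection(x, y):
    """Decompose x = a + b, with a parallel to y and b orthogonal to y."""
    a = (x @ y) / (y @ y) * y   # a = (x.y / ||y||^2) y
    b = x - a
    return a, b

x = np.array([3.0, 1.0, 2.0])
y = np.array([1.0, 1.0, 0.0])
a, b = orthogonal_projection(x, y)
```

Swapping the roles of `x` and `y` gives a different projection, which is the asymmetry noted in the tool description below.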
Proof
Let \(x=a+b\). Since \(a\) is in the direction of \(y\), then \(\exists\alpha\in\mathbb{R}\) such that \(a=\alpha y\). Since \(b=x-a\) is orthogonal to \(y\) we have:

\[ 0=b\cdot y=(x-\alpha y)\cdot y=x\cdot y-\alpha\|y\|^2\Rightarrow \alpha={x\cdot y \over \|y\|^2} \]

Therefore, \(a=\left({x\cdot y \over \|y\|^2}\right)y\) and \(b=x-a\).

In the following tool, enter the components of the vectors \(x\) and \(y\). The tool draws the vectors \(x\) and \(y\) in black and blue respectively. The vectors \(a\) and \(b\) are calculated as above and drawn using dotted black arrows. Notice that the orthogonal projection of \(x\) onto \(y\) is not equal to the orthogonal projection of \(y\) onto \(x\).
Cross Product in \(\mathbb{R}^3\)
The structure of the Euclidean vector space \(\mathbb{R}^3\) allows the definition of the cross product operation. This is a map that gives a vector perpendicular to any two linearly independent vectors.
The cross product is the operation \(\times:\mathbb{R}^3\times\mathbb{R}^3\rightarrow\mathbb{R}^3\) satisfying the following properties \(\forall u,v,w\in\mathbb{R}^3\) and \(\forall\alpha\in\mathbb{R}\):
1. Denoting \(z=u\times v\), the resulting vector \(z\) is orthogonal to the two vectors \(u\) and \(v\):

\[ z\cdot u=z\cdot v = 0 \]
2. The operation is skew-symmetric:

\[ u\times v= -v \times u \]
3. The operation is distributive over addition:

\[ u\times(v+w)=u\times v + u\times w \]
4. The operation is compatible with scalar multiplication:

\[ (\alpha u)\times v=\alpha(u\times v)=u\times(\alpha v) \]
5. The norm of the resulting vector satisfies:

\[ \|u\times v\|^2=(u\cdot u)(v\cdot v)-(u\cdot v)^2 \]

The last property ensures that the norm of the resulting vector \(u\times v\) is equal to the area of the parallelogram formed by the two vectors \(u\) and \(v\).
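The norm property can be checked numerically (a NumPy sketch with arbitrary example vectors); the last lines also verify the parallelogram-area interpretation:

```python
import numpy as np

u = np.array([2.0, 0.0, 1.0])
v = np.array([1.0, 3.0, 0.0])
z = np.cross(u, v)

# ||u x v||^2 = (u.u)(v.v) - (u.v)^2
lhs = z @ z
rhs = (u @ u) * (v @ v) - (u @ v) ** 2

# Parallelogram area: ||u|| ||v|| sin(theta)
cos_theta = (u @ v) / (np.linalg.norm(u) * np.linalg.norm(v))
area = np.linalg.norm(u) * np.linalg.norm(v) * np.sqrt(1.0 - cos_theta**2)
```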
The cross product operation is defined above using its algebraic properties. Equivalently, the cross product can be defined as follows, given \(u,v\in\mathbb{R}^3\):

\[ z:=u\times v = \|u\|\|v\|\sin\theta\, n \]
where \(\theta\) is the geometric angle between \(u\) and \(v\) while \(n\) is a unit vector orthogonal to both \(u\) and \(v\) in the direction given by the right-hand rule.
In the following, the algebraic properties of the cross product are used to show the traditional properties of the cross product given an orthonormal basis set \(B=\{e_1,e_2,e_3\}\):
The cross product of the basis vectors

\[ e_1\times e_2=\pm e_3\hspace{10mm}e_2\times e_3=\pm e_1\hspace{10mm}e_3\times e_1=\pm e_2 \]

where the positive sign is used to indicate a right-handed orientation.
Proof
This is straightforward from properties 1 and 5 above. From property 1, since \(z=e_1\times e_2\) is perpendicular to both \(e_1\) and \(e_2\), then \(\exists\alpha\in\mathbb{R}\) such that \(z=\alpha e_3\). To find the value of \(\alpha\), we use the last property as follows:

\[ \|z\|^2=\alpha^2=(e_1\cdot e_1)(e_2\cdot e_2)-(e_1\cdot e_2)^2=1\Rightarrow \alpha=\pm 1 \]

The cross product of linearly dependent vectors

\[ u\times v = 0 \Leftrightarrow u=0 \text{ or } v=0 \text{ or } u=\alpha v \]

indicating that if \(u\) and \(v\) are linearly dependent, then their cross product is equal to zero.
Proof
We first assume that either \(u=0\), \(v=0\), or \(u=\alpha v\). From property 4 above, if \(u\) or \(v\) is equal to \(0\) then the cross product is equal to the zero vector. From property 5, if \(u=\alpha v\), then:

\[ \|z\|^2=(\alpha v\cdot \alpha v)(v \cdot v)-(\alpha v\cdot v)^2=\alpha^2 \left((v\cdot v)^2-(v\cdot v)^2\right)=0\Rightarrow z=0 \]

Then, we assume that \(u\times v=0\) and that \(u\) and \(v\) are non-zero vectors and that \(u\) and \(v\) are not linearly dependent. Using the orthogonal projection defined above, \(\exists\alpha\in\mathbb{R}\) and \(\exists b\in\mathbb{R}^3\) such that \(u=\alpha v+b\) and \(b\cdot v=0\) with \(b\neq 0\). Then:

\[ \|z\|^2=(u\cdot u)(v \cdot v)-(u\cdot v)^2=((\alpha v + b)\cdot (\alpha v + b))(v \cdot v)-((\alpha v + b)\cdot v)^2=(b\cdot b)(v\cdot v)\neq 0 \]

which is a contradiction; therefore, \(u\times v=0\Rightarrow u=0\), \(v=0\), or \(u=\alpha v\) with \(\alpha\in\mathbb{R}\).

The explicit representation of the cross product

\[ u\times v = (u_2v_3-u_3v_2)e_1+(u_3v_1-u_1v_3)e_2+(u_1v_2-u_2v_1)e_3 \]
Proof
This is a direct consequence of property 3 above and the cross product of the basis vectors result.
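The explicit representation can be coded directly and compared against a library implementation (a NumPy sketch; the input vectors are arbitrary example values):

```python
import numpy as np

def cross(u, v):
    """Explicit component formula for u x v in a right-handed orthonormal basis."""
    return np.array([
        u[1] * v[2] - u[2] * v[1],
        u[2] * v[0] - u[0] * v[2],
        u[0] * v[1] - u[1] * v[0],
    ])

u = np.array([1.0, 2.0, 3.0])
v = np.array([4.0, 5.0, 6.0])
z = cross(u, v)
```

The result agrees with `np.cross` and is orthogonal to both inputs, as property 1 requires.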

The triple product \(u\cdot(v\times w)\)
The triple product of any three vectors \(u,v,w\in\mathbb{R}^3\) satisfies:

\[ u\cdot (v\times w)=v\cdot (w\times u)=w\cdot (u\times v)=-u\cdot (w\times v)=-v\cdot (u\times w)=-w\cdot (v\times u) \]
Proof
This is a direct consequence of properties 1 and 2 above as follows:
\[ (u+v)\cdot ((u+v)\times w) =0\Rightarrow u\cdot(v\times w)+v\cdot(u\times w)=0\Rightarrow u\cdot (v\times w)=-v\cdot (u\times w)=v\cdot (w\times u) \]

The remaining equalities can be proven similarly.
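The cyclic and sign-flip identities can be spot-checked numerically (a NumPy sketch using random example vectors); the last assertion also uses the well-known fact that the triple product equals the determinant of the matrix whose rows are \(u\), \(v\), \(w\):

```python
import numpy as np

def triple(u, v, w):
    """Triple product u . (v x w)."""
    return u @ np.cross(v, w)

rng = np.random.default_rng(0)
u, v, w = rng.standard_normal((3, 3))  # three random example vectors
t = triple(u, v, w)
```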

The triple product \(u\cdot(v\times w)\) of linearly dependent vectors
The triple product of any three vectors \(u,v,w\in\mathbb{R}^3\) satisfies:

\[ u\cdot (v\times w)=0 \Leftrightarrow u,v,w \text{ are linearly dependent} \]
Proof
One direction is straightforward: if \(u,v,w\) are linearly dependent, then there is a nontrivial combination such that \(\alpha_1 u+\alpha_2 v+\alpha_3 w=0\). Without loss of generality, we can assume that \(\alpha_1\neq 0\); then, from properties 1 and 3 we get:

\[ \left(\frac{\alpha_2}{\alpha_1}v+\frac{\alpha_3}{\alpha_1}w\right)\cdot(v\times w)=\frac{\alpha_2}{\alpha_1}v\cdot(v\times w)+\frac{\alpha_3}{\alpha_1}w\cdot(v\times w)=0 \]

and since \(u=-\frac{\alpha_2}{\alpha_1}v-\frac{\alpha_3}{\alpha_1}w\), it follows that \(u\cdot(v\times w)=0\). In the opposite direction we argue by contradiction. Assuming that \(u\cdot(v\times w)=0\) while \(u\), \(v\), and \(w\) are linearly independent, then they form a basis in \(\mathbb{R}^3\). Since \(u\cdot(v\times w)=0\), the vector \(v\times w\) is orthogonal to \(u\). In addition, from property 1, \(v\times w\) is orthogonal to \(v\) and \(w\). Therefore, \(v\times w\) is orthogonal to every vector in the space including itself! This is a contradiction (why?). Therefore, \(u\), \(v\), and \(w\) are linearly dependent.

The triple product \(u\cdot(v\times w)\) of orthonormal vectors
If \(u,v,w\in\mathbb{R}^3\) are orthonormal then:

\[ u\cdot (v\times w)=\pm 1 \]
Proof
Similar to the proof for the cross product of the basis vectors.
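As a numeric illustration (a NumPy sketch; the orthonormal triad below is built by rotating the standard basis, an arbitrary example), the triple product of an orthonormal triad is \(+1\) for a right-handed ordering and \(-1\) after swapping two vectors:

```python
import numpy as np

# An orthonormal triad: columns of a rotation matrix (rotation about e3).
c, s = np.cos(0.3), np.sin(0.3)
R = np.array([[c, -s, 0.0],
              [s,  c, 0.0],
              [0.0, 0.0, 1.0]])
u, v, w = R[:, 0], R[:, 1], R[:, 2]

t = u @ np.cross(v, w)          # +1: right-handed ordering
t_flipped = u @ np.cross(w, v)  # -1: two vectors swapped
```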

Exercise:
Show that \(\|u\times v\|=\|u\|\|v\|\sin\theta\), where \(\theta\) is the angle between \(u\) and \(v\).
The following tool calculates the cross product \(u\times v\). The vectors \(u\), \(v\), and \(u\times v\) are drawn in blue, red, and black respectively. The plane joining \(u\) and \(v\) is highlighted. Notice that \(u\times v\) is orthogonal to that plane.
