Special Types of Linear Maps: Symmetric Tensors
Learning Outcomes
- Identify the geometric function of a symmetric tensor.
- Explain that a tensor $S:\mathbb{R}^n\rightarrow\mathbb{R}^n$ is symmetric if and only if it possesses $n$ real eigenvalues associated with $n$ orthonormal eigenvectors.
- Carry out the diagonalization of a symmetric matrix, and recognize the geometric meaning of the process as a simple change of basis into the basis of eigenvectors of the symmetric matrix.
- Recognize that computing the invariants of a symmetric matrix is much simpler in its diagonalized form than in a general form.
Symmetric Tensor Definition
Let $S:\mathbb{R}^n\rightarrow\mathbb{R}^n$ be a linear map (tensor). $S$ is called a symmetric tensor if $S=S^T$.
The following properties can be naturally deduced from the definition of symmetric tensors:
- In component form, the matrix representation of $S$ is such that $S_{ij}=S_{ji}$.
- $\forall u,v\in\mathbb{R}^n$ we have: $u\cdot Sv=v\cdot Su$.
- $\forall Q$ orthogonal: $QSQ^T$ is symmetric. In particular, if $Q$ is an orthogonal matrix associated with a coordinate transformation, then the matrix representation $S'=QSQ^T$ of $S$ stays symmetric in any coordinate system.
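These properties can be checked numerically. The following sketch (using NumPy, with an arbitrarily chosen symmetric matrix $S$ and a rotation as the orthogonal matrix $Q$) verifies the component symmetry, the identity $u\cdot Sv=v\cdot Su$, and that $QSQ^T$ stays symmetric:

```python
import numpy as np

# A hypothetical 3x3 symmetric matrix, chosen arbitrarily for illustration
S = np.array([[2.0, 1.0, 0.0],
              [1.0, 3.0, 1.0],
              [0.0, 1.0, 4.0]])

# Component form of the definition: S_ij = S_ji
assert np.allclose(S, S.T)

# u . S v = v . S u for arbitrary vectors u and v
u, v = np.array([1.0, -2.0, 0.5]), np.array([0.0, 3.0, 1.0])
assert np.isclose(u @ (S @ v), v @ (S @ u))

# An orthogonal matrix Q: a rotation about the e3 axis by 30 degrees
t = np.pi / 6
Q = np.array([[np.cos(t), -np.sin(t), 0.0],
              [np.sin(t),  np.cos(t), 0.0],
              [0.0,        0.0,       1.0]])
assert np.allclose(Q @ Q.T, np.eye(3))  # Q is indeed orthogonal

# The transformed representation S' = Q S Q^T stays symmetric
Sp = Q @ S @ Q.T
assert np.allclose(Sp, Sp.T)
```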
Symmetric Tensors Possess Real Eigenvalues and an Orthonormal Set of Eigenvectors
Symmetric tensors form a very important class of tensors that appear in many engineering applications. The following assertion leads to the simplification of the study of symmetric tensors.
ASSERTION:
A tensor $S:\mathbb{R}^n\rightarrow\mathbb{R}^n$ is symmetric if and only if it possesses $n$ real eigenvalues associated with $n$ orthonormal eigenvectors.
PROOF:
Let $\mathbb{C}$ be the space of complex numbers. First we will recall a few facts about complex numbers: If $\lambda\in\mathbb{C}$, then $\exists a,b\in\mathbb{R}$ such that $\lambda=a+bi$ where $i=\sqrt{-1}$. The conjugate of $\lambda$, namely $\bar{\lambda}$, is defined as $\bar{\lambda}=a-bi$. The product $\lambda\bar{\lambda}=a^2+b^2\geq 0$. Additionally, $\lambda=\bar{\lambda}$ if and only if $b=0$, i.e., if and only if $\lambda\in\mathbb{R}$.
If $\lambda$ is an eigenvalue of $S$, then it is a solution to the degree-$n$ polynomial equation $\det(S-\lambda I)=0$. In general, this equation has $n$ solutions (roots) and some of these roots can be complex. We will now argue by contradiction: Assuming that $\lambda\notin\mathbb{R}$ is a complex eigenvalue of $S$, then $\exists p\in\mathbb{C}^n$, $p\neq 0$, such that:

$$Sp=\lambda p$$

Taking the conjugate of the above relation (noting that $S$ has real components):

$$S\bar{p}=\bar{\lambda}\bar{p}$$

If we then take the dot product of the first equation with $\bar{p}$ and the second equation with $p$ we get:

$$\bar{p}\cdot Sp=\lambda(\bar{p}\cdot p)\qquad p\cdot S\bar{p}=\bar{\lambda}(p\cdot\bar{p})$$

But, since $S$ is symmetric we have:

$$\bar{p}\cdot Sp=p\cdot S\bar{p}\quad\Rightarrow\quad\lambda(\bar{p}\cdot p)=\bar{\lambda}(\bar{p}\cdot p)$$

However, we have $\bar{p}\cdot p>0$ since $p\neq 0$. Therefore, $\lambda=\bar{\lambda}$ and therefore, $\lambda\in\mathbb{R}$, which contradicts the assumption.
The next step is to show that we can find a set of $n$ orthonormal eigenvectors associated with the real eigenvalues of $S$. We first show this for any two distinct eigenvalues $\lambda_1\neq\lambda_2$. Let $p_1$ and $p_2$ be the corresponding eigenvectors, then:

$$\lambda_1(p_2\cdot p_1)=p_2\cdot Sp_1=p_1\cdot Sp_2=\lambda_2(p_1\cdot p_2)\quad\Rightarrow\quad(\lambda_1-\lambda_2)(p_1\cdot p_2)=0$$

But since $\lambda_1\neq\lambda_2$, then $p_1$ is orthogonal to $p_2$.
Next, we assume that there is an eigenvalue $\lambda$ with multiplicity $k>1$, i.e.,

$$\det(S-xI)=(\lambda-x)^k f(x) \qquad (1)$$

where $f$ is a real valued polynomial function with $f(\lambda)\neq 0$. We need to show that there is a set of $k$ orthogonal (linearly independent) eigenvectors associated with the eigenvalue $\lambda$.
Let $p_1$ with $\|p_1\|=1$ be an eigenvector associated with $\lambda$. We can choose a set of vectors $q_2,q_3,\dots,q_n$ that forms with $p_1$ an orthonormal basis set. Then, the following is the coordinate transformation matrix $Q$ to change from the basis set $B=\{e_1,e_2,\dots,e_n\}$ to $B'=\{p_1,q_2,\dots,q_n\}$, whose rows are the components of the new basis vectors:

$$Q=\begin{pmatrix}p_1^T\\ q_2^T\\ \vdots\\ q_n^T\end{pmatrix}$$
If $S'$ is the matrix representation of $S$ in the coordinate system of $B'$, then the components of $S'=QSQ^T$ can be evaluated as follows:

$$S'_{11}=p_1\cdot Sp_1,\qquad S'_{1j}=p_1\cdot Sq_j,\qquad S'_{i1}=q_i\cdot Sp_1,\qquad S'_{ij}=q_i\cdot Sq_j\qquad(i,j\geq 2)$$

Notice that $Sp_1=\lambda p_1$ and $q_i\cdot p_1=0$. Therefore:

$$S'_{11}=\lambda,\qquad S'_{1j}=S'_{j1}=0\qquad(j\geq 2)$$

The above expression can be simplified if we consider the following $(n-1)\times(n-1)$ matrix $S_1$ with components:

$$(S_1)_{ij}=q_{i+1}\cdot Sq_{j+1}\qquad(1\leq i,j\leq n-1)$$

Then, $S'$ can be written in the following simplified block form:

$$S'=\begin{pmatrix}\lambda & 0\\ 0 & S_1\end{pmatrix}$$
From the properties of the determinant (the determinant of a block triangular matrix is the product of the determinants of the diagonal blocks), the characteristic equation of $S'$ is:

$$\det(S'-xI)=(\lambda-x)\det(S_1-xI_{n-1})$$

Where $I_{n-1}$ is the identity matrix defined on the subspace of the basis vectors $q_2,q_3,\dots,q_n$. Since $S$ and $S'$ share the same characteristic equation, it follows from (1) with $k>1$ that $\lambda$ is an eigenvalue for the matrix $S_1$, i.e., it is an eigenvalue of $S'$. Let $p_2\in\mathrm{span}\{q_2,\dots,q_n\}$ be the corresponding eigenvector, then $Sp_2=\lambda p_2$. Therefore, $p_2\cdot p_1=0$. Therefore, $S$ has at least two perpendicular eigenvectors associated with $\lambda$. By repeating the above argument, we can find $k$ orthogonal eigenvectors associated with $\lambda$. Note that it is not possible to find more than $k$ orthogonal eigenvectors associated with $\lambda$, otherwise this would lead to the contradictory conclusion that there are more than $n$ orthogonal eigenvectors in $\mathbb{R}^n$.
Note that the choice of the eigenvectors is not necessarily unique since if $p$ is an eigenvector of $S$, then so is $\alpha p$ for all nonzero real numbers $\alpha$. In addition, if $p_1$ and $p_2$ are two orthogonal eigenvectors associated with an eigenvalue $\lambda$, then there are infinite choices of sets of two orthonormal eigenvectors associated with the eigenvalue $\lambda$ (why?).
It is now left to show that if $S$ has $n$ real eigenvalues associated with $n$ orthonormal eigenvectors, then $S$ is symmetric. Let $\{p_1,p_2,\dots,p_n\}$ be the set of orthonormal eigenvectors associated (respectively) with the set of eigenvalues $\{\lambda_1,\lambda_2,\dots,\lambda_n\}$, then $\forall u,v\in\mathbb{R}^n$: $u=\sum_{i=1}^n(u\cdot p_i)p_i$ and $v=\sum_{i=1}^n(v\cdot p_i)p_i$. Therefore:

$$u\cdot Sv=\sum_{i=1}^n\lambda_i(u\cdot p_i)(v\cdot p_i)$$

Similarly:

$$v\cdot Su=\sum_{i=1}^n\lambda_i(v\cdot p_i)(u\cdot p_i)$$

Therefore, $u\cdot Sv=v\cdot Su$ and $S$ is symmetric.
◼
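As a numerical illustration of the assertion (not a substitute for the proof), NumPy's `np.linalg.eigh`, which is specialized for symmetric (Hermitian) matrices, returns real eigenvalues and an orthonormal set of eigenvectors for an arbitrarily chosen symmetric matrix:

```python
import numpy as np

# Arbitrary symmetric matrix for illustration
S = np.array([[4.0, 1.0, 2.0],
              [1.0, 5.0, 0.0],
              [2.0, 0.0, 3.0]])

# eigh returns the eigenvalues as a real array by construction
# (compare np.linalg.eigvals on a general matrix, which may be complex)
eigvals, eigvecs = np.linalg.eigh(S)
assert np.all(np.isreal(eigvals))

# The columns of eigvecs form an orthonormal set: eigvecs^T eigvecs = I
assert np.allclose(eigvecs.T @ eigvecs, np.eye(3))

# Each column is an eigenvector: S p_i = lambda_i p_i
for lam, p in zip(eigvals, eigvecs.T):
    assert np.allclose(S @ p, lam * p)
```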
Representation of Symmetric Tensors in $\mathbb{R}^3$
From the previous section, a symmetric tensor $S:\mathbb{R}^3\rightarrow\mathbb{R}^3$ possesses three perpendicular eigenvectors, say $p_1$, $p_2$ and $p_3$, associated with the eigenvalues $\lambda_1$, $\lambda_2$ and $\lambda_3$ respectively. Assuming that $p_1$, $p_2$ and $p_3$ form a right handed orthonormal basis set in $\mathbb{R}^3$, then $\forall v\in\mathbb{R}^3$: $v=(v\cdot p_1)p_1+(v\cdot p_2)p_2+(v\cdot p_3)p_3$. Then:

$$Sv=\lambda_1(v\cdot p_1)p_1+\lambda_2(v\cdot p_2)p_2+\lambda_3(v\cdot p_3)p_3$$
Therefore, $S$ admits the representation:

$$S=\lambda_1\,p_1\otimes p_1+\lambda_2\,p_2\otimes p_2+\lambda_3\,p_3\otimes p_3 \qquad (2)$$
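The representation (2) can be verified numerically. The following sketch (NumPy, with an arbitrarily chosen symmetric matrix) rebuilds $S$ as $\sum_i\lambda_i\,p_i\otimes p_i$, where the outer product $p_i\otimes p_i$ is `np.outer(p, p)`:

```python
import numpy as np

# Arbitrary symmetric matrix; note it has a repeated eigenvalue (2, 2, 4),
# so the spectral representation also works with multiplicity > 1
S = np.array([[3.0, 1.0, 0.0],
              [1.0, 3.0, 0.0],
              [0.0, 0.0, 2.0]])

eigvals, eigvecs = np.linalg.eigh(S)

# Rebuild S as the sum of lambda_i (p_i outer p_i) over the eigenpairs
S_rebuilt = sum(lam * np.outer(p, p) for lam, p in zip(eigvals, eigvecs.T))
assert np.allclose(S_rebuilt, S)
```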
Note that this representation is not restricted to $\mathbb{R}^3$ but can be extended to any finite dimensional vector space $\mathbb{R}^n$.
Diagonalization of Symmetric Matrices in $\mathbb{R}^3$
The representation of a symmetric tensor shown in (2) implies that if a coordinate system aligned with the eigenvectors $p_1$, $p_2$ and $p_3$ is chosen, then $S$ admits a diagonal matrix representation.
This is easy to see when we consider a coordinate transformation from the basis set $B=\{e_1,e_2,e_3\}$ to the basis set $B'=\{p_1,p_2,p_3\}$. The matrix of transformation $Q$, whose rows are the components of the eigenvectors in the basis $B$, has the form:

$$Q=\begin{pmatrix}p_1^T\\ p_2^T\\ p_3^T\end{pmatrix}$$
If $S'$ is the representation of $S$ in the coordinate system described by $p_1$, $p_2$ and $p_3$, then:

$$S'=QSQ^T$$
Utilizing the representation (2) and the identity $Q(a\otimes b)Q^T=(Qa)\otimes(Qb)$:

$$S'=\lambda_1(Qp_1)\otimes(Qp_1)+\lambda_2(Qp_2)\otimes(Qp_2)+\lambda_3(Qp_3)\otimes(Qp_3)$$

Since $Qp_1=e_1$, $Qp_2=e_2$ and $Qp_3=e_3$, then:

$$S'=\lambda_1\,e_1\otimes e_1+\lambda_2\,e_2\otimes e_2+\lambda_3\,e_3\otimes e_3=\begin{pmatrix}\lambda_1&0&0\\0&\lambda_2&0\\0&0&\lambda_3\end{pmatrix}$$
Note that this result is not restricted to $\mathbb{R}^3$ but can be extended to any finite dimensional vector space $\mathbb{R}^n$.
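A minimal numerical sketch of the diagonalization, assuming an arbitrarily chosen symmetric matrix: the rows of $Q$ are the orthonormal eigenvectors returned by `np.linalg.eigh`, and $QSQ^T$ comes out diagonal with the eigenvalues on the diagonal:

```python
import numpy as np

# Arbitrary symmetric matrix for illustration
S = np.array([[2.0, 0.5, 0.0],
              [0.5, 2.0, 1.0],
              [0.0, 1.0, 3.0]])

eigvals, eigvecs = np.linalg.eigh(S)

# Rows of Q are the eigenvectors, so Q maps each p_i to e_i
Q = eigvecs.T
Sp = Q @ S @ Q.T

# S' = Q S Q^T is diagonal with the eigenvalues on the diagonal
assert np.allclose(Sp, np.diag(eigvals))
```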
Principal Invariants of a Symmetric Matrix in $\mathbb{R}^3$
The diagonalization of a symmetric matrix $S$ described in the previous section allows expressing the three principal invariants of a symmetric matrix in terms of the three eigenvalues $\lambda_1$, $\lambda_2$ and $\lambda_3$ as follows:

$$I_1(S)=\lambda_1+\lambda_2+\lambda_3,\qquad I_2(S)=\lambda_1\lambda_2+\lambda_2\lambda_3+\lambda_3\lambda_1,\qquad I_3(S)=\lambda_1\lambda_2\lambda_3$$
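These expressions can be checked against the familiar matrix forms of the invariants, $I_1=\operatorname{tr}S$, $I_2=\tfrac{1}{2}\big((\operatorname{tr}S)^2-\operatorname{tr}(S^2)\big)$ and $I_3=\det S$, for an arbitrarily chosen symmetric matrix:

```python
import numpy as np

# Arbitrary symmetric matrix for illustration
S = np.array([[5.0, 1.0, 0.0],
              [1.0, 4.0, 2.0],
              [0.0, 2.0, 3.0]])

l1, l2, l3 = np.linalg.eigvalsh(S)

# Invariants in terms of the eigenvalues
I1 = l1 + l2 + l3
I2 = l1 * l2 + l2 * l3 + l3 * l1
I3 = l1 * l2 * l3

# They agree with the direct matrix expressions
assert np.isclose(I1, np.trace(S))
assert np.isclose(I2, (np.trace(S) ** 2 - np.trace(S @ S)) / 2)
assert np.isclose(I3, np.linalg.det(S))
```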
Change the entries for the components of the symmetric matrix $S$ and the tool will find the eigenvalues, the eigenvectors and the new coordinate system in which $S$ is diagonal. The coordinate system used is illustrated with thick arrows:
Geometric Representation of Symmetric Matrices
The geometric function of a symmetric matrix is to stretch an object along the principal directions (eigenvectors) of the matrix.
The following example illustrates the action of a $2\times 2$ symmetric matrix $S$ on the vectors forming a circle of radius 1, transforming the circle into an ellipse with major and minor radii equal to the eigenvalues of $S$. The blue and red arrows show the eigenvectors of $S$ which, upon transformation, do not change direction but change their length according to the corresponding eigenvalues. The black solid arrows show the vectors $e_1$ and $e_2$ while the black dotted arrows show the vectors $Se_1$ and $Se_2$. Notice what happens to the orientation of the vectors when one of the eigenvalues is negative. Can you also find a combination of components producing a zero eigenvalue? What happens to the transformed circle?
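The stretching action can also be checked numerically. In the following sketch (NumPy, with a hypothetical $2\times 2$ symmetric matrix chosen for illustration), the transformed unit circle is an ellipse whose extreme radii equal the largest and smallest absolute eigenvalues of $S$:

```python
import numpy as np

# Hypothetical symmetric matrix; its eigenvalues are 1 and 3
S = np.array([[2.0, 1.0],
              [1.0, 2.0]])

# Vectors forming the unit circle, sampled densely
theta = np.linspace(0.0, 2.0 * np.pi, 1000)
circle = np.stack([np.cos(theta), np.sin(theta)])  # shape (2, N)

# The transformed vectors trace an ellipse
ellipse = S @ circle
radii = np.linalg.norm(ellipse, axis=0)
eigvals = np.linalg.eigvalsh(S)

# The extreme radii match the extreme |eigenvalues| (up to sampling error)
assert np.isclose(radii.max(), np.abs(eigvals).max(), atol=1e-4)
assert np.isclose(radii.min(), np.abs(eigvals).min(), atol=1e-4)
```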
Similarly, the following example illustrates the action of a symmetric tensor $S:\mathbb{R}^3\rightarrow\mathbb{R}^3$ on the vectors forming a sphere of radius 1, transforming the sphere into an ellipsoid with radii equal to the eigenvalues of $S$.
Positive Definite and Semi-Positive Definite Symmetric Tensors
A symmetric tensor $S$ is positive definite if $\forall v\in\mathbb{R}^n,v\neq 0:\ v\cdot Sv>0$.
A symmetric tensor $S$ is semi-positive definite if $\forall v\in\mathbb{R}^n:\ v\cdot Sv\geq 0$.
ASSERTION:
Let $S$ be symmetric, then $S$ is positive (semi-positive) definite if and only if $\lambda>0$ ($\lambda\geq 0$) for every $\lambda$ in the set of eigenvalues of $S$.
PROOF:
Left as an exercise. (Hint: expand $v$ in the orthonormal basis of eigenvectors of $S$ and express $v\cdot Sv$ in terms of the eigenvalues.)
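As numerical evidence for the assertion (the proof itself is the exercise), the following sketch builds a positive definite symmetric matrix as $A^TA+I$, whose eigenvalues are all strictly positive, and samples the quadratic form $v\cdot Sv$ at random nonzero vectors:

```python
import numpy as np

rng = np.random.default_rng(0)

# A^T A is semi-positive definite for any real A, so A^T A + I is
# symmetric with all eigenvalues >= 1 > 0, hence positive definite
A = rng.standard_normal((3, 3))
S = A.T @ A + np.eye(3)
assert np.allclose(S, S.T)

# All eigenvalues are strictly positive ...
assert np.all(np.linalg.eigvalsh(S) > 0)

# ... and v . S v > 0 for randomly sampled (almost surely nonzero) vectors
for _ in range(100):
    v = rng.standard_normal(3)
    assert v @ (S @ v) > 0
```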