Matrix Transformations and Eigenvalues

$\spadesuit$Transition Matrix $\spadesuit$

Given a linear mapping $T$, the matrix representation of $T$ changes according to the choice of bases of ${\mathcal R}^{n}$ and ${\mathcal R}^{m}$. In this section, we study the relationship between the matrix representation and the matrix, called a transition matrix, which maps one basis to another.

Example 3..5  

Let $T : {\mathcal R}^{2} \longrightarrow {\mathcal R}^{2}$ be given by $T\left(\begin{array}{c}
x\\
y
\end{array}\right) = \left(\begin{array}{c}
3x - 4y\\
x + 5y
\end{array}\right)$. Find the transition matrix $P$ which maps from the standard basis $\{{\bf e}_{1},{\bf e}_{2}\}$ of ${\mathcal R}^{2}$ to the basis $\{{\bf w}_{1} = \left(\begin{array}{r}
1\\
3
\end{array}\right), {\bf w}_{2} = \left(\begin{array}{r}
2\\
5
\end{array}\right) \}$. For $A = [T]_{\bf e}$, find $P^{-1}AP$.

Answer First, we find the matrix $P$ such that $P{\bf e}_{1} = {\bf w}_{1}, P{\bf e}_{2} = {\bf w}_{2}$.

$\displaystyle {\bf w}_{1} = \left(\begin{array}{r}
1\\
3
\end{array}\right) = {\bf e}_{1} + 3{\bf e}_{2}, $

$\displaystyle {\bf w}_{2} = \left(\begin{array}{r}
2\\
5
\end{array}\right) = 2{\bf e}_{1} + 5{\bf e}_{2} $

Thus the transition matrix $P$ is given by

$\displaystyle P = \left(\begin{array}{rr}
1&2\\
3&5
\end{array}\right). $

Next we find $P^{-1}AP$. By Example 3.1, $A = [T]_{\bf e} = \left(\begin{array}{cc}
3 & -4\\
1 & 5
\end{array}\right)$. Thus,

$\displaystyle P^{-1}AP = \left(\begin{array}{rr}
-5&2\\
3&-1
\end{array}\right)\left(\begin{array}{cc}
3&-4\\
1&5
\end{array}\right)\left(\begin{array}{rr}
1&2\\
3&5
\end{array}\right) = \left(\begin{array}{rr}
77&124\\
-43&-69
\end{array}\right).
\ensuremath{ \blacksquare}
$
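The computation above can be verified numerically; a minimal sketch with NumPy (the library choice is ours, not the text's):

```python
import numpy as np

# Matrices from Example 3.5: A = [T]_e and the transition matrix P.
A = np.array([[3, -4],
              [1,  5]])
P = np.array([[1, 2],
              [3, 5]])

# P^{-1} A P is the representation of T relative to the basis {w1, w2}.
B = np.linalg.inv(P) @ A @ P
print(np.round(B).astype(int))  # [[ 77 124] [-43 -69]]
```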

Look at this example carefully. We have $P^{-1}[T]_{\bf e}P = [T]_{\bf w}$. Is this always true?

Theorem 3..5  

Let $A$ be a matrix representation of the linear transformation $T : {\mathcal R}^{n} \longrightarrow {\mathcal R}^{n}$ relative to the basis $\{{\bf v}_{i}\}$ and $B$ be a matrix representation relative to the basis $\{{\bf w}_{i}\}$. Then the transition matrix $P$ from the basis $\{{\bf v}_{i}\}$ to the basis $\{{\bf w}_{i}\}$ satisfies $B = P^{-1}AP$.

Proof By Exercise 3.2, the transition matrix $P$ from the basis $\{{\bf v}_{i}\}_{i=1}^{n}$ to the basis $\{{\bf w}_{i}\}_{i=1}^{n}$ is a regular matrix of order $n$, and $P = (p_{ij})$ is given by the following:

$\displaystyle {\bf w}_{j} = p_{1j}{\bf v}_{1} + \cdots + p_{nj}{\bf v}_{n},  (j = 1,2,\ldots, n) $

Suppose that $A = (a_{ij})$ and $B = (b_{ij})$. Then

$\displaystyle T({\bf v}_{i}) = a_{1i}{\bf v}_{1} + \cdots + a_{ni}{\bf v}_{n},  (i = 1,2,\ldots,n) , $

$\displaystyle T({\bf w}_{j}) = b_{1j}{\bf w}_{1} + \cdots + b_{nj}{\bf w}_{n},  (j = 1,2,\ldots,n) $

implies that

$\displaystyle (T({\bf v}_{1}),\ldots,T({\bf v}_{n})) = ({\bf v}_{1},\ldots,{\bf v}_{n})A, $

$\displaystyle (T({\bf w}_{1}),\ldots,T({\bf w}_{n})) = ({\bf w}_{1},\ldots,{\bf w}_{n})B $

Then, since $T$ is linear,
$\displaystyle (T({\bf w}_{1}),\ldots,T({\bf w}_{n}))$ $\displaystyle =$ $\displaystyle (T({\bf v}_{1}),\ldots,T({\bf v}_{n}))P$  
  $\displaystyle =$ $\displaystyle (({\bf v}_{1},\ldots,{\bf v}_{n})A)P$  
  $\displaystyle =$ $\displaystyle ({\bf v}_{1},\ldots,{\bf v}_{n})AP$  
  $\displaystyle =$ $\displaystyle (({\bf w}_{1},\ldots,{\bf w}_{n})P^{-1})(AP)$  
  $\displaystyle =$ $\displaystyle ({\bf w}_{1},\ldots,{\bf w}_{n})(P^{-1}AP)$  

Thus we have

$\displaystyle P^{-1}AP = B. $

$ \blacksquare$

Suppose that $A$ and $B$ are square matrices for which there exists an invertible matrix $P$ such that $P^{-1}AP = B$. Then $A$ and $B$ are said to be similar, denoted by $A \sim B$.

In the rest of the chapter, we study how to find a manageable form of a matrix by choosing a regular matrix $P$ so that $P^{-1}AP$ is in canonical form.

$\spadesuit$Eigenvalues and Eigenvectors $\spadesuit$

In this chapter we investigate the theory of a single linear operator $T$ on a vector space $V$ of finite dimension. In particular, we find conditions under which $T$ is diagonalizable.

The set of complex numbers is denoted by ${ C}$, and the set of $n$-tuples of complex numbers is denoted by ${ C}^{n} = \{(c_{1},c_{2},\ldots,c_{n}) : c_{i} \in { C} \}$.

Let $A$ be a square matrix of order $n$ and suppose that for $\lambda \in { C}$,

$\displaystyle A{\mathbf x} = \lambda {\mathbf x}  ({\mathbf x} \in { C}^{n}, {\mathbf x} \neq {\bf0}). $

Then we say $\lambda$ is an eigenvalue of $A$ and ${\mathbf x}$ is an eigenvector of $A$ corresponding to $\lambda$.

For example, consider the transformation $T$ which maps the line $y = x + 2$ to the line $Y = X$. This is a translation in the $y$ direction. Thus the vector $\left(\begin{array}{c}
0\\
y
\end{array}\right)$ is mapped to $\lambda \left(\begin{array}{c}
0\\
y
\end{array}\right)$. The scalar $\lambda$ is the eigenvalue. Now we study how to obtain eigenvalues and eigenvectors.

Rewrite the equation $A{\mathbf x} = \lambda {\mathbf x}$. Then we have

$\displaystyle (A - \lambda I){\mathbf x} = {\bf0} $

This is a homogeneous system. The necessary and sufficient condition for an eigenvector ${\mathbf x}$ to exist is that this system of linear equations has a nonzero solution. By Theorem 2.5,

$\displaystyle \vert A - \lambda I\vert = 0 $

Thus, the eigenvalues $\lambda$ can be found as the solutions of the following equation in the unknown $t$:

$\displaystyle \vert A - tI\vert = 0 $

Conversely, any solution $\lambda$ of this equation is an eigenvalue of $A$. Now the polynomial in $t$

$\displaystyle \Phi_{A}(t) = \det(A - tI) = t^{n} + c_{1}t^{n-1} + \cdots + c_{n} $

is called the characteristic polynomial of $A$, and the equation in $t$

$\displaystyle \Phi_{A}(t) = \det(A - tI) = 0 $

is called the characteristic equation of $A$. From this, to find the eigenvalues, it is enough to solve the characteristic equation.
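As a sketch of this recipe in NumPy (note `np.poly` returns the monic coefficients of $\det(tI - A)$, which differs from $\det(A - tI)$ only by the sign $(-1)^{n}$; the roots, i.e. the eigenvalues, are the same):

```python
import numpy as np

# Characteristic polynomial and its roots, using the matrix A of Example 3.5.
A = np.array([[3, -4],
              [1,  5]], dtype=float)
coeffs = np.poly(A)        # monic coefficients: here t^2 - 8t + 19
eigenvalues = np.roots(coeffs)
print(coeffs, eigenvalues)
```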

Example 3..6  

Find all eigenvalues of $A = \left(\begin{array}{rrr}
3&0&0\\
0&2&-5\\
0&1&-2
\end{array}\right)$.

Answer $\Phi_{A}(t) = \left \vert\begin{array}{rrr}
3-t & 0 & 0\\
0 & 2-t & -5\\
0 & 1 & -2-t
\end{array}\right \vert = (3-t)(t^{2} + 1)$. Thus the eigenvalues of $A$ are $\lambda = 3, \pm i.$ $ \blacksquare$
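A quick numerical cross-check of this example (a sketch assuming NumPy is available):

```python
import numpy as np

# Eigenvalues of the matrix from Example 3.6: 3 and the complex pair
# +/- i, even though every entry of A is real.
A = np.array([[3, 0,  0],
              [0, 2, -5],
              [0, 1, -2]], dtype=float)
vals = np.linalg.eigvals(A)
print(vals)  # 3 and +/- i, in some order
```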

As you can see, even though the entries of the matrix are all real numbers, the eigenvalues might be complex numbers.

Example 3..7  

Find all eigenvalues and corresponding eigenvectors of $A = \left(\begin{array}{rrr}
0&1&1\\
1&0&1\\
1&1&0
\end{array}\right)$.

Answer $\Phi_{A}(t) = \left \vert\begin{array}{rrr}
-t & 1 & 1\\
1 & -t & 1\\
1 & 1 & -t
\end{array}\right \vert = -(t+1)^{2}(t-2).$ Thus, the eigenvalues of $A$ are $\lambda = 2,-1$. Now we find the eigenvectors of $A$ corresponding to each eigenvalue.

For $\lambda = 2$, the eigenvector is a nonzero solution of $(A - 2I){\mathbf x} = {\bf0}$. Solving this equation, we have

$\displaystyle A - 2I = \left(\begin{array}{rrr}
-2&1&1\\
1&-2&1\\
1&1&-2
\end{array}\right) \longrightarrow \left(\begin{array}{rrr}
1&1&-2\\
1&-2&1\\
-2&1&1
\end{array}\right) \stackrel{\begin{array}{cc}
{}^{-R_{1} + R_{2}}\\
{}^{2R_{1} + R_{3}}
\end{array}}{\longrightarrow} \left(\begin{array}{rrr}
1&1&-2\\
0&-3&3\\
0&3&-3
\end{array}\right) \longrightarrow \left(\begin{array}{rrr}
1&1&-2\\
0&1&-1\\
0&0&0
\end{array}\right) . $

Then the degree of freedom is 1. So let $x_{3} = \alpha$. Then

$\displaystyle {\mathbf x} = \left(\begin{array}{c}
x_{1}\\
x_{2}\\
x_{3}
\end{array}\right) = \alpha \left(\begin{array}{c}
1\\
1\\
1
\end{array}\right)  (\alpha \neq 0). $

Now for $\lambda = -1$, the eigenvector is a nonzero solution of $(A + I){\mathbf x} = {\bf0}$. Solving this equation, we have

$\displaystyle A + I = \left(\begin{array}{rrr}
1&1&1\\
1&1&1\\
1&1&1
\end{array}\right) \stackrel{\begin{array}{cc}
{}^{-R_{1} + R_{2}}\\
{}^{-R_{1} + R_{3}}
\end{array}}{\longrightarrow} \left(\begin{array}{rrr}
1&1&1\\
0&0&0\\
0&0&0
\end{array}\right) $

Thus the degree of freedom is 2. So we let $x_{2} = \beta, x_{3} = \gamma$ with $(\beta,\gamma) \neq (0,0)$. Then

$\displaystyle {\mathbf x} = \left(\begin{array}{c}
-x_{2} - x_{3}\\
x_{2}\\
x_{3}
\end{array}\right) = \beta \left(\begin{array}{r}
-1\\
1\\
0
\end{array}\right) + \gamma \left(\begin{array}{r}
-1\\
0\\
1
\end{array}\right).
\ensuremath{ \blacksquare}
$
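The eigenpairs found by hand in Example 3.7 can be checked numerically; a sketch with NumPy:

```python
import numpy as np

# Eigenvalues of the matrix from Example 3.7 are 2 and -1 (twice),
# and each returned column of `vecs` satisfies A x = lambda x.
A = np.array([[0, 1, 1],
              [1, 0, 1],
              [1, 1, 0]], dtype=float)
vals, vecs = np.linalg.eig(A)
for lam, x in zip(vals, vecs.T):
    assert np.allclose(A @ x, lam * x)
print(np.sort(vals))  # eigenvalues sorted: -1, -1, 2
```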

$\spadesuit$Cayley-Hamilton Theorem $\spadesuit$

For $\Phi_{A}(t) = t^{n} + c_{1}t^{n-1} + \cdots + c_{n}$, define

$\displaystyle \Phi_{A}(A) = A^{n} + c_{1}A^{n-1} + \cdots + c_{n}I $

Then we have the following theorem, called the Cayley-Hamilton theorem.

Theorem 3..6  

[Cayley-Hamilton Theorem] Every matrix is a zero of its characteristic polynomial. In other words, $\Phi_{A}(A) = {\bf0}$.

Proof Let $A$ be an arbitrary square matrix of order $n$ and let $\Phi_{A}(t)$ be its characteristic polynomial; say,

$\displaystyle \Phi_{A}(t) = \det(A - tI) = t^{n} + c_{1}t^{n-1} + \cdots + c_{n} $

Now let $B(t)$ denote the adjoint of the matrix $A - t I$. The entries of $B(t)$ are cofactors of the matrix $A - t I$ and hence are polynomials in $t$ of degree not exceeding $n-1$. Thus,

$\displaystyle B(t) = B_{1}t^{n-1} + B_{2}t^{n-2} + \cdots + B_{n} $

Here $B_{i}  (i = 1,2,\ldots,n)$ is a square matrix of order $n$. By the fundamental property of the classical adjoint,

$\displaystyle (A - tI)B(t) = \Phi_{A}(t)I $

or

$\displaystyle (A - tI)(B_{1}t^{n-1} + B_{2}t^{n-2} + \cdots + B_{n}) = ( t^{n} + c_{1}t^{n-1} + \cdots + c_{n})I .$

Removing parentheses and equating the coefficients of corresponding powers of $t$,

\begin{displaymath}\begin{array}{rrl}
-B_{1} &=& I,\\
-B_{2} + AB_{1} &=& c_{1}I,\\
\vdots&&\vdots\\
AB_{n}&=&c_{n}I
\end{array} \end{displaymath}

Multiplying the above matrix equations by $A^{n},A^{n-1},\ldots,I$ respectively and adding,

$\displaystyle {\bf0} = A^{n} + c_{1}A^{n-1} + \cdots + c_{n}I = \Phi_{A}(A).
\ensuremath{ \blacksquare}
$
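The theorem is easy to check numerically for a concrete matrix; a sketch using the matrix of Exercise 6 below:

```python
import numpy as np

# Cayley-Hamilton check: substituting A into its characteristic
# polynomial gives the zero matrix.
A = np.array([[3, 1],
              [-1, 1]], dtype=float)
c = np.poly(A)  # monic coefficients of det(tI - A): here t^2 - 4t + 4
Phi_A = c[0] * A @ A + c[1] * A + c[2] * np.eye(2)
print(Phi_A)    # zero matrix (up to rounding)
```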

$\spadesuit$Eigenspace $\spadesuit$

Our purpose here is to find a regular matrix $P$ so that the matrix $A$ can be transformed into a simpler matrix; in other words, to find a regular matrix $P$ such that $B = P^{-1}AP$ has a simple form.

The set of vectors such that

$\displaystyle V(\lambda) = \{{\mathbf x} : (A - \lambda I){\mathbf x} = {\bf0}\} $

is called the eigenspace corresponding to $\lambda$. This vector space is the same as the solution space formed by the solution vectors ${\mathbf x}$ of

$\displaystyle (A - \lambda I){\mathbf x} = {\bf0} $

Thus by Theorem 2.3, we have

$\displaystyle \dim V(\lambda) = n - {\rm rank}(A - \lambda I) $
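This dimension formula can be sketched numerically, reusing the matrix of Example 3.7 (the simple eigenvalue gives a one-dimensional eigenspace, the repeated one a two-dimensional eigenspace):

```python
import numpy as np

# dim V(lambda) = n - rank(A - lambda*I) for the matrix of Example 3.7.
A = np.array([[0, 1, 1],
              [1, 0, 1],
              [1, 1, 0]], dtype=float)
n = A.shape[0]
for lam in (2.0, -1.0):
    r = np.linalg.matrix_rank(A - lam * np.eye(n))
    print(lam, n - r)  # dim V(2) = 1, dim V(-1) = 2
```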

Exercise 3-4

1. Find the transition matrix $P$ which maps the basis $\{{\bf v}_{1} = \left(\begin{array}{r}
1\\
-1
\end{array}\right), {\bf v}_{2} = \left(\begin{array}{r}
1\\
1
\end{array}\right) \}$ of ${\mathcal R}^{2}$ to the basis $\{{\bf w}_{1} = \left(\begin{array}{r}
3\\
1
\end{array}\right), {\bf w}_{2} = \left(\begin{array}{r}
-1\\
2
\end{array}\right) \}$.

2. Show that the transition matrix $P$ which maps the basis $\{{\bf v}_{i}\}$ to the basis $\{{\bf w}_{i}\}$ of ${\mathcal R}^{n}$ is regular.

3. Find all eigenvalues and all eigenvectors of the following matrices.

(a) $\left(\begin{array}{rr}
3&-1\\
1&1
\end{array}\right) $ (b) $\left(\begin{array}{rrr}
2&1&0\\
0&1&-1\\
0&2&4
\end{array}\right) $ (c) $\left(\begin{array}{rrr}
1&4&-4\\
-1&-3&2\\
0&2&-1
\end{array}\right) $

4. Find the eigenvalues of a square matrix $A$ which satisfies $A^{2} = A$.

5. Let the eigenvalues of $A$ be $\lambda_{1},\lambda_{2},\ldots,\lambda_{n}$. Then show that the eigenvalues of $A^{m}$ are $\lambda_{1}^{m}, \lambda_{2}^{m}, \ldots, \lambda_{n}^{m}$.

6. Given $A = \left(\begin{array}{rr}
3&1\\
-1&1
\end{array}\right)$. Find $A^{4},A^{-1}$ using Cayley-Hamilton theorem.

7. Suppose $X$ is a matrix of order 2. Find all $X$ satisfying $X^{2} - 3X + 2I = 0$.