In [1]
from IPython.core.display import HTML, Image
css_file = 'style.css'
HTML(open(css_file, 'r').read())
Out [1]
In [2]
from sympy import init_printing, Matrix, symbols, eye
from warnings import filterwarnings
In [3]
init_printing(use_latex = 'mathjax')
filterwarnings('ignore')
In [4]
lamda = symbols('lamda') # Note that lambda is a reserved word in python, so we use lamda (without the b)

Eigenvalues and eigenvectors

What are eigenvectors?

  • A matrix is a mathematical object that acts on a (column) vector, producing a new vector, i.e. Ax=b
  • An eigenvector of A is a (non-zero) vector x for which Ax is parallel to x (some scalar multiple of x), as illustrated in the short check after this list
    $$ {A}\underline{x}=\lambda \underline{x} $$
  • The eigenvectors with an eigenvalue of zero are the vectors in the nullspace
  • If A is singular (it takes some non-zero vector to the zero vector) then λ=0 is an eigenvalue
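
As a quick illustration of this definition (the matrix M and the vector x below are arbitrary examples, not taken from the lecture), multiplying a matrix by one of its eigenvectors simply rescales that vector:

M = Matrix([[2, 1], [1, 2]])  # An arbitrary example matrix
x = Matrix([1, 1])  # An eigenvector of M
M * x  # Returns the column vector (3, 3), i.e. 3 times x, so the eigenvalue is 3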

What are the eigenvectors and eigenvalues for projection matrices?

  • A projection matrix P projects some vector (b) onto a subspace (in 3-space we are talking about a plane through the origin)
  • In general Pb is not in the same direction as b, so b is not an eigenvector of P
  • A vector x that is already in the subspace will result in Px=x, so λ=1
  • Another good x would be one perpendicular to the subspace, i.e. Px=0x, so λ=0 (a small sketch of both cases follows this list)
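
Here is a minimal sketch of these two cases (the direction vector a below is an arbitrary choice; the projection matrix onto the line through a is built as a*a.T divided by a.T*a):

a = Matrix([1, 2])  # Arbitrary direction to project onto
P = (a * a.T) / (a.T * a)[0]  # Projection matrix onto the line through a
P.eigenvals()  # {0: 1, 1: 1}, i.e. the eigenvalues are 0 and 1
P * a == a  # True: a vector already in the subspace is left unchanged (lambda = 1)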

What are the eigenvectors and eigenvalues for permutation matrices?

  • A permutation matrix such as the one below changes the order of the elements in a (column) vector
    $$ \begin{bmatrix} 0 & 1 \\ 1 & 0 \end{bmatrix} $$
  • A good example of a vector that would remain in the same direction after multiplication by the permutation matrix above would be the following vector
    $$ \begin{bmatrix} 1 \\ 1 \end{bmatrix} $$
  • The eigenvalue would just be λ=1
  • The next (eigen)vector would also work
    $$ \begin{bmatrix} -1 \\ 1 \end{bmatrix} $$
  • It would have an eigenvalue of λ=-1 (both eigenpairs are checked with sympy below)
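
Both eigenpairs can be confirmed with sympy (a quick check of the statements above, using the Matrix class imported earlier):

P = Matrix([[0, 1], [1, 0]])  # The permutation matrix from above
P.eigenvects()  # Eigenvalue -1 with eigenvector (-1, 1) and eigenvalue 1 with eigenvector (1, 1)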

The trace and the determinant

  • The trace is the sum of the values down the main diagonal of a square matrix
  • Note how this is the same as the sum of the eigenvalues (look at the permutation matrix above and its eigenvalues)
  • The determinant of A is the product of the eigenvalues (a quick check of both facts follows below)
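
A quick check of both facts, using the permutation matrix from the previous section:

P = Matrix([[0, 1], [1, 0]])  # The permutation matrix from above
lams = list(P.eigenvals())  # Its eigenvalues: -1 and 1
P.trace() == sum(lams)  # True: the trace (0) equals the sum of the eigenvalues
P.det() == lams[0] * lams[1]  # True: the determinant (-1) equals the product of the eigenvalues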

How to solve Ax=λx

$$ A\underline { x } =\lambda \underline { x } \\ \left( A-\lambda I \right) \underline { x } =\underline { 0 } $$

  • The only way to get a non-zero solution x to this equation is for A-λI to be singular and therefore have a determinant of zero
    $$ \left|{A}-\lambda{I}\right|=0 $$
  • This is called the characteristic (or eigenvalue) equation
  • There will be n λ's for an n×n matrix (some of which may be of equal value)
In [5]
A = Matrix([[3, 1], [1, 3]])
I = eye(2)
A, I # Printing A and the 2-by-2 identity matrix to the screen
Out [5]
$$\begin{pmatrix}\left[\begin{matrix}3 & 1\\1 & 3\end{matrix}\right], & \left[\begin{matrix}1 & 0\\0 & 1\end{matrix}\right]\end{pmatrix}$$
In [6]
(A - lamda * I) # Printing A minus lambda times the identity matrix to the screen
Out [6]
$$\left[\begin{matrix}- \lambda + 3 & 1\\1 & - \lambda + 3\end{matrix}\right]$$
  • This will have the following determinant
In [7]
(A - lamda * I).det()
Out [7]
$$\lambda^{2} - 6 \lambda + 8$$
  • For this 2×2 matrix the characteristic polynomial is λ²-(trace)λ+(determinant): the 6 is the trace of A and the 8 is the determinant of A
In [8]
((A - lamda * I).det()).factor()
Out [8]
$$\left(\lambda - 4\right) \left(\lambda - 2\right)$$
  • We now have two eigenvalues, 2 and 4
  • In Python we could also use the .eigenvals() method
In [9]
A.eigenvals() # There is one value of 2 and one value of 4
Out [9]
$$\begin{Bmatrix}2 : 1, & 4 : 1\end{Bmatrix}$$
  • The eigenvectors are calculated by substituting the two values of λ into the original equation
    $$ \left( {A}-\lambda{I} \right)\underline{x}=\underline{0} $$
In [10]
A.eigenvects()
Out [10]
$$\begin{bmatrix}\begin{pmatrix}2, & 1, & \begin{bmatrix}\left[\begin{matrix}-1\\1\end{matrix}\right]\end{bmatrix}\end{pmatrix}, & \begin{pmatrix}4, & 1, & \begin{bmatrix}\left[\begin{matrix}1\\1\end{matrix}\right]\end{bmatrix}\end{pmatrix}\end{bmatrix}$$
  • The results above are interpreted as follows (each tuple lists an eigenvalue, its algebraic multiplicity, and its eigenvectors)
    • The first eigenvalue has one eigenvector and the second eigenvalue also has a single eigenvector
  • Note the similarity between the eigenvectors of the two examples above
  • It is easy to see that adding a constant multiple of the identity matrix to another matrix (above we added 3I to the permutation matrix) doesn't change the eigenvectors; it does add that constant to the eigenvalues, though (we went from -1 and 1 to 2 and 4), as checked below
    $$ A\underline { x } =\lambda \underline { x } \\ \therefore \quad \left( A+cI \right) \underline { x } =\left( \lambda +c \right) \underline { x } $$
  • If we add another matrix to A (not a constant multiple of I) or even multiply them, then the influence on the original eigenvalues and eigenvectors of A is NOT so predictable (as above)
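
A short check of the shift property (the constant 3 below is the same shift used above, which takes the permutation matrix to the symmetric matrix A):

B = Matrix([[0, 1], [1, 0]])  # The permutation matrix from the earlier example
(B + 3 * eye(2)).eigenvals()  # {2: 1, 4: 1}: the eigenvalues -1 and 1 are each shifted by 3
(B + 3 * eye(2)).eigenvects()  # The eigenvectors (-1, 1) and (1, 1) are unchanged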

The eigenvalues and eigenvectors of a rotation matrix

  • Consider this rotation matrix that rotates a vector by 90° (it is orthogonal)
    • Think about it, though: what vector can come out parallel to itself after a 90° rotation?
In [11]
Q = Matrix([[0, -1], [1, 0]])
Q
Out [11]
$$\left[\begin{matrix}0 & -1\\1 & 0\end{matrix}\right]$$
  • From the trace and determinant above we know that we will have the following equation
    $$ {\lambda}^{2}-{0}{\lambda}+{1}={0} \\ {\lambda}^{2}=-{1} $$
In [12]
Q.eigenvals()
Out [12]
$$\begin{Bmatrix}- i : 1, & i : 1\end{Bmatrix}$$
In [13]
Q.eigenvects()
Out [13]
$$\begin{bmatrix}\begin{pmatrix}- i, & 1, & \begin{bmatrix}\left[\begin{matrix}- i\\1\end{matrix}\right]\end{bmatrix}\end{pmatrix}, & \begin{pmatrix}i, & 1, & \begin{bmatrix}\left[\begin{matrix}i\\1\end{matrix}\right]\end{bmatrix}\end{pmatrix}\end{bmatrix}$$
  • Note how the eigenvalues are complex conjugates
  • Symmetric matrices will only have real eigenvalues
  • An anti-symmetric matrix (where the transpose is the original matrix times the scalar -1, as in our example above) will only have purely imaginary eigenvalues (a small check follows this list)
  • Matrices in between can have a mix of these
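
A small sketch of these claims (the matrices S and K below are arbitrary examples, chosen only to be symmetric and anti-symmetric respectively):

S = Matrix([[2, 5], [5, 2]])  # Symmetric: S.T equals S
K = Matrix([[0, 2], [-2, 0]])  # Anti-symmetric: K.T equals -K
S.eigenvals()  # {-3: 1, 7: 1}: real eigenvalues
K.eigenvals()  # {-2*I: 1, 2*I: 1}: purely imaginary eigenvalues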

Eigenvalues and eigenvectors of an upper triangular matrix

  • Compute the eigenvalues and eigenvectors of the following matrix (note it is upper triangular)
In [14]
A = Matrix([[3, 1], [0, 3]])
A
Out [14]
$$\left[\begin{matrix}3 & 1\\0 & 3\end{matrix}\right]$$
In [15]
A.eigenvals()
Out [15]
$$\begin{Bmatrix}3 : 2\end{Bmatrix}$$
  • We have two eigenvalues, both equal to 3
In [16]
A.eigenvects()
Out [16]
$$\begin{bmatrix}\begin{pmatrix}3, & 2, & \begin{bmatrix}\left[\begin{matrix}1\\0\end{matrix}\right]\end{bmatrix}\end{pmatrix}\end{bmatrix}$$
  • This is a degenerate matrix; it does not have a full set of linearly independent eigenvectors
  • Look at this upper triangular matrix
In [17]
A = Matrix([[3, 1, 1], [0, 3, 4], [0, 0, 3]])
A
Out [17]
$$\left[\begin{matrix}3 & 1 & 1\\0 & 3 & 4\\0 & 0 & 3\end{matrix}\right]$$
In [18]
A.eigenvals()
Out [18]
$$\begin{Bmatrix}3 : 3\end{Bmatrix}$$
In [19]
A.eigenvects()
Out [19]
$$\begin{bmatrix}\begin{pmatrix}3, & 3, & \begin{bmatrix}\left[\begin{matrix}1\\0\\0\end{matrix}\right]\end{bmatrix}\end{pmatrix}\end{bmatrix}$$
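
The eigenvalue 3 has algebraic multiplicity 3, yet only a single independent eigenvector appears. Looking at the nullspace of A-3I (the eigenspace for λ=3) shows why; a quick check using the 3×3 matrix defined above:

(A - 3 * eye(3)).rank()  # 2, so the nullspace is only 3 - 2 = 1 dimensional
(A - 3 * eye(3)).nullspace()  # A single basis vector: (1, 0, 0)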

Example problems

Example problem 1

  • Find the eigenvalues and eigenvectors of the square of the following matrix as well as the inverse of the matrix minus the identity matrix
    $$ {A}=\begin{bmatrix} 1 & 2 & 3 \\ 0 & 1 & -2 \\ 0 & 1 & 4 \end{bmatrix} $$

Solution

  • Notice the following
    $$ A\underline { x } =\lambda \underline { x } \\ { A }^{ 2 }\underline { x } =A\left( A\underline { x } \right) =A\left( \lambda \underline { x } \right) =\lambda \left( A\underline { x } \right) ={ \lambda }^{ 2 }\underline { x } $$
  • Once we know the eigenvalues of A we then simply square them to get the eigenvalues of the matrix squared
  • Similarly for the inverse of the matrix we have the following (for a non-zero λ, which is fine as A must be invertible for this problem)
    $$ { A }^{ -1 }\underline { x } ={ A }^{ -1 }\frac { A\underline { x } }{ \lambda } ={ A }^{ -1 }A\frac { 1 }{ \lambda } \underline { x } =\frac { 1 }{ \lambda } \underline { x} $$
In [20]
A = Matrix([[1, 2, 3], [0, 1, -2], [0, 1, 4]])
A
Out [20]
$$\left[\begin{matrix}1 & 2 & 3\\0 & 1 & -2\\0 & 1 & 4\end{matrix}\right]$$
In [21]
A.eigenvals()
Out [21]
$$\begin{Bmatrix}1 : 1, & 2 : 1, & 3 : 1\end{Bmatrix}$$
In [22]
A.eigenvects()
Out [22]
$$\begin{bmatrix}\begin{pmatrix}1, & 1, & \begin{bmatrix}\left[\begin{matrix}1\\0\\0\end{matrix}\right]\end{bmatrix}\end{pmatrix}, & \begin{pmatrix}2, & 1, & \begin{bmatrix}\left[\begin{matrix}-1\\-2\\1\end{matrix}\right]\end{bmatrix}\end{pmatrix}, & \begin{pmatrix}3, & 1, & \begin{bmatrix}\left[\begin{matrix}\frac{1}{2}\\-1\\1\end{matrix}\right]\end{bmatrix}\end{pmatrix}\end{bmatrix}$$
  • From this it is clear that the eigenvalues of A² will be 1, 4, and 9, and those of A⁻¹ will be 1, 1/2, and 1/3
In [23]
(A ** 2).eigenvals()
Out [23]
$$\begin{Bmatrix}1 : 1, & 4 : 1, & 9 : 1\end{Bmatrix}$$
In [24]
(A.inv()).eigenvals()
Out [24]
$$\begin{Bmatrix}\frac{1}{3} : 1, & \frac{1}{2} : 1, & 1 : 1\end{Bmatrix}$$
  • The eigenvectors will be as follows (exactly the same)
In [25]
(A ** 2).eigenvects()
Out [25]
$$\begin{bmatrix}\begin{pmatrix}1, & 1, & \begin{bmatrix}\left[\begin{matrix}1\\0\\0\end{matrix}\right]\end{bmatrix}\end{pmatrix}, & \begin{pmatrix}4, & 1, & \begin{bmatrix}\left[\begin{matrix}-1\\-2\\1\end{matrix}\right]\end{bmatrix}\end{pmatrix}, & \begin{pmatrix}9, & 1, & \begin{bmatrix}\left[\begin{matrix}\frac{1}{2}\\-1\\1\end{matrix}\right]\end{bmatrix}\end{pmatrix}\end{bmatrix}$$
In [26]
(A.inv()).eigenvects()
Out [26]
$$\begin{bmatrix}\begin{pmatrix}\frac{1}{3}, & 1, & \begin{bmatrix}\left[\begin{matrix}\frac{1}{2}\\-1\\1\end{matrix}\right]\end{bmatrix}\end{pmatrix}, & \begin{pmatrix}\frac{1}{2}, & 1, & \begin{bmatrix}\left[\begin{matrix}-1\\-2\\1\end{matrix}\right]\end{bmatrix}\end{pmatrix}, & \begin{pmatrix}1, & 1, & \begin{bmatrix}\left[\begin{matrix}1\\0\\0\end{matrix}\right]\end{bmatrix}\end{pmatrix}\end{bmatrix}$$
In [ ]