Eigenvalues and Eigenvectors


Consider the following matrix multiplications.
    $$ \begin{bmatrix}
        2 & 7\\
        -1 & -6
    \end{bmatrix}  \begin{bmatrix}
        -1\\
        1
    \end{bmatrix}
    =
    \begin{bmatrix}
        5\\
        -5
    \end{bmatrix} $$ $$ \begin{bmatrix}
        2 & 7\\
        -1 & -6
    \end{bmatrix}
    \begin{bmatrix}
        -7\\
        1
    \end{bmatrix}
    =
    \begin{bmatrix}
        -7\\
        1
    \end{bmatrix} $$
  These two multiplications display behavior unlike most matrix products. A matrix can be viewed as a linear transformation acting on a vector. When multiplied by the matrix, these particular vectors did not change direction; they were only scaled by some value. In other words, the transformation did not rotate them at all. Vectors with this property are called eigenvectors, and they can be derived from the matrix itself. The prefix "eigen" comes from the German for "own" or "characteristic," reflecting that these vectors are intrinsic to the matrix. Each eigenvector has an associated eigenvalue: the scalar by which the matrix stretches or shrinks it.
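This scaling behavior is easy to check numerically. The following sketch uses NumPy with the matrix and two vectors shown above:

```python
import numpy as np

A = np.array([[2, 7],
              [-1, -6]])

v1 = np.array([-1, 1])   # first vector above
v2 = np.array([-7, 1])   # second vector above

# Multiplying by A only scales each vector:
# A v1 = -5 * v1 and A v2 = 1 * v2
print(A @ v1)  # [ 5 -5]
print(A @ v2)  # [-7  1]
```

The outputs are exact scalar multiples of the inputs, which is the defining property of an eigenvector.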

Eigenvalues and Matrix Stability

Eigenvalues for a matrix can give information about the stability of the linear system. The following expression can be used to derive eigenvalues for any square matrix.

    $$
    \det(A-\lambda I) =
    \det\left(
    \begin{bmatrix}
        a_{11} & \cdots & a_{1n}\\
        \vdots & \ddots & \vdots\\
        a_{n1} & \cdots & a_{nn}
    \end{bmatrix}
    -
    \lambda I\right) = 0
    $$
    Where \(A\) is any \(n\times n\) square matrix, \(I\) is the identity matrix of the same dimensions as \(A\), and \(\lambda\) is a symbolic eigenvalue. Consider the first matrix used in this section as an example.
    $$
    A =
    \begin{bmatrix}
        2 & 7\\
        -1 & -6
    \end{bmatrix}  $$  $$
    \det(A-\lambda I) =
    \begin{vmatrix}
        2-\lambda & 7\\
        -1 & -6 -\lambda
    \end{vmatrix}=0    $$    $$
    \lambda^2 +4\lambda -5 = 0    $$
    \[\lambda \in \{-5,\,1\}\]
    From this expression, we find eigenvalues of \(1\) and \(-5\). We can use these to determine the associated eigenvectors. Considering what is being solved, we see that when an eigenvalue is subtracted from the diagonal of the matrix, the matrix becomes singular. An eigenvector for a given eigenvalue can therefore be taken from the null space of the resulting singular matrix.
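The singularity claim can be checked numerically. A minimal sketch with NumPy, using the eigenvalues just found:

```python
import numpy as np

A = np.array([[2, 7],
              [-1, -6]])
I = np.eye(2)

# Subtracting each eigenvalue from the diagonal makes A singular,
# so each determinant below is (numerically) zero.
print(np.linalg.det(A - (-5) * I))  # ~0
print(np.linalg.det(A - 1 * I))     # ~0
```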
    $$
    M = A+5I = 
    \begin{bmatrix}
        7 & 7\\
        -1 & -1
    \end{bmatrix}    $$   $$
    N(M) = \bigg\{
    \begin{bmatrix}
        -1\\
        1
    \end{bmatrix} 
    \bigg\}    $$   $$
    M_2=A-I =
    \begin{bmatrix}
        1 & 7\\
        -1 & -7
    \end{bmatrix}   $$  $$
    N(M_2) = \bigg\{
    \begin{bmatrix}
        -7\\
        1
    \end{bmatrix} 
    \bigg\}   $$   $$
    V = 
    \begin{bmatrix}
        -1 & -7\\
        1 & 1
    \end{bmatrix}    $$
    We have now found the eigenvectors of the matrix \(A\), collected as the columns of \(V\). These are the same vectors given at this section's beginning; each has an associated eigenvalue.
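The same eigenvalues and eigenvectors can be recovered with NumPy's `eig` routine. A sketch (note that NumPy normalizes eigenvectors to unit length, so its columns are scaled versions of the columns of \(V\)):

```python
import numpy as np

A = np.array([[2, 7],
              [-1, -6]])

eigvals, eigvecs = np.linalg.eig(A)
print(np.sort(eigvals))  # approximately [-5.  1.]
# Each column of eigvecs is a unit-length multiple of the
# corresponding column of V = [[-1, -7], [1, 1]].
```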

Diagonalization of a Matrix

 Consider the following matrix and its associated eigenvalues and eigenvectors.
    $$
    A = \begin{bmatrix}
        3 & -2\\
        3 & -4
    \end{bmatrix} \hspace{6mm}
    \Lambda = \begin{bmatrix}
        -3 & 0\\
        0 & 2
    \end{bmatrix} \hspace{6mm}
    V = \begin{bmatrix}
        1 & 2\\
        3 & 1
    \end{bmatrix}
    $$
    If the matrix \(A\) is multiplied on the right by the eigenvector matrix \(V\), each column of \(V\) is scaled by its eigenvalue, so the result equals \(V\) times the diagonal eigenvalue matrix \(\Lambda\).
    \[AV= V\Lambda\]
    From this, a form of the matrix \(A\) can be derived by multiplying by the inverse of the eigenvector matrix on the right.
    \[A = V\Lambda V^{-1}\]
    This process is known as diagonalizing the matrix. The given expression can be verified by computing the matrix product of \(V\), \(\Lambda\), and \(V^{-1}\).
    $$
    V^{-1} = 
    \begin{bmatrix}
        -1/5 & 2/5\\
        3/5 & -1/5
    \end{bmatrix}
$$
$$
    A = \begin{bmatrix}
        1 & 2\\
        3 & 1
    \end{bmatrix}
    \begin{bmatrix}
        -3 & 0\\
        0 & 2
    \end{bmatrix}
    \begin{bmatrix}
        -1/5 & 2/5\\
        3/5 & -1/5
    \end{bmatrix}
    $$
    This expression can also be written as a sum of rank-1 matrices, one per eigenvalue, where each row vector is the corresponding row of \(V^{-1}\).
    $$
    A = -3\begin{bmatrix}
        1\\
        3
    \end{bmatrix}
    \begin{bmatrix}
        -1/5 & 2/5
    \end{bmatrix}
    + 2\begin{bmatrix}
        2\\
        1
    \end{bmatrix}
    \begin{bmatrix}
        3/5 & -1/5
    \end{bmatrix}
    $$
    Computing each outer product and adding the results yields the original matrix.
    $$
    A = -3\begin{bmatrix}
        -1/5 & 2/5\\
        -3/5 & 6/5
    \end{bmatrix}
    +2\begin{bmatrix}
        6/5 & -2/5\\
        3/5 & -1/5
    \end{bmatrix} $$
$$    A = \begin{bmatrix}
        3/5 & -6/5\\
        9/5 & -18/5
    \end{bmatrix}
    +
    \begin{bmatrix}
        12/5 & -4/5\\
        6/5 & -2/5
    \end{bmatrix}
    =
    \begin{bmatrix}
        3 & -2\\
        3 & -4
    \end{bmatrix}
    $$
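Both forms of the diagonalization can be verified numerically. A sketch with NumPy, using the matrices defined above:

```python
import numpy as np

A = np.array([[3, -2],
              [3, -4]])
V = np.array([[1, 2],
              [3, 1]])
Lam = np.diag([-3, 2])  # the eigenvalue matrix Λ

# A = V Λ V^{-1}
Vinv = np.linalg.inv(V)
A_rebuilt = V @ Lam @ Vinv
print(A_rebuilt)  # recovers A, up to floating-point rounding

# Equivalent rank-1 form: each term is λ_i times the outer product
# of the i-th column of V with the i-th row of V^{-1}.
A_rank1 = -3 * np.outer(V[:, 0], Vinv[0]) + 2 * np.outer(V[:, 1], Vinv[1])
print(A_rank1)
```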

Solving Linear Systems with Eigenvalues

(Coming Soon)

Stability of Coupled Linear Systems

(Coming Soon)