
Eigenvalues and eigenvectors are key concepts in linear algebra. They help us understand how matrices transform vectors and solve systems of linear equations. These tools are crucial for analyzing linear transformations and their effects on vector spaces.

In this part, we'll learn how to find eigenvalues and eigenvectors, and use them to diagonalize matrices. We'll also explore their applications in solving differential equations and analyzing system stability. These ideas connect linear algebra to other math and science fields.

Eigenvalues and eigenvectors of matrices

Definition and properties

  • An eigenvector of a square matrix A is a nonzero vector x such that Ax = λx for some scalar λ
    • The scalar λ is called the eigenvalue corresponding to the eigenvector x
    • Together, the eigenvalue and eigenvector form an eigenpair (λ, x)
  • The set of all eigenvectors corresponding to a particular eigenvalue, together with the zero vector, forms a subspace called an eigenspace
    • The dimension of the eigenspace is called the geometric multiplicity of the eigenvalue
  • A matrix can have repeated eigenvalues, meaning an eigenvalue that appears more than once as a root of the characteristic polynomial
    • The number of times an eigenvalue appears as a root of the characteristic polynomial is called its algebraic multiplicity
  • Not all square matrices have enough linearly independent eigenvectors to form a basis
    • Matrices that do are called diagonalizable
    • A matrix is diagonalizable if and only if the sum of the dimensions of its eigenspaces equals the size of the matrix
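The defining relation Ax = λx is easy to check numerically. A minimal sketch using NumPy (the matrix and eigenpair here are illustrative, not from the text):

```python
import numpy as np

# A simple diagonal matrix whose eigenpairs can be read off directly.
A = np.array([[2.0, 0.0],
              [0.0, 3.0]])

# x = (1, 0) is an eigenvector with eigenvalue λ = 2, since Ax = 2x.
x = np.array([1.0, 0.0])
lam = 2.0

print(np.allclose(A @ x, lam * x))  # True: (λ, x) is an eigenpair of A
```

Any nonzero scalar multiple of x passes the same check, which is why eigenvectors are only determined up to scaling.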

Geometric interpretation

  • Geometrically, an eigenvector is a vector that, when the linear transformation represented by the matrix is applied, remains parallel to its original direction
    • The eigenvalue represents the factor by which the eigenvector is scaled under the transformation
  • In a 2D plane, the real eigenvectors of a matrix (when they exist) are the axes along which the matrix performs a simple scaling operation
    • These axes are not necessarily orthogonal (perpendicular) to each other unless the matrix is symmetric
  • In higher dimensions, eigenvectors can be thought of as defining a new coordinate system in which the matrix takes on a simpler form (a diagonal matrix)

Computing eigenvalues and eigenvectors

Characteristic equation

  • To find the eigenvalues of an n x n matrix A, solve the characteristic equation det(A - λI) = 0, where I is the n x n identity matrix
    • Expand the determinant to obtain a polynomial in λ, called the characteristic polynomial
    • The degree of the characteristic polynomial equals the size of the matrix
  • The solutions to the characteristic equation (the roots of the characteristic polynomial) are the eigenvalues of the matrix A
    • The eigenvalues can be real or complex numbers
    • The sum of the eigenvalues equals the trace of the matrix (sum of the diagonal elements)
    • The product of the eigenvalues equals the determinant of the matrix
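The trace and determinant identities above give a quick sanity check on computed eigenvalues. A short sketch with an assumed 2×2 example (its characteristic polynomial is λ² − 7λ + 10 = (λ − 5)(λ − 2)):

```python
import numpy as np

A = np.array([[4.0, 1.0],
              [2.0, 3.0]])

# Eigenvalues are the roots of det(A - λI) = 0.
eigenvalues = np.linalg.eigvals(A)
print(sorted(eigenvalues.real))  # [2.0, 5.0]

# Sum of eigenvalues equals the trace; product equals the determinant.
print(np.isclose(eigenvalues.sum(), np.trace(A)))        # True: 2 + 5 = 7
print(np.isclose(eigenvalues.prod(), np.linalg.det(A)))  # True: 2 * 5 = 10
```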

Finding eigenvectors

  • For each eigenvalue λ, solve the equation (A - λI)x = 0 to find the corresponding eigenvectors
    • This equation represents a homogeneous system of linear equations
    • The solution set of this system is the eigenspace corresponding to the eigenvalue λ
  • Eigenvectors are not unique; if x is an eigenvector, then any nonzero scalar multiple of x is also an eigenvector for the same eigenvalue
    • To find a basis for the eigenspace, solve the system (A - λI)x = 0 and express the solution in terms of free variables
  • Normalize eigenvectors by dividing them by their magnitude to obtain unit eigenvectors, which have a magnitude of 1
    • Unit eigenvectors are unique up to a sign change (multiplication by -1)
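The two facts above can be verified directly: each computed eigenvector solves the homogeneous system (A − λI)x = 0, and `numpy.linalg.eig` happens to return its eigenvectors already normalized to unit length. A sketch with an assumed example matrix:

```python
import numpy as np

A = np.array([[4.0, 1.0],
              [2.0, 3.0]])

# Columns of `vecs` are eigenvectors matching the order of `vals`.
vals, vecs = np.linalg.eig(A)

for i, lam in enumerate(vals):
    x = vecs[:, i]
    # x solves the homogeneous system (A - λI)x = 0 ...
    print(np.allclose((A - lam * np.eye(2)) @ x, 0))  # True
    # ... and is returned as a unit eigenvector (magnitude 1).
    print(np.isclose(np.linalg.norm(x), 1.0))         # True
```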

Matrix diagonalization using eigenvalues

Diagonalizability conditions

  • A square matrix A is diagonalizable if it has n linearly independent eigenvectors, where n is the size of the matrix
    • Equivalently, A is diagonalizable if the sum of the dimensions of its eigenspaces equals n
  • If A is diagonalizable, then A = PDP^(-1), where D is a diagonal matrix with the eigenvalues of A on its main diagonal, and P is a matrix whose columns are the corresponding eigenvectors of A
    • The eigenvectors in P must be linearly independent
    • If A is symmetric, its eigenvectors can be chosen to be orthonormal, in which case P is an orthogonal matrix (P^(-1) = P^T)
  • The process of finding matrices P and D is called diagonalization or eigendecomposition

Diagonalization procedure

  1. Find the eigenvalues of the matrix A by solving the characteristic equation det(A - λI) = 0
  2. For each distinct eigenvalue λ_i, find a basis for the corresponding eigenspace by solving (A - λ_iI)x = 0
  3. Construct the matrix P by arranging the eigenvectors (or their multiples) as columns
    • Ensure that the eigenvectors in P are linearly independent
    • If A is symmetric, choose orthonormal eigenvectors to obtain an orthogonal matrix P
  4. Construct the diagonal matrix D by placing the eigenvalues λ_i on the main diagonal, in the same order as their corresponding eigenvectors in P
  5. Verify that A = PDP^(-1) (or A = PDP^T if P is orthogonal)
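The five steps above can be carried out and verified numerically. A sketch using an assumed example matrix, with `numpy.linalg.eig` handling steps 1–2 in a single call:

```python
import numpy as np

A = np.array([[4.0, 1.0],
              [2.0, 3.0]])

# Steps 1-2: eigenvalues, with eigenvectors as columns (step 3: this is P).
eigenvalues, P = np.linalg.eig(A)

# Step 4: diagonal matrix D with eigenvalues in the same order as P's columns.
D = np.diag(eigenvalues)

# Step 5: verify the factorization A = P D P^(-1).
print(np.allclose(A, P @ D @ np.linalg.inv(P)))  # True
```

Keeping the order of eigenvalues in D matched to the column order of P is the most common source of error when diagonalizing by hand.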

Repeated eigenvalues

  • If a matrix has repeated eigenvalues, it is diagonalizable if and only if it has a full set of linearly independent eigenvectors (i.e., the geometric multiplicity equals the algebraic multiplicity for each eigenvalue)
    • If the geometric multiplicity is less than the algebraic multiplicity for any eigenvalue, the matrix is not diagonalizable
  • When a matrix has repeated eigenvalues, the eigenspace of a repeated eigenvalue may supply fewer linearly independent eigenvectors than the eigenvalue's algebraic multiplicity
    • In that case there are not enough linearly independent eigenvectors to fill the columns of P, and the matrix cannot be diagonalized
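A standard example of this failure mode (chosen here for illustration) is a shear matrix: the eigenvalue 1 has algebraic multiplicity 2, but its eigenspace is only one-dimensional, so the matrix is not diagonalizable:

```python
import numpy as np

# Shear matrix: eigenvalue 1 appears twice as a root of (1 - λ)^2 = 0,
# so its algebraic multiplicity is 2.
A = np.array([[1.0, 1.0],
              [0.0, 1.0]])

print(np.linalg.eigvals(A))  # [1. 1.]  -- repeated eigenvalue

# rank(A - 1*I) = 1, so the eigenspace has dimension 2 - 1 = 1:
# geometric multiplicity 1 < algebraic multiplicity 2, hence not diagonalizable.
print(np.linalg.matrix_rank(A - np.eye(2)))  # 1
```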

Eigenvalues and eigenvectors for differential equations

First-order linear systems

  • A system of n linear first-order differential equations can be written in the form x' = Ax, where x is a vector of functions and A is an n x n matrix of constants
    • The vector x represents the state of the system, and the matrix A describes how the state evolves over time
  • If A is diagonalizable, the system can be solved using the eigenvalues and eigenvectors of A
    • The eigenvalues of A determine the stability and long-term behavior of the system
    • The eigenvectors of A determine the principal directions or modes of the system's evolution

Diagonalization method

  1. Diagonalize the matrix A to obtain A = PDP^(-1), where D is a diagonal matrix of eigenvalues and P is a matrix of corresponding eigenvectors
  2. Substitute A = PDP^(-1) into the original equation to get x' = PDP^(-1)x
  3. Define a new vector y = P^(-1)x, so that x = Py
    • The vector y represents the state of the system in the basis of eigenvectors
  4. Rewrite the system in terms of y to obtain y' = Dy, which is a system of n independent linear differential equations
    • Each equation in the system has the form y_i' = λ_i y_i, where λ_i is an eigenvalue from the diagonal of D
  5. Solve each equation in the system y' = Dy separately, using the eigenvalues from the diagonal of D
    • The solution to y_i' = λ_i y_i is y_i(t) = c_i e^(λ_i t), where c_i is a constant determined by initial conditions
  6. Reconstruct the solution to the original system by substituting the solutions for y back into x = Py
    • The final solution will be a linear combination of exponential functions, with coefficients determined by the initial conditions and the eigenvectors
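The six steps above can be sketched in a few lines. The matrix and initial condition below are assumptions for illustration; the matrix is symmetric with eigenvalues −2 and −4, so solutions decay:

```python
import numpy as np

# Solve x' = Ax by diagonalization, following the steps above.
A = np.array([[-3.0, 1.0],
              [ 1.0, -3.0]])   # eigenvalues -2 and -4
x0 = np.array([1.0, 0.0])      # initial condition x(0)

eigenvalues, P = np.linalg.eig(A)   # steps 1-2: A = P D P^(-1)
c = np.linalg.solve(P, x0)          # step 3: y(0) = P^(-1) x(0) gives the c_i

def x(t):
    # Steps 4-6: y_i(t) = c_i e^(λ_i t), then map back with x(t) = P y(t).
    return P @ (c * np.exp(eigenvalues * t))

print(np.allclose(x(0.0), x0))         # True: solution matches x(0)
print(np.linalg.norm(x(5.0)) < 1e-3)   # True: negative eigenvalues → decay
```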

Stability analysis

  • The stability of the system x' = Ax can be determined by examining the eigenvalues of A
    • If all eigenvalues have negative real parts, the system is asymptotically stable (solutions converge to zero as t → ∞)
    • If any eigenvalue has a positive real part, the system is unstable (solutions grow unbounded as t → ∞)
    • If all eigenvalues have non-positive real parts and at least one eigenvalue has a zero real part, the system is marginally stable (solutions remain bounded but may not converge to zero)
  • The eigenvectors corresponding to the eigenvalues with zero real parts determine the long-term behavior of the system
    • These eigenvectors form a basis for the center subspace, which contains all solutions that remain bounded but do not necessarily converge to zero
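The three stability cases can be coded as a simple classifier on the real parts of the eigenvalues. A sketch with assumed example matrices; note that calling a zero real part "marginally stable" is only valid when such eigenvalues are non-defective:

```python
import numpy as np

def classify_stability(A, tol=1e-12):
    """Classify x' = Ax by the real parts of A's eigenvalues."""
    re = np.linalg.eigvals(A).real
    if np.all(re < -tol):
        return "asymptotically stable"   # all real parts negative
    if np.any(re > tol):
        return "unstable"                # some real part positive
    return "marginally stable"           # zero real parts, none positive

print(classify_stability(np.array([[-1.0, 0.0], [0.0, -2.0]])))  # asymptotically stable
print(classify_stability(np.array([[ 1.0, 0.0], [0.0, -2.0]])))  # unstable
print(classify_stability(np.array([[ 0.0, 1.0], [-1.0, 0.0]])))  # marginally stable (eigenvalues ±i)
```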
© 2024 Fiveable Inc. All rights reserved.
AP® and SAT® are trademarks registered by the College Board, which is not affiliated with, and does not endorse this website.