
Self-adjoint operators are linear operators equal to their adjoint, with real eigenvalues and orthogonal eigenspaces. They're crucial in quantum mechanics and data analysis. Their properties make them ideal for representing physical observables and analyzing complex datasets.

Hermitian matrices are the matrix representation of self-adjoint operators in finite-dimensional spaces. They share similar properties, including real eigenvalues and orthogonal eigenvectors. The spectral theorem allows for diagonalization, enabling efficient computation of matrix functions and applications in various fields.

Self-adjoint operators in inner product spaces

Definition and properties

  • A self-adjoint operator is a linear operator equal to its own adjoint
    • For a linear operator $T$ on an inner product space $V$, $T$ is self-adjoint if $\langle Tx, y \rangle = \langle x, Ty \rangle$ for all $x, y \in V$ (a numerical check of this identity appears after this list)
  • Self-adjoint operators are bounded and have real eigenvalues
    • The eigenspaces corresponding to distinct eigenvalues are orthogonal
  • If $T$ is a self-adjoint operator on a finite-dimensional inner product space $V$, there exists an orthonormal basis for $V$ consisting of eigenvectors of $T$
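
A minimal numerical sketch of the defining identity, treating a randomly generated Hermitian matrix as the operator $T$ on $\mathbb{C}^3$ with the standard inner product. The matrix, vectors, and random seed are illustrative assumptions, not taken from the text:

```python
import numpy as np

rng = np.random.default_rng(0)

# Random Hermitian matrix T = (B + B*)/2: represents a self-adjoint operator
# on C^3 with the standard inner product <u, v> = sum_i u_i * conj(v_i).
B = rng.standard_normal((3, 3)) + 1j * rng.standard_normal((3, 3))
T = (B + B.conj().T) / 2

x = rng.standard_normal(3) + 1j * rng.standard_normal(3)
y = rng.standard_normal(3) + 1j * rng.standard_normal(3)


def inner(u, v):
    """Standard complex inner product <u, v>; np.vdot conjugates its first argument."""
    return np.vdot(v, u)


print(np.isclose(inner(T @ x, y), inner(x, T @ y)))  # True: <Tx, y> = <x, Ty>
```

For a matrix that is not Hermitian, the two inner products will generally differ.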

Algebraic properties

  • The set of self-adjoint operators on an inner product space forms a real vector space under the usual addition and scalar multiplication of operators
  • The composition of two self-adjoint operators is self-adjoint if and only if the operators commute
    • For self-adjoint operators $S$ and $T$, $ST = TS$ is a necessary and sufficient condition for $ST$ to be self-adjoint (a short derivation follows this list)
  • The sum of two self-adjoint operators is always self-adjoint
    • If $S$ and $T$ are self-adjoint, then $S + T$ is also self-adjoint
  • Scalar multiples of self-adjoint operators are self-adjoint
    • If $T$ is self-adjoint and $c \in \mathbb{R}$, then $cT$ is also self-adjoint
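
A one-line check of the commutativity criterion above, using the fact that taking adjoints reverses the order of a composition: for self-adjoint $S$ and $T$,

$$(ST)^* = T^*S^* = TS,$$

so $(ST)^* = ST$ holds precisely when $ST = TS$.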

Eigenvalues and eigenvectors of self-adjoint operators

Eigenvalue properties

  • Eigenvalues of a self-adjoint operator are always real
    • If $\lambda$ is an eigenvalue of a self-adjoint operator $T$, then $\lambda \in \mathbb{R}$
  • Eigenvectors corresponding to distinct eigenvalues of a self-adjoint operator are orthogonal
    • If $v_1$ and $v_2$ are eigenvectors of a self-adjoint operator $T$ with distinct eigenvalues $\lambda_1$ and $\lambda_2$, then $\langle v_1, v_2 \rangle = 0$ (both facts are verified numerically after this list)
  • The algebraic and geometric multiplicities of each eigenvalue of a self-adjoint operator are equal
    • For any eigenvalue $\lambda$ of a self-adjoint operator $T$, the dimension of the eigenspace corresponding to $\lambda$ equals the multiplicity of $\lambda$ as a root of the characteristic polynomial of $T$
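
A short NumPy sketch of both eigenvalue facts; the $2 \times 2$ Hermitian matrix below is an assumed example, not from the text:

```python
import numpy as np

# Illustrative Hermitian matrix (an assumed example).
A = np.array([[2.0, 1j],
              [-1j, 3.0]])

# The general eigensolver returns complex eigenvalues; for a Hermitian matrix
# their imaginary parts vanish (up to round-off).
general_eigenvalues = np.linalg.eigvals(A)
print(np.allclose(general_eigenvalues.imag, 0))   # True: the eigenvalues are real

# eigh, the solver for Hermitian matrices, returns real eigenvalues and an
# orthonormal set of eigenvectors (as columns).
eigenvalues, eigenvectors = np.linalg.eigh(A)
v1, v2 = eigenvectors[:, 0], eigenvectors[:, 1]
print(np.isclose(np.vdot(v1, v2), 0))             # True: <v1, v2> = 0 for distinct eigenvalues
```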

Spectral properties

  • A self-adjoint operator on a finite-dimensional inner product space has a complete set of orthonormal eigenvectors that form a basis for the space
    • This set of eigenvectors is called an orthonormal eigenbasis
  • Any vector in the inner product space can be expressed as a linear combination of the orthonormal eigenvectors
    • For a vector $v$ in an inner product space $V$ with orthonormal eigenbasis $\{u_1, u_2, \ldots, u_n\}$, $v = \sum_{i=1}^n \langle v, u_i \rangle u_i$
  • The eigenvalues of a self-adjoint operator can be used to calculate the operator's trace and determinant
    • For a self-adjoint operator $T$ with eigenvalues $\lambda_1, \lambda_2, \ldots, \lambda_n$, $\text{tr}(T) = \sum_{i=1}^n \lambda_i$ and $\det(T) = \prod_{i=1}^n \lambda_i$
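
A sketch of the eigenbasis expansion and the trace/determinant identities, again with an illustrative randomly generated Hermitian matrix (all values below are assumptions for the demo):

```python
import numpy as np

rng = np.random.default_rng(1)

# Illustrative Hermitian matrix and vector.
B = rng.standard_normal((4, 4)) + 1j * rng.standard_normal((4, 4))
T = (B + B.conj().T) / 2
v = rng.standard_normal(4) + 1j * rng.standard_normal(4)

eigenvalues, U = np.linalg.eigh(T)       # columns of U form an orthonormal eigenbasis

# Expand v in the eigenbasis: v = sum_i <v, u_i> u_i, where <v, u_i> = conj(u_i) . v
coeffs = U.conj().T @ v
print(np.allclose(U @ coeffs, v))        # True: the expansion reconstructs v

# Trace and determinant from the eigenvalues.
print(np.isclose(np.trace(T), eigenvalues.sum()))       # True
print(np.isclose(np.linalg.det(T), eigenvalues.prod())) # True
```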

Self-adjoint operators vs Hermitian matrices

Hermitian matrices

  • A matrix $A$ is Hermitian if $A = A^*$, where $A^*$ denotes the conjugate transpose of $A$
    • Hermitian matrices are the matrix representation of self-adjoint operators on finite-dimensional inner product spaces
  • The eigenvalues of a Hermitian matrix are always real, and the eigenvectors corresponding to distinct eigenvalues are orthogonal
  • Every Hermitian matrix is unitarily diagonalizable
    • There exists a unitary matrix $U$ such that $U^*AU$ is a diagonal matrix with the eigenvalues of $A$ on the diagonal
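
A brief NumPy check of these points with an assumed $3 \times 3$ matrix: $A = A^*$, the eigenvector matrix returned by eigh is unitary, and it diagonalizes $A$:

```python
import numpy as np

# Illustrative Hermitian matrix (an assumed example).
A = np.array([[1.0, 2 - 1j, 0],
              [2 + 1j, 0.0, 1j],
              [0, -1j, 3.0]])

print(np.allclose(A, A.conj().T))        # True: A equals its conjugate transpose

eigenvalues, U = np.linalg.eigh(A)       # U holds orthonormal eigenvectors as columns

print(np.allclose(U.conj().T @ U, np.eye(3)))                 # True: U is unitary
print(np.allclose(U.conj().T @ A @ U, np.diag(eigenvalues)))  # True: U*AU is diagonal
```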

Algebraic properties

  • The set of Hermitian matrices forms a real vector space under the usual matrix addition and scalar multiplication
  • The product of two Hermitian matrices is Hermitian if and only if the matrices commute
    • For Hermitian matrices $A$ and $B$, $AB = BA$ is a necessary and sufficient condition for $AB$ to be Hermitian (see the numerical illustration after this list)
  • The sum of two Hermitian matrices is always Hermitian
    • If $A$ and $B$ are Hermitian, then $A + B$ is also Hermitian
  • Scalar multiples of Hermitian matrices are Hermitian
    • If $A$ is Hermitian and $c \in \mathbb{R}$, then $cA$ is also Hermitian
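
A numerical illustration of the commutation criterion with two assumed $2 \times 2$ Hermitian matrices (Pauli-like choices): their product fails to be Hermitian, while the product of commuting Hermitian matrices is Hermitian:

```python
import numpy as np

# Two Hermitian matrices that do NOT commute (assumed example values).
A = np.array([[0, 1], [1, 0]], dtype=complex)     # Pauli-like sigma_x
B = np.array([[0, -1j], [1j, 0]])                 # Pauli-like sigma_y

AB = A @ B
print(np.allclose(A @ B, B @ A))          # False: A and B do not commute
print(np.allclose(AB, AB.conj().T))       # False: AB is not Hermitian

# A commuting pair: any polynomial in A is Hermitian and commutes with A.
C = A @ A + 2 * A
CA = C @ A
print(np.allclose(C @ A, A @ C))          # True: C and A commute
print(np.allclose(CA, CA.conj().T))       # True: their product is Hermitian
```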

Spectral theorem for self-adjoint operators

Diagonalization of Hermitian matrices

  • The spectral theorem states that if $T$ is a self-adjoint operator on a finite-dimensional inner product space $V$, then there exists an orthonormal basis for $V$ consisting of eigenvectors of $T$, and $T$ can be represented as a diagonal matrix with respect to this basis
  • To diagonalize a Hermitian matrix $A$, find an orthonormal basis of eigenvectors $\{v_1, v_2, \ldots, v_n\}$ and form a unitary matrix $U$ with these eigenvectors as columns
    • Then $U^*AU = D$, where $D$ is a diagonal matrix with the eigenvalues of $A$ on the diagonal
  • The spectral decomposition of a Hermitian matrix $A$ is given by $A = UDU^*$, where $U$ is a unitary matrix whose columns are eigenvectors of $A$, and $D$ is a diagonal matrix with the eigenvalues of $A$ on the diagonal
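
A sketch of the spectral decomposition in NumPy, verifying $A = UDU^*$ and the equivalent rank-one form $A = \sum_i \lambda_i u_i u_i^*$; the matrix is randomly generated purely for illustration:

```python
import numpy as np

rng = np.random.default_rng(2)

# Illustrative Hermitian matrix.
B = rng.standard_normal((3, 3)) + 1j * rng.standard_normal((3, 3))
A = (B + B.conj().T) / 2

eigenvalues, U = np.linalg.eigh(A)
D = np.diag(eigenvalues)

# Spectral decomposition A = U D U*.
print(np.allclose(A, U @ D @ U.conj().T))   # True

# Equivalently, A is a sum of eigenvalue-weighted rank-one projections.
A_rebuilt = sum(lam * np.outer(u, u.conj())
                for lam, u in zip(eigenvalues, U.T))
print(np.allclose(A, A_rebuilt))            # True
```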

Applications of the spectral theorem

  • The spectral theorem allows for the computation of matrix functions of Hermitian matrices
    • If $f$ is a function defined on the eigenvalues of a Hermitian matrix $A$, then $f(A) = Uf(D)U^*$, where $f(D)$ is the diagonal matrix obtained by applying $f$ to each diagonal entry of $D$ (a worked sketch follows this list)
  • The spectral theorem is used in quantum mechanics to represent observables as self-adjoint operators and to calculate their expectation values and probabilities
    • The eigenvalues of the observable correspond to the possible measurement outcomes, and the eigenvectors represent the states in which the system is found after the measurement
  • The spectral theorem is also applied in signal processing and data analysis to perform principal component analysis (PCA) and singular value decomposition (SVD)
    • These techniques help in dimensionality reduction, feature extraction, and noise reduction by identifying the most significant eigenvectors and eigenvalues of the data covariance matrix
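
As a concrete instance of the matrix-function bullet above, a sketch that computes the square root of an assumed positive-definite Hermitian matrix via $f(A) = Uf(D)U^*$:

```python
import numpy as np

# Illustrative positive-definite Hermitian matrix (eigenvalues 1 and 3).
A = np.array([[2.0, 1j],
              [-1j, 2.0]])

eigenvalues, U = np.linalg.eigh(A)

# f(A) = U f(D) U* with f(x) = sqrt(x): apply f to the eigenvalues only.
sqrt_A = U @ np.diag(np.sqrt(eigenvalues)) @ U.conj().T

print(np.allclose(sqrt_A @ sqrt_A, A))        # True: (A^{1/2})^2 = A
print(np.allclose(sqrt_A, sqrt_A.conj().T))   # True: A^{1/2} is also Hermitian
```

The same pattern gives other matrix functions (for example the matrix exponential) by replacing the square root applied to the eigenvalues.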