
Inner product spaces let us measure angles and distances between vectors. This section dives into orthogonal complements: for a given subspace, the subspace of all vectors orthogonal to every vector in it. We'll see how these complements relate to the structure of the original space.

We'll also explore orthogonal projections, which find the closest vector in a subspace to a given vector. This concept is crucial for solving least-squares problems and for decomposing vectors into orthogonal components.

Orthogonal Complements and Properties

Definition and Basic Properties

  • Orthogonal complement of subspace W in V, denoted W⊥, contains all vectors in V orthogonal to every vector in W
  • For finite-dimensional inner product space V and subspace W, (W⊥)⊥ = W
  • Dimension relationship dim(W) + dim(W⊥) = dim(V) holds for finite-dimensional inner product spaces (verified numerically after this list)
  • Orthogonal complement of a subspace forms a subspace of the inner product space
  • Zero subspace {0} has orthogonal complement V, and V has orthogonal complement {0}
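
These properties can be checked numerically. Below is a minimal sketch in Python with NumPy (the basis matrix B and the tolerance are illustrative choices, not part of the text above): the rows of B span W inside V = R³, and W⊥ is the null space of B, read off from the SVD.

    import numpy as np

    # Rows of B span the subspace W inside V = R^3, so dim(W) = 2.
    B = np.array([[1.0, 0.0, 2.0],
                  [0.0, 1.0, -1.0]])

    # W-perp is the null space of B: the right singular vectors
    # whose singular values are (numerically) zero.
    _, s, Vt = np.linalg.svd(B)
    rank = int(np.sum(s > 1e-10))
    W_perp = Vt[rank:]                  # rows span W-perp

    # dim(W) + dim(W-perp) = dim(V)
    assert rank + W_perp.shape[0] == B.shape[1]

    # Every spanning vector of W-perp is orthogonal to every row of B.
    assert np.allclose(B @ W_perp.T, 0.0)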

Properties Involving Multiple Subspaces

  • For subspaces U and W in V, (U + W)⊥ = U⊥ ∩ W⊥
  • Intersection and sum relationship (U ∩ W)⊥ = U⊥ + W⊥ holds for subspaces U and W of a finite-dimensional inner product space
  • Linear transformation T : V → W between inner product spaces yields ker(T*) = (range(T))⊥ (derived after this list)
  • Adjoint T* of linear transformation T satisfies range(T*) = (ker(T))⊥
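
The kernel and range identities above follow in one line from the defining property of the adjoint, ⟨Tv, x⟩ = ⟨v, T*x⟩ for all v in V and x in W:

    $x \in \ker(T^*) \iff \langle v, T^*x \rangle = 0 \text{ for all } v \in V \iff \langle Tv, x \rangle = 0 \text{ for all } v \in V \iff x \in (\mathrm{range}(T))^{\perp}$

(for the middle step, take v = T*x in the reverse direction). Applying the same identity to T* and taking orthogonal complements yields the companion fact range(T*) = (ker(T))⊥.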

Orthogonal Projections onto Subspaces

Computation and Properties

  • Orthogonal projection of vector v onto subspace W represents the closest vector in W to v
  • Formula for orthogonal projection onto W with orthonormal basis {u₁, ..., uₖ}: $\mathrm{proj}_W(v) = \sum_{i=1}^{k} \langle v, u_i \rangle u_i$
  • Projection matrix for orthogonal projection onto W: $P = U(U^*U)^{-1}U^*$, where the columns of U form a basis for W
  • Orthogonal projection P onto W satisfies P² = P (idempotent) and P* = P (self-adjoint)
  • Orthogonal projection onto W⊥ calculated as v - proj_W(v)
  • Error vector e = v - proj_W(v) orthogonal to all vectors in W (see the sketch below)
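
A short sketch of both recipes in Python with NumPy (the matrix U and vector v are illustrative): the projection matrix, the orthonormal-basis sum, and the orthogonality of the error vector all check out.

    import numpy as np

    U = np.array([[1.0, 1.0],
                  [1.0, 0.0],
                  [0.0, 1.0]])        # columns span W (independent, not orthonormal)
    v = np.array([3.0, 1.0, 2.0])

    # Projection matrix P = U (U*U)^{-1} U*
    P = U @ np.linalg.solve(U.T @ U, U.T)
    assert np.allclose(P @ P, P)      # idempotent: P^2 = P
    assert np.allclose(P, P.T)        # self-adjoint (real case): P* = P

    # Same projection via an orthonormal basis of W (QR supplies one)
    Q, _ = np.linalg.qr(U)
    proj = sum(np.dot(v, q) * q for q in Q.T)
    assert np.allclose(P @ v, proj)

    # Error vector e = v - proj_W(v) is orthogonal to W
    e = v - proj
    assert np.allclose(U.T @ e, 0.0)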

Orthogonal Decomposition

  • Vector v in V decomposes as v = proj_W(v) + proj_W⊥(v)
  • Decomposition represents v as sum of components in W and W⊥
  • Orthogonal decomposition unique for each vector v in V (worked example below)
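
A concrete instance in R³ with the standard inner product, taking W = span{e₁, e₂} (the xy-plane) and v = (1, 2, 3):

    $\mathrm{proj}_W(v) = (1, 2, 0), \qquad \mathrm{proj}_{W^\perp}(v) = v - \mathrm{proj}_W(v) = (0, 0, 3)$

and v = (1, 2, 0) + (0, 0, 3) is the only way to split v into a vector in W plus a vector in W⊥.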

Orthogonal Decomposition Theorem

Statement and Proof

  • Theorem states every vector v in V uniquely expressed as v = w + w⊥, w in W and w⊥ in W⊥
  • Existence proof constructs w = proj_W(v) and w⊥ = v - proj_W(v)
  • Uniqueness proof assumes two decompositions v = w₁ + w₁⊥ = w₂ + w₂⊥ and shows w₁ = w₂ and w₁⊥ = w₂⊥
  • Proof utilizes inner product properties and orthogonal complement definition
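
The uniqueness step in full: if v = w₁ + w₁⊥ = w₂ + w₂⊥, then

    $w_1 - w_2 = w_2^{\perp} - w_1^{\perp} \in W \cap W^{\perp} = \{0\}$

since the left-hand side lies in W and the right-hand side lies in W⊥; hence w₁ = w₂ and w₁⊥ = w₂⊥.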

Implications and Applications

  • Theorem implies V = W ⊕ W⊥ (orthogonal direct sum) for any subspace W of V
  • Unique decomposition of vectors into projections onto W and W⊥
  • Fundamental theorem in understanding inner product space structure
  • Applications in linear algebra, functional analysis, and quantum mechanics (state vector decomposition)

Least-Squares Problems with Projections

Problem Formulation and Solution

  • Least-squares problem minimizes ||Ax - b||² for matrix A and vector b
  • Normal equations A*Ax̂ = A*b characterize the least-squares solution (solved numerically below)
  • Solution x̂ given by $\hat{x} = (A^*A)^{-1}A^*b$ when A*A invertible
  • Geometrically, solution finds closest vector Ax̂ in column space of A to b
  • Orthogonal projection of b onto column space of A: $\mathrm{proj}_{\mathrm{col}(A)}(b) = A(A^*A)^{-1}A^*b$
  • Residual vector r = b - Ax̂ orthogonal to column space of A
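
A compact numerical sketch in Python with NumPy (the data t, b are illustrative): fitting a line y ≈ c₀ + c₁t by solving the normal equations, checking against NumPy's built-in solver, and confirming the residual is orthogonal to col(A).

    import numpy as np

    # Three data points; columns of the model matrix A are [1, t].
    t = np.array([0.0, 1.0, 2.0])
    b = np.array([1.0, 2.0, 2.5])
    A = np.column_stack([np.ones_like(t), t])

    # Normal equations: A* A x = A* b
    x_hat = np.linalg.solve(A.T @ A, A.T @ b)

    # Agrees with NumPy's least-squares solver
    x_ref, *_ = np.linalg.lstsq(A, b, rcond=None)
    assert np.allclose(x_hat, x_ref)

    # A @ x_hat is the orthogonal projection of b onto col(A),
    # so the residual is orthogonal to every column of A.
    r = b - A @ x_hat
    assert np.allclose(A.T @ r, 0.0)

This is exactly the linear-regression setup described below: A holds the predictor columns and b the observed responses.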

Applications and Significance

  • Data fitting applications (linear regression, polynomial fitting)
  • Model matrix A represents predictor variables, b represents observed data
  • Signal processing applications (noise reduction, signal approximation)
  • Parameter estimation in scientific and engineering fields (system identification)
  • Statistical analysis applications (ANOVA, multiple regression)