➗ Abstract Linear Algebra II Unit 4 – Inner Product Spaces
Inner product spaces blend vector spaces with inner products, generalizing the dot product. They introduce concepts like norms, angles, and orthogonality, providing a rich framework for geometric intuition in abstract spaces.
These spaces are crucial in linear algebra, quantum mechanics, and functional analysis. They enable powerful techniques like orthogonal projections, Gram-Schmidt process, and best approximations, forming the foundation for advanced mathematical and physical theories.
Inner product spaces combine vector spaces with an inner product, a generalization of the dot product
Inner products are denoted as ⟨u,v⟩ for vectors u and v
Must satisfy conjugate symmetry, linearity in the first argument, and positive definiteness
Norm of a vector v is defined as ∥v∥ = √⟨v,v⟩, generalizing the concept of length
Cauchy-Schwarz inequality states that |⟨u,v⟩| ≤ ∥u∥ ∥v∥ for any vectors u and v
Equality holds if and only if u and v are linearly dependent
Angle between two vectors u and v can be defined using the inner product as cos θ = ⟨u,v⟩ / (∥u∥ ∥v∥)
Orthogonality: two vectors u and v are orthogonal if ⟨u,v⟩=0
Orthonormal basis is a basis consisting of orthogonal unit vectors (vectors with norm 1)
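These definitions are easy to check numerically. Below is a minimal sketch, assuming NumPy is available; the vectors u and v are arbitrary examples and the standard dot product plays the role of the inner product.

```python
import numpy as np

# Inner product, norm, Cauchy-Schwarz, angle, and orthogonality in R^3,
# using the standard dot product as the inner product.
u = np.array([1.0, 2.0, 2.0])
v = np.array([3.0, 0.0, 4.0])

inner = np.dot(u, v)                 # ⟨u, v⟩
norm_u = np.sqrt(np.dot(u, u))       # ∥u∥ = √⟨u, u⟩
norm_v = np.sqrt(np.dot(v, v))

# Cauchy-Schwarz: |⟨u, v⟩| ≤ ∥u∥ ∥v∥
print(abs(inner) <= norm_u * norm_v)                 # True

# Angle: cos θ = ⟨u, v⟩ / (∥u∥ ∥v∥)
theta = np.arccos(inner / (norm_u * norm_v))
print(np.degrees(theta))                             # ≈ 42.8 degrees

# Orthogonality: the standard basis vectors are orthonormal
print(np.dot([1, 0, 0], [0, 1, 0]) == 0)             # True
```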
Properties of Inner Product Spaces
Inner product spaces are normed vector spaces, with the norm induced by the inner product
Parallelogram law holds in inner product spaces: ∥u+v∥² + ∥u−v∥² = 2(∥u∥² + ∥v∥²) for any vectors u and v
Polarization identity allows the inner product to be recovered from the norm: in a real inner product space, ⟨u,v⟩ = (1/4)(∥u+v∥² − ∥u−v∥²); the complex case adds analogous imaginary terms
Orthogonal complements: for any finite-dimensional subspace U of an inner product space V (more generally, any closed subspace of a Hilbert space), there is an orthogonal complement U⊥ with V = U ⊕ U⊥
U⊥={v∈V:⟨u,v⟩=0 for all u∈U}
Pythagorean theorem generalizes to inner product spaces: if u_1, …, u_n are pairwise orthogonal, then ∥∑_{i=1}^n u_i∥² = ∑_{i=1}^n ∥u_i∥²
Bessel's inequality states that for any orthonormal set {e_1, …, e_n} and vector v, ∑_{i=1}^n |⟨v,e_i⟩|² ≤ ∥v∥²
Equality holds for a given v exactly when v lies in the span of {e_1, …, e_n}; in particular, it holds for every v when the set is an orthonormal basis
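A quick numerical check of the parallelogram law, the real polarization identity, and Bessel's inequality; this sketch assumes NumPy, and the random vectors and the two-element orthonormal set are illustrative choices.

```python
import numpy as np

rng = np.random.default_rng(0)
u, v = rng.standard_normal(4), rng.standard_normal(4)
norm = np.linalg.norm

# Parallelogram law: ∥u+v∥² + ∥u−v∥² = 2(∥u∥² + ∥v∥²)
print(np.isclose(norm(u + v)**2 + norm(u - v)**2,
                 2 * (norm(u)**2 + norm(v)**2)))             # True

# Polarization identity (real case): ⟨u,v⟩ = (1/4)(∥u+v∥² − ∥u−v∥²)
print(np.isclose(np.dot(u, v),
                 0.25 * (norm(u + v)**2 - norm(u - v)**2)))  # True

# Bessel's inequality for the orthonormal set {e_1, e_2} in R^4
E = np.eye(4)[:, :2]            # columns are e_1, e_2
coeffs = E.T @ u                # the ⟨u, e_i⟩
print(np.sum(coeffs**2) <= norm(u)**2)                       # True
```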
Examples and Non-Examples
Euclidean spaces R^n with the standard dot product are inner product spaces
For u = (u_1, …, u_n) and v = (v_1, …, v_n), ⟨u,v⟩ = ∑_{i=1}^n u_i v_i
Complex vector spaces C^n with the Hermitian inner product are inner product spaces
For u = (u_1, …, u_n) and v = (v_1, …, v_n), ⟨u,v⟩ = ∑_{i=1}^n u_i v̄_i, where v̄_i is the complex conjugate of v_i
Space of continuous functions C[a,b] with the inner product ⟨f,g⟩ = ∫_a^b f(x) g(x) dx is an inner product space
Space of square-integrable functions L²[a,b] with the inner product ⟨f,g⟩ = ∫_a^b f(x) g(x) dx is an inner product space
Not all normed vector spaces are inner product spaces
ℓ^p spaces with p ≠ 2 are normed vector spaces but not inner product spaces
Not all bilinear forms on vector spaces are inner products
Bilinear form B(u,v) = u_1 v_1 − u_2 v_2 on R² is not an inner product (fails positive definiteness)
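A short sketch, assuming NumPy, of why the two non-examples fail: the ℓ¹ norm on R² violates the parallelogram law (so no inner product can induce it), and the indefinite bilinear form B fails positive definiteness.

```python
import numpy as np

# The ℓ¹ norm on R² fails the parallelogram law, so it is not induced
# by any inner product.
l1 = lambda x: np.sum(np.abs(x))
u, v = np.array([1.0, 0.0]), np.array([0.0, 1.0])
lhs = l1(u + v)**2 + l1(u - v)**2        # 4 + 4 = 8
rhs = 2 * (l1(u)**2 + l1(v)**2)          # 2 * (1 + 1) = 4
print(lhs == rhs)                        # False

# The bilinear form B(u, v) = u_1 v_1 − u_2 v_2 fails positive definiteness.
B = lambda u, v: u[0] * v[0] - u[1] * v[1]
w = np.array([0.0, 1.0])
print(B(w, w))                           # -1.0, so B(w, w) < 0 for some w ≠ 0
```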
Orthogonality and Orthonormal Bases
Orthogonality is a generalization of perpendicularity in inner product spaces
Two vectors u and v are orthogonal if ⟨u,v⟩=0
Orthogonal sets of nonzero vectors are linearly independent
If {u_1, …, u_n} is an orthogonal set of nonzero vectors, then it is linearly independent
Orthonormal sets are orthogonal sets consisting of unit vectors (vectors with norm 1)
Orthonormal bases are bases consisting of orthonormal vectors
Provide a convenient way to represent vectors and compute inner products
Parseval's identity: if {e_1, …, e_n} is an orthonormal basis, then every vector v satisfies v = ∑_{i=1}^n ⟨v,e_i⟩ e_i and ∥v∥² = ∑_{i=1}^n |⟨v,e_i⟩|²
Orthogonal projection of a vector v onto a subspace U is the closest point in U to v
Can be computed using an orthonormal basis {e_1, …, e_n} of U as proj_U(v) = ∑_{i=1}^n ⟨v,e_i⟩ e_i
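A minimal sketch of the projection formula and Parseval's identity, assuming NumPy; the subspace U = span{e_1, e_2} (the first two standard basis vectors of R³) and the vector v are chosen only for illustration.

```python
import numpy as np

e1 = np.array([1.0, 0.0, 0.0])
e2 = np.array([0.0, 1.0, 0.0])
v = np.array([3.0, -2.0, 5.0])

# proj_U(v) = Σ ⟨v, e_i⟩ e_i for the orthonormal basis {e_1, e_2} of U
proj = np.dot(v, e1) * e1 + np.dot(v, e2) * e2
print(proj)                                            # [ 3. -2.  0.]

# The residual v − proj_U(v) is orthogonal to U
print(np.dot(v - proj, e1), np.dot(v - proj, e2))      # 0.0 0.0

# Parseval's identity with the full orthonormal basis {e_1, e_2, e_3} of R³
E = np.eye(3)
print(np.isclose(np.sum((E.T @ v)**2), np.dot(v, v)))  # True
```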
Gram-Schmidt Process
Gram-Schmidt process is an algorithm for constructing an orthonormal basis from a linearly independent set
Given a linearly independent set {v_1, …, v_n}, the Gram-Schmidt process produces an orthonormal set {e_1, …, e_n} spanning the same subspace
Algorithm:
Set e_1 = v_1 / ∥v_1∥
For i = 2, …, n:
Set u_i = v_i − ∑_{j=1}^{i−1} ⟨v_i, e_j⟩ e_j
Set e_i = u_i / ∥u_i∥
Resulting set {e_1, …, e_n} is an orthonormal basis for the span of {v_1, …, v_n}
Classical Gram-Schmidt is numerically unstable: rounding errors gradually destroy the orthogonality of the computed vectors
Modified Gram-Schmidt and Householder transformations are more stable alternatives (a sketch of the modified variant follows below)
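The sketch below implements the modified variant, assuming NumPy; the three input vectors are an illustrative linearly independent set in R³.

```python
import numpy as np

def modified_gram_schmidt(V):
    """Orthonormalize the columns of V (assumed linearly independent).

    The modified variant subtracts each projection from the remaining
    vectors immediately, which is more robust to rounding error than
    the classical formulation.
    """
    V = V.astype(float)              # work on a copy
    Q = np.zeros_like(V)
    n = V.shape[1]
    for i in range(n):
        Q[:, i] = V[:, i] / np.linalg.norm(V[:, i])        # e_i = u_i / ∥u_i∥
        for j in range(i + 1, n):
            # remove the component of v_j along e_i right away
            V[:, j] -= np.dot(Q[:, i], V[:, j]) * Q[:, i]
    return Q

# Columns of V are v_1 = (1,1,0), v_2 = (1,0,1), v_3 = (0,1,1)
V = np.array([[1.0, 1.0, 0.0],
              [1.0, 0.0, 1.0],
              [0.0, 1.0, 1.0]]).T
Q = modified_gram_schmidt(V)
print(np.allclose(Q.T @ Q, np.eye(3)))    # True: the columns are orthonormal
```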
Projections and Best Approximations
Orthogonal projection of a vector v onto a subspace U is the closest point in U to v
Denoted as proj_U(v) or P_U(v)
Projection theorem states that for any vector v and subspace U, v = proj_U(v) + proj_{U⊥}(v)
Decomposition is unique and orthogonal
Best approximation problem: given a subspace U and a vector v, find the vector u∈U that minimizes ∥v−u∥
Solution is the orthogonal projection proj_U(v)
Least squares approximation: given a set of data points (x_i, y_i) and a subspace U of functions, find the function f ∈ U that minimizes ∑_{i=1}^n (y_i − f(x_i))²
Solution is the orthogonal projection of the data vector y = (y_1, …, y_n) onto the subspace {(f(x_1), …, f(x_n)) : f ∈ U}
Projections onto orthogonal complements: for any subspace U and vector v, proj_{U⊥}(v) = v − proj_U(v)
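A minimal least-squares sketch, assuming NumPy: fitting a line to sample data is the orthogonal projection of the data vector y onto the column space of the matrix A whose columns are the basis functions 1 and x evaluated at the sample points, so the residual is orthogonal to that subspace. The data values here are made up for illustration.

```python
import numpy as np

x = np.array([0.0, 1.0, 2.0, 3.0])
y = np.array([1.1, 1.9, 3.2, 3.9])

A = np.column_stack([np.ones_like(x), x])        # basis {1, x} sampled at the x_i
coef, *_ = np.linalg.lstsq(A, y, rcond=None)     # minimizes ∥y − A c∥
y_hat = A @ coef                                 # projection of y onto col(A)

# The residual is orthogonal to the subspace: Aᵀ(y − y_hat) = 0
print(np.allclose(A.T @ (y - y_hat), 0.0))       # True
print(coef)                                      # intercept and slope of the best-fit line
```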
Applications in Linear Algebra
Orthogonal diagonalization: symmetric matrices can be diagonalized by an orthonormal basis of eigenvectors
Eigenvalues are real and eigenvectors corresponding to distinct eigenvalues are orthogonal
Singular value decomposition (SVD): any matrix A can be written as A = UΣV*, where U and V are unitary (orthogonal in the real case) and Σ is a diagonal matrix with non-negative entries
Columns of U and V are orthonormal bases for the left and right singular subspaces of A
Principal component analysis (PCA): technique for dimensionality reduction and data compression using orthogonal projections
Finds orthogonal directions (principal components) that maximize the variance of the projected data
Quantum mechanics: state vectors are elements of a complex inner product space (Hilbert space)
Observables are represented by Hermitian operators, and their eigenvectors form an orthonormal basis
Fourier analysis: functions can be represented as linear combinations of orthogonal basis functions (e.g., trigonometric functions, wavelets)
Coefficients are computed using inner products (Fourier coefficients)
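The first two applications above are easy to verify numerically; this sketch assumes NumPy, and the matrices are arbitrary examples.

```python
import numpy as np

# Orthogonal diagonalization of a real symmetric matrix: A = Q diag(w) Qᵀ
A = np.array([[2.0, 1.0],
              [1.0, 2.0]])
w, Q = np.linalg.eigh(A)                         # real eigenvalues, orthonormal eigenvectors
print(np.allclose(Q @ np.diag(w) @ Q.T, A))      # True
print(np.allclose(Q.T @ Q, np.eye(2)))           # True: Q is orthogonal

# Singular value decomposition: B = U Σ V*
B = np.array([[1.0, 0.0, 2.0],
              [0.0, 3.0, 1.0]])
U, s, Vh = np.linalg.svd(B, full_matrices=False)
print(np.allclose(U @ np.diag(s) @ Vh, B))       # True
print(s)                                         # non-negative singular values
```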
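Fourier coefficients, too, can be computed as inner products; in this sketch (assuming NumPy) the L² inner product on [−π, π] is approximated by a trapezoidal sum, and the test signal is an illustrative choice.

```python
import numpy as np

x = np.linspace(-np.pi, np.pi, 20001)
inner = lambda f, g: np.trapz(f * g, x)          # ⟨f, g⟩ = ∫ f(x) g(x) dx, approximately

f = np.sin(x) + 0.5 * np.sin(3 * x)              # test signal

for k in (1, 2, 3):
    e_k = np.sin(k * x) / np.sqrt(np.pi)         # orthonormal: ⟨e_k, e_k⟩ = 1
    print(k, inner(f, e_k))                      # ⟨f, e_k⟩: ≈ √π, 0, 0.5·√π
```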
Common Pitfalls and Misconceptions
Not all bilinear forms are inner products
Inner products must satisfy additional properties (conjugate symmetry, positive definiteness)
Orthogonality does not imply orthonormality
Orthonormal vectors are orthogonal and have unit norm
Orthogonal projection onto a subspace is not the same as the vector projection (or scalar projection) onto a single vector
Vector projection computes the component of a vector along another vector
Orthonormal bases are not unique
Any orthonormal basis can be transformed into another by an orthogonal transformation (rotation or reflection)
Gram-Schmidt process is not the only way to construct orthonormal bases
Other methods include Householder transformations and Givens rotations
Orthogonal matrices are not always symmetric
Orthogonal matrices satisfy AᵀA = AAᵀ = I, but need not be symmetric (Aᵀ ≠ A in general)
Orthogonal projections do not commute in general
In general, proj_U(proj_V(v)) ≠ proj_V(proj_U(v)); the compositions agree only in special cases, for example when U and V are orthogonal or when one subspace contains the other
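A small sketch, assuming NumPy, showing that projections onto two non-orthogonal lines in R² fail to commute, while projections onto orthogonal lines do.

```python
import numpy as np

def proj_onto_line(u):
    """Matrix of the orthogonal projection onto span{u}."""
    u = u / np.linalg.norm(u)
    return np.outer(u, u)

P = proj_onto_line(np.array([1.0, 0.0]))     # projection onto the x-axis
Q = proj_onto_line(np.array([1.0, 1.0]))     # projection onto the line y = x

print(np.allclose(P @ Q, Q @ P))             # False: these projections do not commute

R = proj_onto_line(np.array([0.0, 1.0]))     # projection onto the y-axis
print(np.allclose(P @ R, R @ P))             # True: orthogonal lines, both products are 0
```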