Abstract Linear Algebra II Unit 4 – Inner Product Spaces

Inner product spaces blend vector spaces with inner products, generalizing the dot product. They introduce concepts like norms, angles, and orthogonality, providing a rich framework for geometric intuition in abstract spaces. These spaces are crucial in linear algebra, quantum mechanics, and functional analysis. They enable powerful techniques such as orthogonal projection, the Gram-Schmidt process, and best approximation, forming the foundation for advanced mathematical and physical theories.

Key Concepts and Definitions

  • Inner product spaces combine vector spaces with an inner product, a generalization of the dot product
  • Inner products are denoted $\langle u, v \rangle$ for vectors $u$ and $v$
    • Must satisfy conjugate symmetry, linearity in the second argument, and positive definiteness
  • Norm of a vector $v$ is defined as $\|v\| = \sqrt{\langle v, v \rangle}$, generalizing the concept of length
  • Cauchy-Schwarz inequality states that $|\langle u, v \rangle| \leq \|u\| \|v\|$ for any vectors $u$ and $v$
    • Equality holds if and only if $u$ and $v$ are linearly dependent
  • Angle between two vectors $u$ and $v$ in a real inner product space can be defined using the inner product as $\cos \theta = \frac{\langle u, v \rangle}{\|u\| \|v\|}$
  • Orthogonality: two vectors $u$ and $v$ are orthogonal if $\langle u, v \rangle = 0$
  • Orthonormal basis is a basis consisting of pairwise orthogonal unit vectors (vectors with norm 1); see the sketch after this list for these definitions in code
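
A minimal NumPy sketch of these definitions (the helper names `inner` and `norm` are illustrative, not from the text); it uses the convention $\langle u, v \rangle = \sum_i \overline{u_i} v_i$, linear in the second argument:

```python
import numpy as np

# Hermitian inner product on C^n, linear in the second argument
# (np.vdot conjugates its first argument, matching <u, v> = sum conj(u_i) v_i)
def inner(u, v):
    return np.vdot(u, v)

def norm(v):
    return np.sqrt(inner(v, v).real)  # <v, v> is real and non-negative

u = np.array([1.0, 2.0, 2.0])
v = np.array([3.0, 0.0, 4.0])

# Cauchy-Schwarz: |<u, v>| <= ||u|| ||v||
assert abs(inner(u, v)) <= norm(u) * norm(v) + 1e-12

# Angle between real vectors: cos(theta) = <u, v> / (||u|| ||v||)
theta = np.arccos(inner(u, v).real / (norm(u) * norm(v)))
print(f"angle = {np.degrees(theta):.2f} degrees")

# Orthogonality: u and w are orthogonal since <u, w> = 0
w = np.array([2.0, -1.0, 0.0])
print("orthogonal:", np.isclose(inner(u, w), 0.0))
```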

Properties of Inner Product Spaces

  • Inner product spaces are normed vector spaces, with the norm induced by the inner product
  • Parallelogram law holds in inner product spaces: $\|u + v\|^2 + \|u - v\|^2 = 2(\|u\|^2 + \|v\|^2)$ for any vectors $u$ and $v$
  • Polarization identity allows the inner product to be recovered from the norm; in a real inner product space, $\langle u, v \rangle = \frac{1}{4}(\|u + v\|^2 - \|u - v\|^2)$ (the complex case adds two more terms for the imaginary part)
  • Orthogonal complements: for any subspace $U$ of a finite-dimensional inner product space $V$ (or any closed subspace of a Hilbert space), $V = U \oplus U^\perp$
    • $U^\perp = \{v \in V : \langle u, v \rangle = 0 \text{ for all } u \in U\}$
  • Pythagorean theorem generalizes to inner product spaces: if $u_1, \ldots, u_n$ are pairwise orthogonal, then $\left\|\sum_{i=1}^n u_i\right\|^2 = \sum_{i=1}^n \|u_i\|^2$
  • Bessel's inequality states that for any orthonormal set $\{e_1, \ldots, e_n\}$ and vector $v$, $\sum_{i=1}^n |\langle v, e_i \rangle|^2 \leq \|v\|^2$
    • Equality holds if and only if $v$ lies in the span of $\{e_1, \ldots, e_n\}$; it holds for every $v$ exactly when the set is an orthonormal basis (these identities are checked numerically in the sketch after this list)
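
A short numerical check of the parallelogram law, the real polarization identity, and Bessel's inequality (a sketch with hand-picked vectors; all names are illustrative):

```python
import numpy as np

nrm = np.linalg.norm
u = np.array([1.0, 2.0, 0.0, -1.0])
v = np.array([0.5, -1.0, 3.0, 2.0])

# Parallelogram law: ||u+v||^2 + ||u-v||^2 = 2(||u||^2 + ||v||^2)
assert np.isclose(nrm(u + v)**2 + nrm(u - v)**2,
                  2 * (nrm(u)**2 + nrm(v)**2))

# Polarization identity (real case): <u, v> = (||u+v||^2 - ||u-v||^2) / 4
assert np.isclose(u @ v, (nrm(u + v)**2 - nrm(u - v)**2) / 4)

# Bessel's inequality with a 2-element orthonormal set in R^4
e1 = np.array([1.0, 0.0, 0.0, 0.0])
e2 = np.array([0.0, 1.0, 0.0, 0.0])
assert (v @ e1)**2 + (v @ e2)**2 <= nrm(v)**2 + 1e-12
print("all identities verified")
```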

Examples and Non-Examples

  • Euclidean spaces $\mathbb{R}^n$ with the standard dot product are inner product spaces
    • For $u = (u_1, \ldots, u_n)$ and $v = (v_1, \ldots, v_n)$, $\langle u, v \rangle = \sum_{i=1}^n u_i v_i$
  • Complex vector spaces $\mathbb{C}^n$ with the Hermitian inner product are inner product spaces
    • For $u = (u_1, \ldots, u_n)$ and $v = (v_1, \ldots, v_n)$, $\langle u, v \rangle = \sum_{i=1}^n \overline{u_i} v_i$
  • Space of real-valued continuous functions $C[a, b]$ with the inner product $\langle f, g \rangle = \int_a^b f(x) g(x)\, dx$ is an inner product space
  • Space of square-integrable functions $L^2[a, b]$ with the inner product $\langle f, g \rangle = \int_a^b \overline{f(x)} g(x)\, dx$ is an inner product space
  • Not all normed vector spaces are inner product spaces
    • $\ell^p$ spaces with $p \neq 2$ are normed vector spaces whose norms violate the parallelogram law, so no inner product induces them
  • Not all bilinear forms on vector spaces are inner products
    • The bilinear form $B(u, v) = u_1 v_1 - u_2 v_2$ on $\mathbb{R}^2$ is not an inner product: it fails positive definiteness, since $B(u, u) = -1$ for $u = (0, 1)$ (both non-examples are demonstrated in the sketch below)
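
A sketch demonstrating the two non-examples numerically (illustrative code; the $\ell^1$ norm on $\mathbb{R}^2$ stands in for the general $\ell^p$, $p \neq 2$, case):

```python
import numpy as np

# Non-example 1: the l^1 norm violates the parallelogram law,
# so it cannot be induced by any inner product.
def norm1(x):
    return np.abs(x).sum()

u, v = np.array([1.0, 0.0]), np.array([0.0, 1.0])
lhs = norm1(u + v)**2 + norm1(u - v)**2   # 4 + 4 = 8
rhs = 2 * (norm1(u)**2 + norm1(v)**2)     # 2 * (1 + 1) = 4
print("parallelogram law holds for l^1:", np.isclose(lhs, rhs))  # False

# Non-example 2: B(u, v) = u1*v1 - u2*v2 fails positive definiteness.
def B(u, v):
    return u[0] * v[0] - u[1] * v[1]

w = np.array([0.0, 1.0])
print("B(w, w) =", B(w, w))  # -1.0 < 0, so B is not an inner product
```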

Orthogonality and Orthonormal Bases

  • Orthogonality is a generalization of perpendicularity to inner product spaces
    • Two vectors $u$ and $v$ are orthogonal if $\langle u, v \rangle = 0$
  • Orthogonal sets of nonzero vectors are linearly independent
    • If $\{u_1, \ldots, u_n\}$ is an orthogonal set of nonzero vectors, then it is linearly independent
  • Orthonormal sets are orthogonal sets consisting of unit vectors (vectors with norm 1)
  • Orthonormal bases are bases consisting of orthonormal vectors
    • Provide a convenient way to represent vectors and compute inner products
  • Parseval's identity: if $\{e_1, \ldots, e_n\}$ is an orthonormal basis, then $v = \sum_{i=1}^n \langle e_i, v \rangle e_i$ and $\|v\|^2 = \sum_{i=1}^n |\langle e_i, v \rangle|^2$
  • Orthogonal projection of a vector $v$ onto a subspace $U$ is the closest point in $U$ to $v$
    • Can be computed using an orthonormal basis $\{e_1, \ldots, e_n\}$ of $U$ as $\text{proj}_U(v) = \sum_{i=1}^n \langle e_i, v \rangle e_i$ (see the sketch after this list)
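
A sketch of projection onto a plane in $\mathbb{R}^3$ using a hand-built orthonormal basis (vectors chosen for illustration):

```python
import numpy as np

# Orthonormal basis of a plane U in R^3
e1 = np.array([1.0, 1.0, 0.0]) / np.sqrt(2)
e2 = np.array([0.0, 0.0, 1.0])
v = np.array([3.0, 1.0, 2.0])

# proj_U(v) = <e1, v> e1 + <e2, v> e2
proj = (e1 @ v) * e1 + (e2 @ v) * e2
print("proj_U(v) =", proj)  # [2. 2. 2.]

# The residual v - proj_U(v) is orthogonal to U
r = v - proj
print("residual orthogonal to U:",
      np.isclose(r @ e1, 0.0) and np.isclose(r @ e2, 0.0))

# Parseval's identity with the standard basis of R^3:
# ||v||^2 equals the sum of squared coefficients
assert np.isclose(np.linalg.norm(v)**2, np.sum(v**2))
```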

Gram-Schmidt Process

  • Gram-Schmidt process is an algorithm for constructing an orthonormal basis from a linearly independent set
  • Given a linearly independent set $\{v_1, \ldots, v_n\}$, the Gram-Schmidt process produces an orthonormal set $\{e_1, \ldots, e_n\}$ spanning the same subspace
  • Algorithm:
    1. Set $e_1 = \frac{v_1}{\|v_1\|}$
    2. For $i = 2, \ldots, n$:
      • Set $u_i = v_i - \sum_{j=1}^{i-1} \langle e_j, v_i \rangle e_j$
      • Set $e_i = \frac{u_i}{\|u_i\|}$
  • Resulting set $\{e_1, \ldots, e_n\}$ is an orthonormal basis for the span of $\{v_1, \ldots, v_n\}$
  • Classical Gram-Schmidt is numerically unstable: rounding errors gradually destroy orthogonality
    • Modified Gram-Schmidt and Householder transformations are more stable alternatives (both Gram-Schmidt variants are sketched below)
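
A sketch of the algorithm in both classical and modified form (the function name and flag are illustrative; in practice, np.linalg.qr uses Householder reflections):

```python
import numpy as np

def gram_schmidt(V, modified=True):
    """Orthonormalize the columns of V, returning Q with orthonormal columns.

    modified=True subtracts each projection from the already-updated vector
    (modified Gram-Schmidt), which is more stable in floating point than
    the classical variant.
    """
    V = np.asarray(V, dtype=float)
    Q = np.zeros_like(V)
    for i in range(V.shape[1]):
        u = V[:, i].copy()
        for j in range(i):
            if modified:
                u = u - (Q[:, j] @ u) * Q[:, j]        # project out updated u
            else:
                u = u - (Q[:, j] @ V[:, i]) * Q[:, j]  # classical: original v_i
        Q[:, i] = u / np.linalg.norm(u)
    return Q

V = np.array([[1.0, 1.0],
              [1.0, 0.0],
              [0.0, 1.0]])
Q = gram_schmidt(V)
print(np.round(Q.T @ Q, 12))  # identity matrix: columns are orthonormal
```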

Projections and Best Approximations

  • Orthogonal projection of a vector $v$ onto a subspace $U$ is the closest point in $U$ to $v$
    • Denoted $\text{proj}_U(v)$ or $P_U(v)$
  • Projection theorem states that for any vector $v$ and finite-dimensional subspace $U$, $v = \text{proj}_U(v) + \text{proj}_{U^\perp}(v)$
    • The decomposition is unique, and the two components are orthogonal
  • Best approximation problem: given a subspace $U$ and a vector $v$, find the vector $u \in U$ that minimizes $\|v - u\|$
    • Solution is the orthogonal projection $\text{proj}_U(v)$
  • Least squares approximation: given a set of data points $(x_i, y_i)$ and a subspace $U$ of functions, find the function $f \in U$ that minimizes $\sum_{i=1}^n (y_i - f(x_i))^2$
    • Solution is the orthogonal projection of the data vector $y = (y_1, \ldots, y_n)$ onto the subspace $\{(f(x_1), \ldots, f(x_n)) : f \in U\}$ (see the least squares sketch after this list)
  • Projections onto orthogonal complements: for any subspace $U$ and vector $v$, $\text{proj}_{U^\perp}(v) = v - \text{proj}_U(v)$
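
A sketch of least squares line fitting as an orthogonal projection (the data values are made up for illustration):

```python
import numpy as np

# Fit f(x) = c0 + c1*x to data: project the data vector y onto the
# column space of the design matrix A (the subspace of achievable values).
x = np.array([0.0, 1.0, 2.0, 3.0])
y = np.array([1.0, 2.4, 2.9, 4.1])
A = np.column_stack([np.ones_like(x), x])

# lstsq solves min ||y - A c|| stably
c, *_ = np.linalg.lstsq(A, y, rcond=None)
y_hat = A @ c  # = proj_U(y), the best approximation in the subspace

# The residual y - y_hat is orthogonal to the subspace
print("residual orthogonal:", np.allclose(A.T @ (y - y_hat), 0.0))
print("intercept, slope:", c)
```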

Applications in Linear Algebra

  • Orthogonal diagonalization: real symmetric matrices can be diagonalized by an orthonormal basis of eigenvectors
    • Eigenvalues are real, and eigenvectors corresponding to distinct eigenvalues are orthogonal
  • Singular value decomposition (SVD): any matrix $A$ can be written as $A = U \Sigma V^*$, where $U$ and $V$ are unitary (orthogonal in the real case) and $\Sigma$ is a diagonal matrix with non-negative entries
    • Columns of $U$ and $V$ are orthonormal bases of left and right singular vectors of $A$
  • Principal component analysis (PCA): technique for dimensionality reduction and data compression using orthogonal projections
    • Finds orthogonal directions (principal components) that maximize the variance of the projected data
  • Quantum mechanics: state vectors are elements of a complex inner product space (a Hilbert space)
    • Observables are represented by Hermitian operators, whose eigenvectors form an orthonormal basis
  • Fourier analysis: functions can be represented as linear combinations of orthogonal basis functions (e.g., trigonometric functions, wavelets)
    • Coefficients are computed using inner products (Fourier coefficients); the first two decompositions above are sketched below
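
A sketch of orthogonal diagonalization and the SVD with NumPy (example matrices chosen for illustration):

```python
import numpy as np

# Orthogonal diagonalization of a real symmetric matrix: A = Q diag(w) Q^T
A = np.array([[2.0, 1.0],
              [1.0, 2.0]])
w, Q = np.linalg.eigh(A)                     # eigh is for symmetric/Hermitian
assert np.allclose(Q.T @ Q, np.eye(2))       # eigenvectors are orthonormal
assert np.allclose(Q @ np.diag(w) @ Q.T, A)  # A is reconstructed exactly
print("eigenvalues:", w)                     # real: [1. 3.]

# SVD of an arbitrary (here real) matrix: B = U diag(s) V^T
B = np.array([[1.0, 0.0, 2.0],
              [0.0, 3.0, 0.0]])
U, s, Vt = np.linalg.svd(B, full_matrices=False)
assert np.allclose(U @ np.diag(s) @ Vt, B)
print("singular values:", s)                 # non-negative, descending
```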

Common Pitfalls and Misconceptions

  • Not all bilinear forms are inner products
    • Inner products must also satisfy conjugate symmetry and positive definiteness
  • Orthogonality does not imply orthonormality
    • Orthonormal vectors are orthogonal and have unit norm
  • Orthogonal projection onto a subspace generalizes, but is not the same as, projection onto a single vector
    • The vector projection $\frac{\langle w, v \rangle}{\|w\|^2} w$ gives the component of $v$ along one vector $w$; the scalar projection is its signed length
  • Orthonormal bases are not unique
    • Any orthonormal basis can be transformed into another by a unitary transformation (in the real case, an orthogonal one, i.e. a rotation or reflection)
  • Gram-Schmidt process is not the only way to construct orthonormal bases
    • Other methods include Householder transformations and Givens rotations
  • Orthogonal matrices are not always symmetric
    • Orthogonal matrices satisfy $A^T A = A A^T = I$, but in general $A^T \neq A$ (e.g., a rotation matrix)
  • Orthogonal projections do not commute in general
    • $\text{proj}_U(\text{proj}_V(v)) \neq \text{proj}_V(\text{proj}_U(v))$ unless the subspaces are compatible, e.g. $U \perp V$ or one contains the other (the sketch below shows both a non-commuting and a commuting pair)
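
A sketch verifying the last pitfall with explicit projection matrices (subspaces chosen for illustration):

```python
import numpy as np

def projector(Q):
    """Orthogonal projection matrix onto the column space of Q,
    assuming Q has orthonormal columns."""
    return Q @ Q.T

# U = span{(1,0,0)}, V = span{(1,1,0)/sqrt(2)}: neither orthogonal nor nested
P_U = projector(np.array([[1.0], [0.0], [0.0]]))
P_V = projector(np.array([[1.0], [1.0], [0.0]]) / np.sqrt(2))
print("P_U and P_V commute:", np.allclose(P_U @ P_V, P_V @ P_U))  # False

# Orthogonal subspaces do commute: W = span{(0,0,1)}, P_U P_W = P_W P_U = 0
P_W = projector(np.array([[0.0], [0.0], [1.0]]))
print("P_U and P_W commute:", np.allclose(P_U @ P_W, P_W @ P_U))  # True
```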


© 2024 Fiveable Inc. All rights reserved.
AP® and SAT® are trademarks registered by the College Board, which is not affiliated with, and does not endorse this website.
