Gram-Schmidt Orthogonalization
A method for orthonormalizing a set of vectors in an inner product space.
This public page keeps the free explanation visible and leaves premium worked solving, advanced walkthroughs, and saved study tools inside the app.
Core idea
Overview
The Gram-Schmidt process is a systematic method for generating an orthogonal or orthonormal basis from a set of linearly independent vectors in an inner product space. It works by iteratively subtracting the projections of a vector onto the previously constructed orthogonal vectors to ensure the new vector is perpendicular to all predecessors.
When to use: Apply this algorithm when you need to construct an orthogonal basis for a subspace, which is essential for simplifying vector projections and performing QR decompositions. It assumes that the input set of vectors is linearly independent and that an inner product (like the dot product) is defined.
Why it matters: Orthogonal bases are computationally efficient because they eliminate cross-term interactions in matrix operations. This process is vital in computer graphics for coordinate transformations, in signal processing for noise reduction, and in numerical analysis to improve the stability of least-squares solutions.
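As a concrete sketch, the whole process fits in a few lines of Python with NumPy; the function name and example vectors here are illustrative, not part of any standard library:

```python
import numpy as np

def gram_schmidt(vectors):
    """Orthogonalize a list of linearly independent 1-D NumPy arrays."""
    ortho = []
    for v in vectors:
        u = v.astype(float).copy()
        for q in ortho:
            # Subtract the projection of v onto each previously built u_j.
            u -= (v @ q) / (q @ q) * q
        ortho.append(u)
    return ortho

u1, u2 = gram_schmidt([np.array([3.0, 1.0]), np.array([2.0, 2.0])])
print(u1 @ u2)  # approximately 0: the outputs are mutually orthogonal
```

This is the classical form of the algorithm; numerical libraries usually prefer the modified variant or Householder reflections for better floating-point stability.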
Symbols
Variables
u_k = Resulting Orthogonal Vector, v_k = Input Vector, \sum_{j=1}^{k-1} \text{proj}_{u_j}(v_k) = Sum of Projections
Walkthrough
Derivation
Derivation/Understanding of Gram-Schmidt Orthogonalization
This derivation explains how to construct an orthogonal set of vectors from a given linearly independent set by successively subtracting projections.
- We are working in an inner product space (e.g., Euclidean space \mathbb{R}^n with the dot product).
- The initial set of vectors \{v_1, v_2, \dots, v_n\} is linearly independent.
Initialize the first orthogonal vector:
To begin constructing an orthogonal set \{u_1, u_2, \dots, u_n\} from a given linearly independent set \{v_1, v_2, \dots, v_n\}, we simply choose the first vector to be equal to the first input: u_1 = v_1.
Orthogonalize the second vector:
To ensure u_2 is orthogonal to u_1, we take v_2 and subtract its component that lies in the direction of u_1. This component is precisely the projection of v_2 onto u_1: u_2 = v_2 - \text{proj}_{u_1}(v_2) = v_2 - \frac{\langle v_2, u_1 \rangle}{\langle u_1, u_1 \rangle} u_1.
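For intuition, here is a small worked instance of this step (the numbers are chosen purely for illustration):

```latex
% Take v_1 = (1, 1) and v_2 = (1, 0); then u_1 = v_1 and
u_2 = v_2 - \frac{\langle v_2, u_1 \rangle}{\langle u_1, u_1 \rangle}\, u_1
    = (1, 0) - \frac{1}{2}(1, 1)
    = \left(\tfrac{1}{2},\, -\tfrac{1}{2}\right),
\qquad
\langle u_1, u_2 \rangle = \tfrac{1}{2} - \tfrac{1}{2} = 0.
```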
Generalize to the k-th vector:
Assuming we have already constructed an orthogonal set \{u_1, \dots, u_{k-1}\}, to find u_k we start with v_k and subtract its projection onto each of the previously orthogonalized vectors u_1, \dots, u_{k-1}. This process removes all components of v_k that lie in the span of \{u_1, \dots, u_{k-1}\}.
Express using summation notation:
The sum of projections can be written compactly using summation notation: u_k = v_k - \sum_{j=1}^{k-1} \text{proj}_{u_j}(v_k) = v_k - \sum_{j=1}^{k-1} \frac{\langle v_k, u_j \rangle}{\langle u_j, u_j \rangle} u_j. This formula defines u_k such that it is orthogonal to all u_j for j < k, thus building an orthogonal set iteratively.
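The summation formula translates almost line for line into code. As a sketch, here it is applied to three vectors in \mathbb{R}^3 (the example vectors are arbitrary):

```python
import numpy as np

vs = [np.array([1.0, 1.0, 0.0]),
      np.array([1.0, 0.0, 1.0]),
      np.array([0.0, 1.0, 1.0])]

us = []
for v in vs:
    # u_k = v_k - sum over j < k of <v_k, u_j> / <u_j, u_j> * u_j
    u = v - sum(((v @ q) / (q @ q)) * q for q in us)
    us.append(u)

# Every later u_k is orthogonal to every earlier u_j: each dot product is (near-)zero.
print([float(us[i] @ us[j]) for i in range(3) for j in range(i)])
```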
Result
u_k = v_k - \sum_{j=1}^{k-1} \frac{\langle v_k, u_j \rangle}{\langle u_j, u_j \rangle} u_j, \quad k = 1, \dots, n
Source: Lay, D. C., Lay, S. R., & McDonald, J. J. (2016). Linear Algebra and Its Applications (5th ed.). Pearson.
Visual intuition
Graph
Graph unavailable for this formula.
The graph typically displays a series of vectors in a 3D coordinate system, where an original set of linearly independent vectors is transformed into a set of mutually orthogonal vectors. Geometrically, each new vector is projected onto the previously constructed ones, and the overlapping components are subtracted away, leaving only the perpendicular part. The resulting basis is orthogonal but spans exactly the same subspace as the original set.
Graph type: linear
Why it behaves this way
Intuition
Visualize taking each new vector, projecting it onto all previously orthogonalized vectors, and then subtracting these projections to isolate the part of the new vector that is perfectly perpendicular to all the others.
Signs and relationships
- v_k - \sum_{j=1}^{k-1} \text{proj}_{u_j}(v_k): The subtraction removes the components of v_k that are parallel to the previously constructed orthogonal vectors u_1, \dots, u_{k-1}, ensuring that the resulting u_k is perpendicular to all of them.
Free study cues
Insight
Canonical usage
The Gram-Schmidt process operates on vectors, preserving their units. If input vectors represent physical quantities with units (e.g., meters, Newtons), the resulting orthogonal vectors will have those same units.
Common confusion
A common mistake is assuming the Gram-Schmidt process changes or removes the physical units of the vectors. It operates on the vector components, preserving their inherent dimensions.
One free problem
Practice Problem
In a linear algebra exercise, a student is processing the second vector in a set. If the input-vector component v_k has the value 12 and the sum of its projections onto the first orthogonal vector (projSum) is calculated as 4.5, find the corresponding component of the resulting orthogonal vector, result.
Solve for: result
Hint: Subtract the sum of the projections from the original vector component.
The full worked solution stays in the interactive walkthrough.
Where it shows up
Real-World Context
In QR decomposition for solving linear least-squares problems, and in signal processing for removing correlation between channels, Gram-Schmidt orthogonalization is used to compute the resulting orthogonal vector from the input vector and its sum of projections. The result matters because an orthogonal basis makes projections, coordinates, and least-squares solutions simple and numerically stable to compute.
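The QR connection can be sketched directly: normalizing each Gram-Schmidt output gives the columns of Q, and R follows from Q^T A. The function below is an illustrative sketch, not a production routine (libraries typically use Householder reflections instead):

```python
import numpy as np

def qr_gram_schmidt(A):
    """Thin QR factorization of a matrix with independent columns,
    via classical Gram-Schmidt on the columns."""
    m, n = A.shape
    Q = np.zeros((m, n))
    for k in range(n):
        u = A[:, k].copy()
        for j in range(k):
            # Q's columns are unit length, so the projection coefficient
            # is just the dot product.
            u -= (A[:, k] @ Q[:, j]) * Q[:, j]
        Q[:, k] = u / np.linalg.norm(u)
    R = Q.T @ A  # upper triangular by construction
    return Q, R

A = np.array([[3.0, 2.0],
              [1.0, 2.0]])
Q, R = qr_gram_schmidt(A)
print(np.allclose(Q @ R, A), np.allclose(Q.T @ Q, np.eye(2)))  # True True
```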
Study smarter
Tips
- Always verify orthogonality at each step by checking if the dot product of the new vector and any previous vector is zero.
- Normalize each resulting vector immediately if an orthonormal basis is required.
- Process vectors in their original order to maintain the nested hierarchy of spanned subspaces.
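The first tip is easy to automate. A small checker (an illustrative helper, not from any library) that confirms every pair of results has a near-zero dot product:

```python
import numpy as np

def check_orthogonal(vectors, tol=1e-10):
    """Return True if every pair of vectors has a (near-)zero dot product."""
    return all(abs(vectors[i] @ vectors[j]) < tol
               for i in range(len(vectors)) for j in range(i))

# Outputs of Gram-Schmidt on (1,1,0), (1,0,1), (0,1,1):
u1 = np.array([1.0, 1.0, 0.0])
u2 = np.array([0.5, -0.5, 1.0])
u3 = np.array([-2/3, 2/3, 2/3])
print(check_orthogonal([u1, u2, u3]))  # True
```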
Avoid these traps
Common Mistakes
- Using the original vectors instead of the newly found orthogonal vectors for subsequent projections.
- Calculation errors in the dot products used for scalar projections.
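The first mistake is worth seeing concretely: projecting onto an original v_j instead of the orthogonalized u_j leaves a leftover parallel component, so the result is no longer orthogonal. A small sketch with arbitrary example vectors:

```python
import numpy as np

v1 = np.array([1.0, 1.0, 0.0])
v2 = np.array([1.0, 0.0, 1.0])
v3 = np.array([0.0, 1.0, 1.0])

u1 = v1
u2 = v2 - (v2 @ u1) / (u1 @ u1) * u1

# Correct: project v3 onto the orthogonalized u1 and u2.
u3 = v3 - (v3 @ u1) / (u1 @ u1) * u1 - (v3 @ u2) / (u2 @ u2) * u2
# Mistake: projecting onto the original v2 instead of u2.
bad = v3 - (v3 @ u1) / (u1 @ u1) * u1 - (v3 @ v2) / (v2 @ v2) * v2

print(abs(u3 @ u2) < 1e-12, abs(bad @ u2) < 1e-12)  # True False
```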
Common questions
Frequently Asked Questions
What does the Gram-Schmidt process do?
It constructs an orthogonal set of vectors from a given linearly independent set by successively subtracting projections.
When should I apply it?
Apply this algorithm when you need to construct an orthogonal basis for a subspace, which is essential for simplifying vector projections and performing QR decompositions. It assumes that the input set of vectors is linearly independent and that an inner product (like the dot product) is defined.
Why does it matter?
Orthogonal bases are computationally efficient because they eliminate cross-term interactions in matrix operations. This process is vital in computer graphics for coordinate transformations, in signal processing for noise reduction, and in numerical analysis to improve the stability of least-squares solutions.
What are the most common mistakes?
Using the original vectors instead of the newly found orthogonal vectors for subsequent projections, and making calculation errors in the dot products used for scalar projections.
Where does it show up in practice?
In QR decomposition for solving linear least-squares problems, and in signal processing for removing correlation between channels, where an orthogonal basis makes projections and least-squares solutions simple and numerically stable to compute.
How should I study it?
Verify orthogonality at each step by checking that the dot product of the new vector with any previous vector is zero. Normalize each resulting vector immediately if an orthonormal basis is required. Process vectors in their original order to maintain the nested hierarchy of spanned subspaces.
References
Sources
- Lay, D. C., Lay, S. R., & McDonald, J. J. (2016). Linear Algebra and Its Applications (5th ed.). Pearson.
- Strang, G. (2016). Introduction to Linear Algebra (5th ed.). Wellesley-Cambridge Press.
- Trefethen, L. N., & Bau, D., III. (1997). Numerical Linear Algebra. SIAM.
- Wikipedia: Gram-Schmidt process