
Gram-Schmidt Orthogonalization

A method for orthonormalizing a set of vectors in an inner product space.


This public page keeps the free explanation visible; premium worked solutions, advanced walkthroughs, and saved study tools live inside the app.

Core idea

Overview

The Gram-Schmidt process is a systematic method for generating an orthogonal or orthonormal basis from a set of linearly independent vectors in an inner product space. It works by iteratively subtracting the projections of a vector onto the previously constructed orthogonal vectors to ensure the new vector is perpendicular to all predecessors.

When to use: Apply this algorithm when you need to construct an orthogonal basis for a subspace, which is essential for simplifying vector projections and performing QR decompositions. It assumes that the input set of vectors is linearly independent and that an inner product (like the dot product) is defined.

Why it matters: Orthogonal bases are computationally efficient because they eliminate cross-term interactions in matrix operations. This process is vital in computer graphics for coordinate transformations, in signal processing for noise reduction, and in numerical analysis to improve the stability of least-squares solutions.
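
To make the iteration concrete, here is a minimal sketch in Python (the function name `gram_schmidt` and the NumPy-based implementation are illustrative choices, not an API from any particular source):

```python
import numpy as np

def gram_schmidt(vectors):
    """Orthogonalize a list of linearly independent 1-D arrays.

    Each output u_k is v_k minus its projections onto all previously
    constructed u_j, so the results are mutually orthogonal.
    """
    basis = []
    for v in vectors:
        v = np.asarray(v, dtype=float)
        # Subtract the projection of v onto every vector built so far.
        u = v - sum(((v @ b) / (b @ b)) * b for b in basis)
        basis.append(u)
    return basis

# Example: u2 comes out as (-0.4, 1.2), orthogonal to u1 = (3, 1).
u1, u2 = gram_schmidt([[3.0, 1.0], [2.0, 2.0]])
print(u1, u2, u1 @ u2)  # the dot product is 0
```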

Symbols

Variables

  • $u_k$: Resulting Orthogonal Magnitude (variable)
  • $v_k$: Input Vector Magnitude (variable)
  • $\sum_{j=1}^{k-1} \text{proj}_{u_j}(v_k)$: Sum of Projections (variable)
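
For reference, these symbols combine in the update rule derived in the walkthrough below:

$$u_k = v_k - \sum_{j=1}^{k-1} \text{proj}_{u_j}(v_k)$$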

Walkthrough

Derivation

Derivation of Gram-Schmidt Orthogonalization

This derivation explains how to construct an orthogonal set of vectors from a given linearly independent set by successively subtracting projections.

  • We are working in an inner product space (e.g., Euclidean space $\mathbb{R}^n$ with the dot product).
  • The initial set of vectors $\{v_1, v_2, \dots, v_n\}$ is linearly independent.
Step 1: Initialize the first orthogonal vector.

To begin constructing an orthogonal set $\{u_1, u_2, \dots, u_n\}$ from a given linearly independent set $\{v_1, v_2, \dots, v_n\}$, we simply choose the first vector to be equal to $v_1$:

$$u_1 = v_1$$

Step 2: Orthogonalize the second vector.

To ensure $u_2$ is orthogonal to $u_1$, we take $v_2$ and subtract its component that lies in the direction of $u_1$. This component is precisely the projection of $v_2$ onto $u_1$:

$$u_2 = v_2 - \text{proj}_{u_1}(v_2) = v_2 - \frac{\langle v_2, u_1 \rangle}{\langle u_1, u_1 \rangle}\, u_1$$

Step 3: Generalize to the k-th vector.

Assuming we have already constructed an orthogonal set $\{u_1, \dots, u_{k-1}\}$, to find $u_k$ we start with $v_k$ and subtract its projection onto each of the previously orthogonalized vectors $u_j$. This removes all components of $v_k$ that lie in the span of $\{u_1, \dots, u_{k-1}\}$:

$$u_k = v_k - \text{proj}_{u_1}(v_k) - \text{proj}_{u_2}(v_k) - \dots - \text{proj}_{u_{k-1}}(v_k)$$

Step 4: Express using summation notation.

The sum of projections can be written compactly using summation notation, where each projection is $\text{proj}_{u_j}(v_k) = \frac{\langle v_k, u_j \rangle}{\langle u_j, u_j \rangle}\, u_j$:

$$u_k = v_k - \sum_{j=1}^{k-1} \text{proj}_{u_j}(v_k)$$

This formula defines $u_k$ so that it is orthogonal to all $u_j$ with $j < k$, thus building an orthogonal set iteratively.

Result

$$u_k = v_k - \sum_{j=1}^{k-1} \frac{\langle v_k, u_j \rangle}{\langle u_j, u_j \rangle}\, u_j, \qquad k = 1, 2, \dots, n$$

Source: Lay, D. C., Lay, S. R., & McDonald, J. J. (2016). Linear Algebra and Its Applications (5th ed.). Pearson.
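
As a quick concrete check (vectors chosen purely for illustration), take $v_1 = (3, 1)$ and $v_2 = (2, 2)$ in $\mathbb{R}^2$ with the dot product:

$$u_1 = v_1 = (3, 1)$$

$$\text{proj}_{u_1}(v_2) = \frac{\langle v_2, u_1 \rangle}{\langle u_1, u_1 \rangle}\, u_1 = \frac{8}{10}(3, 1) = (2.4, 0.8)$$

$$u_2 = v_2 - (2.4, 0.8) = (-0.4, 1.2), \qquad \langle u_1, u_2 \rangle = 3(-0.4) + 1(1.2) = 0$$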

Visual intuition

Graph

Graph unavailable for this formula.

The graph typically displays a set of vectors in a 3D coordinate system, where an original set of linearly independent vectors is transformed into a set of mutually orthogonal vectors. Geometrically, each input vector is projected onto the previously constructed vectors and the overlapping components are removed, so the new basis spans the same subspace at every step.


Why it behaves this way

Intuition

Visualize taking each new vector, projecting it onto all previously orthogonalized vectors, and then subtracting these projections to isolate the part of the new vector that is perfectly perpendicular to all the others.

$u_k$: The k-th vector in the newly constructed orthogonal set.
This is the 'cleaned' version of $v_k$, made perpendicular to all previous vectors.
$v_k$: The k-th original input vector from the non-orthogonal set.
This is the vector currently being processed to make it orthogonal to the others.
$\text{proj}_{u_j}(v_k)$: The component of vector $v_k$ that lies along the direction of the previously constructed orthogonal vector $u_j$.
This is the 'overlap' or 'shadow' of $v_k$ onto $u_j$, representing the non-orthogonal part.
$\sum_{j=1}^{k-1} \text{proj}_{u_j}(v_k)$: The sum of all components of $v_k$ that are not orthogonal to the subspace spanned by $u_1, \dots, u_{k-1}$.
This represents the total 'non-orthogonal part' of $v_k$ with respect to the already orthogonalized vectors.

Signs and relationships

  • $-\sum_{j=1}^{k-1} \text{proj}_{u_j}(v_k)$: The subtraction removes the components of $v_k$ that are parallel to the previously constructed orthogonal vectors $u_j$, ensuring that the resulting $u_k$ is perpendicular to all of them.
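
To see why the subtraction works, take the inner product of $u_k$ with any earlier $u_i$ (for $i < k$) and use the fact that the previously built vectors are mutually orthogonal:

$$\langle u_k, u_i \rangle = \langle v_k, u_i \rangle - \sum_{j=1}^{k-1} \frac{\langle v_k, u_j \rangle}{\langle u_j, u_j \rangle} \langle u_j, u_i \rangle = \langle v_k, u_i \rangle - \frac{\langle v_k, u_i \rangle}{\langle u_i, u_i \rangle} \langle u_i, u_i \rangle = 0$$

Only the $j = i$ term survives the sum, and it cancels the first term exactly.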

Free study cues

Insight

Canonical usage

The Gram-Schmidt process operates on vectors, preserving their units. If input vectors represent physical quantities with units (e.g., meters, Newtons), the resulting orthogonal vectors will have those same units.

Common confusion

A common mistake is assuming the Gram-Schmidt process changes or removes the physical units of the vectors. It operates on the vector components, preserving their inherent dimensions.

Unit systems

Consistent with the physical quantity represented by the vector - all input vectors `v_k` must have consistent dimensions, and each output vector `u_k` inherits the dimensions of the `v_k` from which it is derived, ensuring dimensional consistency throughout the orthogonalization process. For example, if `v_k` represents a displacement in meters, then `u_k` will also be in meters.
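
A one-line dimensional check supports this (writing $[x]$ for the units of $x$): the projection coefficient carries units of $[v]/[u]$, so each projection term has the units of $v$:

$$\left[\frac{\langle v, u \rangle}{\langle u, u \rangle}\, u\right] = \frac{[v]\,[u]}{[u]^2}\,[u] = [v]$$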

One free problem

Practice Problem

In a linear algebra exercise, a student is processing the second vector in a set. If the input vector `vk` has a component value of 12 and the sum of its projections onto the first orthogonal vector (`projSum`) is calculated as 4.5, find the corresponding component of the resulting orthogonal vector `result`.

Input Vector Magnitude: 12
Sum of Projections: 4.5

Solve for: `result`

Hint: Subtract the sum of the projections from the original vector component.

The full worked solution stays in the interactive walkthrough.

Where it shows up

Real-World Context

In QR decomposition to solve linear least-squares problems, and in signal processing to remove correlation between channels, Gram-Schmidt Orthogonalization is used to calculate the Resulting Orthogonal Magnitude from the Input Vector Magnitude and the Sum of Projections. The result matters because an orthogonal basis keeps these computations numerically stable and lets each component be handled independently.
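
As an illustrative connection (a minimal sketch using NumPy's built-in QR routine, not a full application): the Q factor returned by `numpy.linalg.qr` has orthonormal columns, which is exactly what normalizing the Gram-Schmidt output produces, up to the signs of the columns.

```python
import numpy as np

# Columns of A are the input vectors v_1 = (3, 1) and v_2 = (2, 2).
A = np.array([[3.0, 2.0],
              [1.0, 2.0]])

Q, R = np.linalg.qr(A)           # Q: orthonormal columns, R: upper triangular

print(np.round(Q.T @ Q, 10))     # identity matrix, so the columns are orthonormal
print(np.allclose(Q @ R, A))     # True: the factorization reproduces A
```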

Study smarter

Tips

  • Always verify orthogonality at each step by checking if the dot product of the new vector and any previous vector is zero (a quick check is sketched after this list).
  • Normalize each resulting vector immediately if an orthonormal basis is required.
  • Process vectors in their original order to maintain the nested hierarchy of spanned subspaces.
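
A minimal version of that orthogonality check (the helper name and the tolerance `1e-10` are illustrative choices, not fixed conventions):

```python
import numpy as np

def is_orthogonal(basis, tol=1e-10):
    """Return True if every pair of distinct vectors has a near-zero dot product."""
    return all(abs(basis[i] @ basis[j]) < tol
               for i in range(len(basis))
               for j in range(i + 1, len(basis)))

print(is_orthogonal([np.array([3.0, 1.0]), np.array([-0.4, 1.2])]))  # True
```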

Avoid these traps

Common Mistakes

  • Using the original vectors instead of the newly found orthogonal vectors for subsequent projections.
  • Calculation errors in the dot products used for scalar projections.


References

Sources

  1. Lay, D. C., Lay, S. R., & McDonald, J. J. (2016). Linear Algebra and Its Applications (5th ed.). Pearson.
  2. Strang, G. (2016). Introduction to Linear Algebra (5th ed.). Wellesley-Cambridge Press.
  3. Trefethen, L. N., & Bau, D. (1997). Numerical Linear Algebra. SIAM.
  4. Wikipedia: Gram-Schmidt process.