Projection
Details

- For ordinary real vectors u and v, the projection is taken to be (u.v/v.v) v.
- For ordinary complex vectors u and v, the projection is taken to be (v*.u/v*.v) v, where v* is Conjugate[v]. »
- In Projection[u,v,f], u and v can be any expressions or lists of expressions for which the inner product function f applied to pairs yields real results. »
- Projection[u,v,Dot] effectively assumes that all elements of u and v are real. »
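The formulas above can be sketched outside the Wolfram Language; the following is a minimal NumPy illustration of the same mathematics, not the built-in implementation. np.vdot conjugates its first argument, so the real and complex cases share one definition:

```python
import numpy as np

def projection(u, v):
    """Project u onto v: (v*.u / v*.v) v, where v* is the conjugate of v.

    For real vectors this reduces to (u.v / v.v) v.
    """
    u, v = np.asarray(u), np.asarray(v)
    return (np.vdot(v, u) / np.vdot(v, v)) * v
```

For example, projection([6, 7], [1, 0]) gives [6, 0], and u - projection(u, v) is always orthogonal to v.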
Examples
Basic Examples (3): Summary of the most common use cases
Scope (9): Survey of the scope of standard use cases
Basic Uses (6)
Find the projection of a machine-precision vector onto another:

https://wolfram.com/xid/0b0ks2yfpj6-ezxf1l

Projection of a complex vector onto another:

https://wolfram.com/xid/0b0ks2yfpj6-oi5epe

Projection of an exact vector onto another:

https://wolfram.com/xid/0b0ks2yfpj6-06hnjy

Projection of an arbitrary-precision vector onto another:

https://wolfram.com/xid/0b0ks2yfpj6-w4x9y5

The projection of large numerical vectors is computed efficiently:

https://wolfram.com/xid/0b0ks2yfpj6-u2c5m3

https://wolfram.com/xid/0b0ks2yfpj6-7e66a1

https://wolfram.com/xid/0b0ks2yfpj6-hfho4g


https://wolfram.com/xid/0b0ks2yfpj6-8azc46

General Inner Products (3)
Specify Dot as the inner product to assume all expressions are real-valued:

https://wolfram.com/xid/0b0ks2yfpj6-kp98e0

Project vectors that are not lists using an explicit inner product:

https://wolfram.com/xid/0b0ks2yfpj6-kfubw3

https://wolfram.com/xid/0b0ks2yfpj6-mcnnv

Specify the inner product using a pure function:

https://wolfram.com/xid/0b0ks2yfpj6-paqjcx
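The same idea extends to Projection[u, v, f]: the scalar coefficient becomes f[v, u]/f[v, v]. A hedged Python sketch, where f is any user-supplied inner product function (the names here are illustrative, not part of any library):

```python
import numpy as np

def projection_with(u, v, f):
    """Project u onto v using the inner product f: (f(v, u) / f(v, v)) * v."""
    return (f(v, u) / f(v, v)) * np.asarray(v)

# With f = dot, all entries are treated as real, mirroring Projection[u, v, Dot]
dot = lambda a, b: np.dot(a, b)
```

Any bilinear (or sesquilinear) form yielding real norms can be passed as f, including pure functions.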

Applications (18): Sample problems that can be solved with this function
Geometry (4)
Project the vector on the line spanned by the vector
:

https://wolfram.com/xid/0b0ks2yfpj6-1rf22h

Visualize and its projection onto the line spanned by
:

https://wolfram.com/xid/0b0ks2yfpj6-bb5qc5

Project the vector on the plane spanned by the vectors
and
:

https://wolfram.com/xid/0b0ks2yfpj6-t5aqq0
First, replace with a vector in the plane perpendicular to
:

https://wolfram.com/xid/0b0ks2yfpj6-556932

The projection in the plane is the sum of the projections onto and
:

https://wolfram.com/xid/0b0ks2yfpj6-6sqj73

Find the component perpendicular to the plane:

https://wolfram.com/xid/0b0ks2yfpj6-vy5nk0

Confirm the result by projecting onto the normal to the plane:

https://wolfram.com/xid/0b0ks2yfpj6-usowgy

Visualize the plane, the vector and its parallel and perpendicular components:

https://wolfram.com/xid/0b0ks2yfpj6-l0ji89
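The steps above (replace one spanning vector by a perpendicular one, then sum the projections) can be sketched numerically. The plane and vector below are hypothetical stand-ins for the ones in the example:

```python
import numpy as np

def proj(u, v):
    return (np.dot(v, u) / np.dot(v, v)) * v

# Hypothetical data: project b onto the plane spanned by v1 and v2
b  = np.array([3.0, -2.0, 5.0])
v1 = np.array([1.0, 1.0, 0.0])
v2 = np.array([0.0, 1.0, 1.0])

v2p = v2 - proj(v2, v1)            # a vector in the plane perpendicular to v1
in_plane = proj(b, v1) + proj(b, v2p)   # projection of b into the plane
normal   = b - in_plane             # component perpendicular to the plane
```

The perpendicular component is orthogonal to both spanning vectors, and the two components sum back to b.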

Use Projection to reflect the vector with respect to the line normal to the vector
:

https://wolfram.com/xid/0b0ks2yfpj6-yj6et

Since is perpendicular to the line, subtracting twice
will reflect
across the line:

https://wolfram.com/xid/0b0ks2yfpj6-bq9xuc

Compare with the result of ReflectionTransform:

https://wolfram.com/xid/0b0ks2yfpj6-b0k14

Visualize and its reflection as a twice-repeated translation by
:

https://wolfram.com/xid/0b0ks2yfpj6-h7zekp
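Subtracting twice the projection, as above, gives a reflection. A small NumPy sketch with hypothetical vectors:

```python
import numpy as np

def proj(u, v):
    return (np.dot(v, u) / np.dot(v, v)) * v

def reflect(u, v):
    """Reflect u across the hyperplane through the origin with normal v."""
    return u - 2 * proj(u, v)
```

Reflections preserve length, and reflecting twice returns the original vector.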

The Frenet–Serret system encodes every space curve's properties in a vector basis and scalar functions. Consider the following curve (a helix):

https://wolfram.com/xid/0b0ks2yfpj6-rw3aai
Construct an orthonormal basis from the first three derivatives by subtracting parallel projections:

https://wolfram.com/xid/0b0ks2yfpj6-bmpr4j

Ensure that the basis is right-handed:

https://wolfram.com/xid/0b0ks2yfpj6-dly7ni
Compute the curvature, κ, and torsion, τ, which quantify how the curve bends:

https://wolfram.com/xid/0b0ks2yfpj6-mufwwo

Verify the answers using FrenetSerretSystem:

https://wolfram.com/xid/0b0ks2yfpj6-k6so6w

Visualize the curve and the associated moving basis, also called a frame:

https://wolfram.com/xid/0b0ks2yfpj6-jzj4v4
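The frame construction can be mirrored numerically. The sketch below uses a hypothetical helix with unit radius and pitch, hand-coded derivatives, and Gram-Schmidt by repeated projection; curvature and torsion are then read off from the frame:

```python
import numpy as np

def proj(u, v):
    return (np.dot(v, u) / np.dot(v, v)) * v

t = 0.7  # any parameter value
# Derivatives of the helix r(t) = (cos t, sin t, t)
r1 = np.array([-np.sin(t),  np.cos(t), 1.0])
r2 = np.array([-np.cos(t), -np.sin(t), 0.0])
r3 = np.array([ np.sin(t), -np.cos(t), 0.0])

# Gram-Schmidt on (r1, r2): subtract parallel projections, then normalize
T = r1 / np.linalg.norm(r1)
n = r2 - proj(r2, r1)
N = n / np.linalg.norm(n)
B = np.cross(T, N)                 # right-handed completion of the frame

speed = np.linalg.norm(r1)
kappa = np.dot(r2, N) / speed**2              # curvature
tau   = np.dot(r3, B) / (kappa * speed**3)    # torsion
```

For this helix both curvature and torsion equal 1/2, independent of t.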

Bases and Matrix Decompositions (3)
Apply the Gram–Schmidt process to construct an orthonormal basis from the following vectors:

https://wolfram.com/xid/0b0ks2yfpj6-ipdyf1
The first vector in the orthonormal basis, , is merely the normalized multiple
:

https://wolfram.com/xid/0b0ks2yfpj6-trym6a

For subsequent vectors, components parallel to earlier basis vectors are subtracted prior to normalization:

https://wolfram.com/xid/0b0ks2yfpj6-ez7k6a


https://wolfram.com/xid/0b0ks2yfpj6-4zt373


https://wolfram.com/xid/0b0ks2yfpj6-4wc2on

Confirm the answers using Orthogonalize:

https://wolfram.com/xid/0b0ks2yfpj6-s42jvt
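The whole process can be sketched as a loop of projections and normalizations. This is a generic illustration, not Orthogonalize itself:

```python
import numpy as np

def proj(u, v):
    return (np.vdot(v, u) / np.vdot(v, v)) * v

def gram_schmidt(vectors):
    """Orthonormalize vectors by subtracting projections onto earlier basis vectors."""
    basis = []
    for v in vectors:
        w = np.array(v, dtype=float)
        for b in basis:
            w = w - proj(w, b)
        norm = np.linalg.norm(w)
        if norm > 1e-12:            # skip vectors that are linearly dependent
            basis.append(w / norm)
    return basis
```

The result is an orthonormal set spanning the same space as the input.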

Find an orthonormal basis for the column space of the following matrix , and then use that basis to find a QR factorization of
:

https://wolfram.com/xid/0b0ks2yfpj6-pnviy4

https://wolfram.com/xid/0b0ks2yfpj6-bqnxxw
Define as the
element of the corresponding Gram–Schmidt basis:

https://wolfram.com/xid/0b0ks2yfpj6-ltwu9n
Define as the matrix whose columns are
:

https://wolfram.com/xid/0b0ks2yfpj6-l00a8l


https://wolfram.com/xid/0b0ks2yfpj6-wtfzua


https://wolfram.com/xid/0b0ks2yfpj6-nem31

Compare with the result given by QRDecomposition; the matrices are the same:

https://wolfram.com/xid/0b0ks2yfpj6-1mmu3b

The matrices differ by a transposition because QRDecomposition gives the row-orthonormal result:

https://wolfram.com/xid/0b0ks2yfpj6-ou3qhi
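The Gram-Schmidt route to a QR factorization can be sketched end to end. This is an illustrative reimplementation with a hypothetical matrix, with q built from the orthonormalized columns and r recovered as qᵀa:

```python
import numpy as np

def gram_schmidt_columns(a):
    """Orthonormalize the columns of a by subtracting projections."""
    q = np.zeros_like(a, dtype=float)
    for j in range(a.shape[1]):
        w = a[:, j].astype(float)
        for k in range(j):
            w = w - np.dot(q[:, k], w) * q[:, k]
        q[:, j] = w / np.linalg.norm(w)
    return q

a = np.array([[2.0, 1.0], [2.0, 1.0], [1.0, 5.0]])  # hypothetical full-rank matrix
q = gram_schmidt_columns(a)
r = q.T @ a   # upper triangular: earlier columns have no component along later q's
```

Note that QRDecomposition returns the row-orthonormal transpose of this q.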

For a Hermitian matrix (more generally, any normal matrix), the eigenvectors are orthogonal, and it is conventional to define the projection matrix for each eigenspace as the outer product of a normalized eigenvector with its conjugate. Show that the action of the projection matrices on a general vector is the same as projecting the vector onto the eigenspace for the following matrix:

https://wolfram.com/xid/0b0ks2yfpj6-byga6s


https://wolfram.com/xid/0b0ks2yfpj6-7z4qee

Find the eigenvalues and eigenvectors:

https://wolfram.com/xid/0b0ks2yfpj6-x0om78

Compute the normalized eigenvectors:

https://wolfram.com/xid/0b0ks2yfpj6-71n6kf

Compute the projection matrices:

https://wolfram.com/xid/0b0ks2yfpj6-fa0xbc

Confirm that multiplying a general vector by equals the projection of the vector onto
:

https://wolfram.com/xid/0b0ks2yfpj6-32m80j

Since the form an orthonormal basis, the sum of the
must be the identity matrix:

https://wolfram.com/xid/0b0ks2yfpj6-bkruql

Moreover, the sum of is the original matrix
:

https://wolfram.com/xid/0b0ks2yfpj6-2ay5bk
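The projection-matrix identities above can be checked numerically for any Hermitian matrix. A sketch with a hypothetical symmetric matrix:

```python
import numpy as np

a = np.array([[2.0, 1.0], [1.0, 2.0]])   # hypothetical Hermitian (here real symmetric) matrix
vals, vecs = np.linalg.eigh(a)           # eigh returns orthonormal eigenvectors as columns

# Projection matrix for each eigenspace: outer product of a normalized eigenvector
projs = [np.outer(vecs[:, i], vecs[:, i].conj()) for i in range(2)]

x = np.array([3.0, -1.0])                # a general vector
# Projection of x onto each eigenvector, computed directly from the formula
manual = [(np.vdot(vecs[:, i], x) / np.vdot(vecs[:, i], vecs[:, i])) * vecs[:, i]
          for i in range(2)]
```

The projection matrices sum to the identity, and weighting them by the eigenvalues recovers the original matrix.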

Least Squares and Curve Fitting (3)
If the linear system a.x==b has no solution, the best approximate solution is the least-squares solution. That is the solution to a.x==p, where p is the orthogonal projection of b onto the column space of a. Consider the following a and b:

https://wolfram.com/xid/0b0ks2yfpj6-u38dy5
The linear system is inconsistent:

https://wolfram.com/xid/0b0ks2yfpj6-6tl0df


Find orthogonal vectors that span . First, let
be the first column of
:

https://wolfram.com/xid/0b0ks2yfpj6-qri3vi

Let be a vector in the column space that is perpendicular to
:

https://wolfram.com/xid/0b0ks2yfpj6-7ehbvr

Compute the orthogonal projection of
onto the space spanned by the
:

https://wolfram.com/xid/0b0ks2yfpj6-ba4m4k

Visualize , its projections
onto the
and
:

https://wolfram.com/xid/0b0ks2yfpj6-jtf0rc


https://wolfram.com/xid/0b0ks2yfpj6-oc1jie

Confirm the result using LeastSquares:

https://wolfram.com/xid/0b0ks2yfpj6-qxnruu
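The least-squares workflow above can be sketched with hypothetical data: orthogonalize the columns, project b onto their span, then solve the now-consistent system:

```python
import numpy as np

def proj(u, v):
    return (np.dot(v, u) / np.dot(v, v)) * v

a = np.array([[1.0, 1.0], [1.0, 2.0], [1.0, 3.0]])  # hypothetical inconsistent system a @ x = b
b = np.array([1.0, 2.0, 2.0])

u1 = a[:, 0]
u2 = a[:, 1] - proj(a[:, 1], u1)        # second basis vector, perpendicular to the first
b_hat = proj(b, u1) + proj(b, u2)       # orthogonal projection of b onto the column space
x = np.linalg.solve(a.T @ a, a.T @ b_hat)   # solve the consistent system a @ x = b_hat
```

The solution agrees with a standard least-squares solver, and the residual is orthogonal to the column space.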

Projection can be used to find a best-fit curve to data. Consider the following data:

https://wolfram.com/xid/0b0ks2yfpj6-v42zji

Extract the and
coordinates from the data:

https://wolfram.com/xid/0b0ks2yfpj6-i4v6jo
Let have the columns
and
, so that minimizing
will be fitting to a line
:

https://wolfram.com/xid/0b0ks2yfpj6-pkve26
The following two orthogonal vectors clearly span the same space as the columns of :

https://wolfram.com/xid/0b0ks2yfpj6-7p26fs

Get the coefficients and
for a linear least‐squares fit:

https://wolfram.com/xid/0b0ks2yfpj6-zl66v6

Verify the coefficients using Fit:

https://wolfram.com/xid/0b0ks2yfpj6-480xbe

Plot the best-fit curve along with the data:

https://wolfram.com/xid/0b0ks2yfpj6-wzqna8
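The projection-based line fit above can be sketched in NumPy with hypothetical data and checked against a standard least-squares fit:

```python
import numpy as np

def proj(u, v):
    return (np.dot(v, u) / np.dot(v, v)) * v

# Hypothetical data points (x_i, y_i)
xs = np.array([0.0, 1.0, 2.0, 3.0, 4.0])
ys = np.array([1.1, 1.9, 3.2, 3.8, 5.1])

ones = np.ones_like(xs)
u2 = xs - proj(xs, ones)                # orthogonal to the all-ones column: xs minus its mean
y_hat = proj(ys, ones) + proj(ys, u2)   # projection of ys onto span{1, x}

# Recover the line y = b + a x from the projected values
a_mat = np.column_stack([ones, xs])
b_coef, a_coef = np.linalg.solve(a_mat.T @ a_mat, a_mat.T @ y_hat)
```

Since y_hat lies in the column space, the recovered coefficients match an ordinary linear fit.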

Find the best-fit parabola to the following data:

https://wolfram.com/xid/0b0ks2yfpj6-x7pawr

Extract the and
coordinates from the data:

https://wolfram.com/xid/0b0ks2yfpj6-w5dwto
Let have the columns
,
and
, so that minimizing
will be fitting to
:

https://wolfram.com/xid/0b0ks2yfpj6-bmqc8e
Construct orthonormal vectors that have the same column space as
:

https://wolfram.com/xid/0b0ks2yfpj6-pdqhns
Get the coefficients ,
and
for a least‐squares fit:

https://wolfram.com/xid/0b0ks2yfpj6-z5bw8q

Verify the coefficients using Fit:

https://wolfram.com/xid/0b0ks2yfpj6-pp5xfy

Plot the best-fit curve along with the data:

https://wolfram.com/xid/0b0ks2yfpj6-04d0qo

General Inner Products and Function Spaces (5)
A positive-definite, real symmetric matrix or metric m defines an inner product by u.m.v:

https://wolfram.com/xid/0b0ks2yfpj6-xs2xkr


https://wolfram.com/xid/0b0ks2yfpj6-i29f9p
Being positive-definite means that the associated quadratic form is positive for any nonzero vector:

https://wolfram.com/xid/0b0ks2yfpj6-f5yka3

Note that Dot itself is the inner product associated with the identity matrix:

https://wolfram.com/xid/0b0ks2yfpj6-vw5ltx

Apply the Gram–Schmidt process to the standard basis to obtain an orthonormal basis:

https://wolfram.com/xid/0b0ks2yfpj6-9jtyba

Confirm that this basis is orthonormal with respect to the inner product :

https://wolfram.com/xid/0b0ks2yfpj6-mkuz95
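The metric-based Gram-Schmidt above follows the same pattern with the modified inner product. A sketch with a hypothetical 2x2 metric:

```python
import numpy as np

m = np.array([[2.0, 1.0], [1.0, 2.0]])   # hypothetical positive-definite symmetric metric

def inner(u, v):
    return u @ m @ v                      # inner product u.m.v

def proj(u, v):
    return (inner(v, u) / inner(v, v)) * v

# Gram-Schmidt on the standard basis with respect to this inner product
e1 = np.array([1.0, 0.0]); e2 = np.array([0.0, 1.0])
b1 = e1 / np.sqrt(inner(e1, e1))
w  = e2 - proj(e2, b1)
b2 = w / np.sqrt(inner(w, w))
```

The resulting basis is orthonormal with respect to the metric inner product, though not with respect to Dot.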

Fourier series are projections onto a particular basis in an inner product space of square-integrable functions. Define the standard inner product on square-integrable functions:

https://wolfram.com/xid/0b0ks2yfpj6-m3ag2t
Let denote
for different integer values of
:

https://wolfram.com/xid/0b0ks2yfpj6-0n2n8v

The are orthogonal to each other, though not orthonormal:

https://wolfram.com/xid/0b0ks2yfpj6-n9fsir

The Fourier series of a function is the projection of
onto the space spanned by the
:

https://wolfram.com/xid/0b0ks2yfpj6-v8809a

https://wolfram.com/xid/0b0ks2yfpj6-q1bjsz

Confirm the result using FourierSeries:

https://wolfram.com/xid/0b0ks2yfpj6-dq5j3c

Moreover, the projection coefficient equals the Fourier coefficient corresponding to FourierParameters -> {-1, 1}:

https://wolfram.com/xid/0b0ks2yfpj6-21rv87

Confirm using FourierCoefficient:

https://wolfram.com/xid/0b0ks2yfpj6-5qfjt2
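The function-space projection can be sketched numerically. This illustration makes its own assumptions: the interval [0, 2*pi], basis functions exp(i n x), and an inner product approximated by a midpoint sum; the normalization of the original example may differ:

```python
import numpy as np

n_grid = 100_000
xs = (np.arange(n_grid) + 0.5) * 2 * np.pi / n_grid   # midpoint grid on [0, 2*pi]

def inner(f, g):
    """Approximate (1/2pi) * integral of conj(f) g over [0, 2*pi] by a midpoint sum."""
    return np.mean(np.conj(f(xs)) * g(xs))

def e(n):
    return lambda x: np.exp(1j * n * x)   # basis function exp(i n x)

f = lambda x: x + 0j                      # the function to expand
# Fourier coefficients as projection coefficients inner(e_n, f) / inner(e_n, e_n)
coeffs = {n: inner(e(n), f) / inner(e(n), e(n)) for n in range(-2, 3)}
```

For f(x) = x on this interval, the coefficient for n = 0 is pi and for n != 0 it is i/n, matching the classical series.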

Unnormalized Gram–Schmidt algorithm:

https://wolfram.com/xid/0b0ks2yfpj6-erx85
Do Gram–Schmidt on a random set of three vectors:

https://wolfram.com/xid/0b0ks2yfpj6-ex0u6

Verify orthogonality; as the vectors are not normalized, the result is a general diagonal matrix:

https://wolfram.com/xid/0b0ks2yfpj6-zb7hp

Use a positive-definite, real symmetric matrix to define a complex inner product:

https://wolfram.com/xid/0b0ks2yfpj6-o2jfxd


https://wolfram.com/xid/0b0ks2yfpj6-ouvlbc
Do Gram–Schmidt on a random set of three complex vectors:

https://wolfram.com/xid/0b0ks2yfpj6-n7xyh5


https://wolfram.com/xid/0b0ks2yfpj6-s3vel7

LegendreP defines a family of orthogonal polynomials with respect to the inner product Integrate[f[x] g[x], {x, -1, 1}]. Apply the unnormalized Gram–Schmidt process to the monomials x^n for n from zero through four to compute scalar multiples of the first five Legendre polynomials:

https://wolfram.com/xid/0b0ks2yfpj6-kbi0j8

Compare to the conventional Legendre polynomials:

https://wolfram.com/xid/0b0ks2yfpj6-8w9rq9

For each ,
and
differ by a constant multiple, which can be shown to equal
:

https://wolfram.com/xid/0b0ks2yfpj6-kjvah0


https://wolfram.com/xid/0b0ks2yfpj6-l72txw

Compare with an explicit expression for the orthonormalized polynomials:

https://wolfram.com/xid/0b0ks2yfpj6-b02j3e
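The unnormalized Gram-Schmidt computation can be reproduced exactly with rational arithmetic. The sketch below hand-codes the integral of x^k over [-1, 1] (zero for odd k, 2/(k+1) for even k) and recovers monic multiples of the Legendre polynomials:

```python
from fractions import Fraction

def poly_mul(p, q):
    """Multiply polynomials given as ascending coefficient lists."""
    r = [Fraction(0)] * (len(p) + len(q) - 1)
    for i, a in enumerate(p):
        for j, b in enumerate(q):
            r[i + j] += a * b
    return r

def inner(p, q):
    """Exact inner product: integral of p(x) q(x) over [-1, 1]; odd powers vanish."""
    return sum(2 * c / (k + 1) for k, c in enumerate(poly_mul(p, q)) if k % 2 == 0)

basis = []
for n in range(5):
    p = [Fraction(0)] * n + [Fraction(1)]           # the monomial x^n
    for b in basis:
        coef = inner(b, p) / inner(b, b)
        bp = b + [Fraction(0)] * (len(p) - len(b))  # pad to matching length
        p = [pa - coef * ba for pa, ba in zip(p, bp)]
    basis.append(p)
```

The results are the monic Legendre multiples, e.g. x^2 - 1/3 and x^3 - 3x/5, orthogonal under this inner product.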

HermiteH defines a family of orthogonal polynomials with respect to the inner product Integrate[f[x] g[x] Exp[-x^2], {x, -Infinity, Infinity}]. Apply the unnormalized Gram–Schmidt process to the monomials x^n for n from zero through four to compute scalar multiples of the first five Hermite polynomials:

https://wolfram.com/xid/0b0ks2yfpj6-kwiwyi

Compared to the conventional Hermite polynomials, each result is smaller by a factor of 2^n:

https://wolfram.com/xid/0b0ks2yfpj6-guyez7

The orthonormal polynomials differ by a multiple of in the denominator:

https://wolfram.com/xid/0b0ks2yfpj6-b7411l

Compare with an explicit expression for the orthonormalized polynomials:

https://wolfram.com/xid/0b0ks2yfpj6-dxjc6b

Quantum Mechanics (3)
In quantum mechanics, states are represented by complex unit vectors and physical quantities by Hermitian linear operators. The eigenvalues represent possible observations, and the squared norms of the projections onto the eigenvectors give the probabilities of those observations. For the spin operator and state given, find the possible observations and their probabilities:

https://wolfram.com/xid/0b0ks2yfpj6-g2odzx
Computing the eigensystem, the possible observations are :

https://wolfram.com/xid/0b0ks2yfpj6-evxncl

The relative probabilities are for
and
for
:

https://wolfram.com/xid/0b0ks2yfpj6-nabjdf
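The observation/probability recipe can be sketched for a hypothetical spin operator and state (stand-ins for the ones in the example):

```python
import numpy as np

# Hypothetical Hermitian operator and normalized state
s = np.array([[0.0, 1.0], [1.0, 0.0]])       # a Pauli-x-like spin operator
psi = np.array([1.0, 1j]) / np.sqrt(2.0)

vals, vecs = np.linalg.eigh(s)               # possible observations = eigenvalues
# Probability of each observation = squared norm of the projection onto the eigenvector
probs = np.abs(vecs.conj().T @ psi) ** 2
```

Because the state is a unit vector and the eigenvectors form an orthonormal basis, the probabilities sum to 1.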

In quantum mechanics, the energy operator is called the Hamiltonian , and a state with energy
evolves according to the Schrödinger equation
. Given the Hamiltonian for a spin-1 particle in a constant magnetic field in the
direction, find the state at time
of a particle that is initially in the state
representing
:

https://wolfram.com/xid/0b0ks2yfpj6-f62azb
Computing the eigensystem, the energy levels are and
:

https://wolfram.com/xid/0b0ks2yfpj6-u0xcpc

The state at time is the sum of each eigenstate evolving according to the Schrödinger equation:

https://wolfram.com/xid/0b0ks2yfpj6-21l38w

For the Hamiltonian , the
eigenvector is a function that is a constant multiple of
, and the inner product on vectors is
. For a particle in the state
, find the probability that it is in one of the first four eigenstates. First, define an inner product:

https://wolfram.com/xid/0b0ks2yfpj6-gearsp
Confirm that is a unit vector in this inner product:

https://wolfram.com/xid/0b0ks2yfpj6-t4kixa

Project onto the first four states; for and
, the projection and hence probability is zero:

https://wolfram.com/xid/0b0ks2yfpj6-0dwaj0

The probability is given by the squared norm of the projection. For , it is just under 90%:

https://wolfram.com/xid/0b0ks2yfpj6-l8wtgc


https://wolfram.com/xid/0b0ks2yfpj6-o0gdfu


https://wolfram.com/xid/0b0ks2yfpj6-llh39l


https://wolfram.com/xid/0b0ks2yfpj6-mm03wi

Properties & Relations (8): Properties of the function, and connections to other functions
The projection of u onto v is in the direction of v:

https://wolfram.com/xid/0b0ks2yfpj6-g9azm5

https://wolfram.com/xid/0b0ks2yfpj6-c2e84y

The projection of v onto itself is v:

https://wolfram.com/xid/0b0ks2yfpj6-rz4b6o

https://wolfram.com/xid/0b0ks2yfpj6-u3hj8a

For ordinary vectors u and v, the projection is taken to be (v*.u/v*.v) v, where v* is Conjugate[v]:

https://wolfram.com/xid/0b0ks2yfpj6-go0fd1

https://wolfram.com/xid/0b0ks2yfpj6-2acz4l


https://wolfram.com/xid/0b0ks2yfpj6-cx4xr0

If u and v have real entries, Norm[Projection[u,v]] equals Norm[u] Abs[Cos[θ]], where θ is the angle between u and v:

https://wolfram.com/xid/0b0ks2yfpj6-9bbyeo

For vectors u and v, u-Projection[u,v] is orthogonal to v:

https://wolfram.com/xid/0b0ks2yfpj6-1anomc

Orthogonalize can be implemented by repeated application of Projection and Normalize:

https://wolfram.com/xid/0b0ks2yfpj6-ilp8x

For ordinary real vectors u and v, the projection can be computed as (u.Normalize[v]) Normalize[v]:

https://wolfram.com/xid/0b0ks2yfpj6-svibl7

https://wolfram.com/xid/0b0ks2yfpj6-84vqns

The projection of u onto v is equivalent to multiplication by an outer product matrix:

https://wolfram.com/xid/0b0ks2yfpj6-s7rhns

https://wolfram.com/xid/0b0ks2yfpj6-02026p

https://wolfram.com/xid/0b0ks2yfpj6-6mltdw

Wolfram Research (2007), Projection, Wolfram Language function, https://reference.wolfram.com/language/ref/Projection.html (updated 2014).
Text
Wolfram Research (2007), Projection, Wolfram Language function, https://reference.wolfram.com/language/ref/Projection.html (updated 2014).
CMS
Wolfram Language. 2007. "Projection." Wolfram Language & System Documentation Center. Wolfram Research. Last Modified 2014. https://reference.wolfram.com/language/ref/Projection.html.
APA
Wolfram Language. (2007). Projection. Wolfram Language & System Documentation Center. Retrieved from https://reference.wolfram.com/language/ref/Projection.html
BibTeX
@misc{reference.wolfram_2025_projection, author="Wolfram Research", title="{Projection}", year="2014", howpublished="\url{https://reference.wolfram.com/language/ref/Projection.html}", note="Accessed: 31-May-2025"}
BibLaTeX
@online{reference.wolfram_2025_projection, organization={Wolfram Research}, title={Projection}, year={2014}, url={https://reference.wolfram.com/language/ref/Projection.html}, note={Accessed: 31-May-2025}}