I am currently working my way through Poole's Linear Algebra, 4th Edition, and I am hitting a wall with a particular example in the chapter on least squares solutions. The line $y=a+bx$ that "best fits" the data points $(1,2)$, $(2,2)$, and $(3,4)$ can be related to the (inconsistent) system of linear equations $$a+b=2$$ $$a+2b=2$$ $$a+3b=4$$ with matrix representation $$A\mathbf{x}=\begin{bmatrix}1&1\\1&2\\1&3\\\end{bmatrix}\begin{bmatrix}a\\b\\\end{bmatrix}=\begin{bmatrix}2\\2\\4\\\end{bmatrix}=\mathbf{b}$$ Using the least squares theorem, Poole shows that the least squares solution of the system is $$\overline{\mathbf{x}}=\left(A^T A \right)^{-1} A^T \mathbf{b}=\left(\begin{bmatrix}3&6\\6&14\\\end{bmatrix}\right)^{-1}\begin{bmatrix}8\\18\\\end{bmatrix}=\begin{bmatrix}\frac{7}{3}&-1\\-1&\frac{1}{2}\\\end{bmatrix}\begin{bmatrix}8\\18\\\end{bmatrix}=\begin{bmatrix} \frac{2}{3}\\1\\\end{bmatrix}$$ so that the desired line has the equation $y=a+bx=\frac{2}{3} +x$. The components of the vector $\overline{\mathbf{x}}$ can also be interpreted as the coefficients of the columns of $A$ in the linear combination of the columns of $A$ that produces the projection of $\mathbf{b}$ onto the column space of $A$ [which the Best Approximation Theorem identifies as the best approximation to $\mathbf{b}$ in the subspace $\mathrm{col}(A)$].
In other words, the projection of $\mathbf{b}$ onto $\mathrm{col}(A)$ can be found from the coefficients of $\overline{\mathbf{x}}$ by $$\mathrm{proj}_{\mathrm{col}(A)}(\mathbf{b})=\frac{2}{3}\begin{bmatrix}1\\1\\1\\\end{bmatrix}+1\begin{bmatrix}1\\2\\3\\\end{bmatrix}=\begin{bmatrix}\frac{5}{3}\\\frac{8}{3}\\\frac{11}{3}\\\end{bmatrix}$$ But when I try to calculate $\mathrm{proj}_{\mathrm{col}(A)}(\mathbf{b})$ directly [taking $\mathbf{a}_{1}$ and $\mathbf{a}_{2}$ to be the first and second columns of $A$, respectively], I get $$\mathrm{proj}_{\mathrm{col}(A)}(\mathbf{b})=\left(\frac{\mathbf{a}_{1}\cdot\mathbf{b}}{\mathbf{a}_{1}\cdot\mathbf{a}_{1}}\right)\mathbf{a}_{1}+\left(\frac{\mathbf{a}_{2}\cdot\mathbf{b}}{\mathbf{a}_{2}\cdot\mathbf{a}_{2}}\right)\mathbf{a}_{2}=\left(\frac{\begin{bmatrix}1\\1\\1\\\end{bmatrix}\cdot\begin{bmatrix}2\\2\\4\\\end{bmatrix}}{\begin{bmatrix}1\\1\\1\\\end{bmatrix}\cdot\begin{bmatrix}1\\1\\1\\\end{bmatrix}}\right)\begin{bmatrix}1\\1\\1\\\end{bmatrix}+\left(\frac{\begin{bmatrix}1\\2\\3\\\end{bmatrix}\cdot\begin{bmatrix}2\\2\\4\\\end{bmatrix}}{\begin{bmatrix}1\\2\\3\\\end{bmatrix}\cdot\begin{bmatrix}1\\2\\3\\\end{bmatrix}}\right)\begin{bmatrix}1\\2\\3\\\end{bmatrix}$$ $$=\frac{8}{3}\begin{bmatrix}1\\1\\1\\\end{bmatrix}+\frac{18}{14}\begin{bmatrix}1\\2\\3\\\end{bmatrix}=\begin{bmatrix}\frac{8}{3}\\\frac{8}{3}\\\frac{8}{3}\\\end{bmatrix}+\begin{bmatrix}\frac{9}{7}\\\frac{18}{7}\\\frac{27}{7}\\\end{bmatrix}=\begin{bmatrix}\frac{83}{21}\\\frac{110}{21}\\\frac{137}{21}\\\end{bmatrix}$$ I am quite confident that my calculation is incorrect, for a number of reasons. 
For example, when I take the component of $\mathbf{b}$ orthogonal to $\mathrm{col}(A)$ $$\mathrm{perp}_{\mathrm{col}(A)}(\mathbf{b})=\mathbf{b}-\mathrm{proj}_{\mathrm{col}(A)}(\mathbf{b})=\begin{bmatrix}2\\2\\4\\\end{bmatrix}-\begin{bmatrix}\frac{83}{21}\\\frac{110}{21}\\\frac{137}{21}\\\end{bmatrix}=\begin{bmatrix}-\frac{41}{21}\\-\frac{68}{21}\\-\frac{53}{21}\\\end{bmatrix}$$ I get a vector that is not perpendicular to either $\mathbf{a}_{1}$ or $\mathbf{a}_{2}$, indicating that this vector is not in the orthogonal complement of $\mathrm{col}(A)$. Can somebody help me identify where I'm going wrong in my attempt to calculate the projection of $\mathbf{b}$ onto $\mathrm{col}(A)$?
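For anyone who wants to reproduce the numbers, both computations can be checked numerically; here is a sketch using NumPy (the variable names are mine):

```python
import numpy as np

A = np.array([[1.0, 1.0],
              [1.0, 2.0],
              [1.0, 3.0]])
b = np.array([2.0, 2.0, 4.0])

# Least squares solution from the normal equations: (A^T A) x = A^T b
x_bar = np.linalg.solve(A.T @ A, A.T @ b)   # [2/3, 1]

# Projection of b onto col(A) as a combination of the columns of A
proj = A @ x_bar                            # [5/3, 8/3, 11/3]

# The column-by-column computation attempted above
a1, a2 = A[:, 0], A[:, 1]
wrong = (a1 @ b) / (a1 @ a1) * a1 + (a2 @ b) / (a2 @ a2) * a2
# wrong = [83/21, 110/21, 137/21], which disagrees with proj

# Its residual is not orthogonal to the columns of A
r = b - wrong
print(a1 @ r, a2 @ r)   # both nonzero
```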
Where am I going wrong in calculating the projection of a vector onto a subspace?
Asked by Bumbble Comm (https://math.techqa.club/user/bumbble-comm/detail) on 2026-03-26; 130 views; 2 answers.

First answer (a self-answer by Bumbble Comm):
Aaand I wasn’t using an orthogonal basis for the subspace. The columns of $A$ are linearly independent, which guarantees that $A^TA$ is invertible and the least squares solution is unique, but they are not orthogonal, which explains why my calculation of the projection of the vector $\mathbf{b}$ onto the column space of $A$ yielded an incorrect result. Applying the Gram-Schmidt Process to the columns of $A$ produces an orthogonal basis for $\mathrm{col}(A)$, which can then be used to calculate the projection.
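To make this concrete, here is a short sketch (mine, not from Poole) that orthogonalizes the columns of $A$ with Gram-Schmidt and recovers the projection $\left(\frac{5}{3},\frac{8}{3},\frac{11}{3}\right)$:

```python
import numpy as np

a1 = np.array([1.0, 1.0, 1.0])
a2 = np.array([1.0, 2.0, 3.0])
b = np.array([2.0, 2.0, 4.0])

# Gram-Schmidt: orthogonalize a2 against a1
v1 = a1
v2 = a2 - (a2 @ v1) / (v1 @ v1) * v1   # v2 = (-1, 0, 1)

# With an orthogonal basis {v1, v2}, the term-by-term projection formula is valid
proj = (b @ v1) / (v1 @ v1) * v1 + (b @ v2) / (v2 @ v2) * v2
print(proj)   # [5/3, 8/3, 11/3]
```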
Second answer:
The column space of $A$, namely $U$, is the span of the vectors $\mathbf{a_1}:=(1,1,1)$ and $\mathbf{a_2}:=(1,2,3)$ in $\Bbb R ^3$, and for $\mathbf{b}:=(2,2,4)$ you want to calculate the orthogonal projection of $\mathbf{b}$ onto $U$; this is done by $$ \operatorname{proj}_U \mathbf{b}=\langle \mathbf{b},\mathbf{e_1} \rangle \mathbf{e_1}+\langle \mathbf{b},\mathbf{e_2} \rangle \mathbf{e_2}\tag1 $$ where $\mathbf{e_1}$ and $\mathbf{e_2}$ form an orthonormal basis of $U$ and $\langle \mathbf{v},\mathbf{w} \rangle:=v_1w_1+v_2w_2+v_3 w_3$ is the Euclidean dot product in $\Bbb R ^3$, for $\mathbf{v}:=(v_1,v_2,v_3)$ and $\mathbf{w}:=(w_1,w_2,w_3)$ any vectors in $\Bbb R ^3$.
Then you only need to find an orthonormal basis of $U$; you can create one from $\mathbf{a_1}$ and $\mathbf{a_2}$ using the Gram-Schmidt procedure, that is $$ \mathbf{e_1}:=\frac{\mathbf{a_1}}{\|\mathbf{a_1}\|}\quad \text{ and }\quad \mathbf{e_2}:=\frac{\mathbf{a_2}-\langle \mathbf{a_2},\mathbf{e_1} \rangle \mathbf{e_1}}{\|\mathbf{a_2}-\langle \mathbf{a_2},\mathbf{e_1} \rangle \mathbf{e_1}\|}\tag2 $$ where $\|{\cdot}\|$ is the Euclidean norm in $\Bbb R ^3$, defined by $\|\mathbf{v}\|:=\sqrt{\langle \mathbf{v},\mathbf{v} \rangle}=\sqrt{v_1^2+v_2^2+v_3^2}$.
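For the vectors at hand, carrying out $(2)$ gives (these intermediate numbers are my own check, not part of the original answer) $$ \mathbf{e_1}=\frac{1}{\sqrt{3}}(1,1,1),\qquad \mathbf{a_2}-\langle \mathbf{a_2},\mathbf{e_1} \rangle \mathbf{e_1}=(1,2,3)-2\,(1,1,1)=(-1,0,1),\qquad \mathbf{e_2}=\frac{1}{\sqrt{2}}(-1,0,1) $$ and then $(1)$ yields $$ \operatorname{proj}_U \mathbf{b}=\frac{8}{\sqrt{3}}\,\mathbf{e_1}+\frac{2}{\sqrt{2}}\,\mathbf{e_2}=\frac{8}{3}(1,1,1)+(-1,0,1)=\left(\frac{5}{3},\frac{8}{3},\frac{11}{3}\right), $$ in agreement with the least squares computation.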
Your mistake is that you assumed that $$ \operatorname{proj}_U\mathbf{b}=\frac{\langle \mathbf{b},\mathbf{a_1} \rangle}{\|\mathbf{a_1}\|^2}\mathbf{a_1}+ \frac{\langle \mathbf{b},\mathbf{a_2} \rangle}{\|\mathbf{a_2}\|^2}\mathbf{a_2}\tag3 $$ however, this term-by-term formula is valid only when $\mathbf{a_1}$ and $\mathbf{a_2}$ are orthogonal, which here they are not.
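As a side note (my addition, not part of the answer above): since the columns of $A$ are linearly independent, Gram-Schmidt can be bypassed entirely by using the projection matrix $P=A(A^TA)^{-1}A^T$, so that $\operatorname{proj}_U \mathbf{b}=P\mathbf{b}=A\overline{\mathbf{x}}$. A sketch in NumPy:

```python
import numpy as np

A = np.array([[1.0, 1.0],
              [1.0, 2.0],
              [1.0, 3.0]])
b = np.array([2.0, 2.0, 4.0])

# Projection matrix onto col(A): P = A (A^T A)^{-1} A^T
P = A @ np.linalg.inv(A.T @ A) @ A.T
proj = P @ b              # [5/3, 8/3, 11/3]

# The residual is orthogonal to both columns of A, as it should be
print(A.T @ (b - proj))   # approximately [0, 0]
```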