How does Modified Gram-Schmidt work? I want to use it, but I am confused by the notation and I could not find any example online.
Modified Gram-Schmidt
First, let's establish Gram-Schmidt (sometimes called classical GS) to be clear.
We use GS because we wish to solve the system $A \overrightarrow{x} = \overrightarrow{b}$: we want to compute $\overrightarrow{x}$ such that $||\overrightarrow{r}||_2$ is minimized, where $\overrightarrow{r} = A\overrightarrow{x} - \overrightarrow{b}$.
One way is GS, where we factor $A = QR$ such that $Q^TQ = I$, where $I$ is the $n \times n$ identity matrix and $R$ is an $n \times n$ upper triangular matrix.
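(A standard consequence, added here for context: since $Q^TQ = I$, minimizing $||A\overrightarrow{x} - \overrightarrow{b}||_2$ with $A = QR$ reduces to solving the triangular system $$R\overrightarrow{x} = Q^T\overrightarrow{b},$$ which can be done by back substitution.)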
Our goal is to find $$\{ \overrightarrow{q}_1 , \overrightarrow{q}_2 , \cdots, \overrightarrow{q}_n \}$$ such that $$\operatorname{span} \{ \overrightarrow{q}_1 , \overrightarrow{q}_2 , \cdots, \overrightarrow{q}_n \} = \operatorname{span} \{\overrightarrow{a}_1 , \overrightarrow{a}_2 , \cdots, \overrightarrow{a}_n \}$$
Also, let us write the dot product as $$<a,b>$$
Then we want
$$<\overrightarrow{q}_i , \overrightarrow{q}_j > \; = 0 \quad \text{if} \quad i \neq j$$
$$<\overrightarrow{q}_i , \overrightarrow{q}_j > \; = 1 \quad \text{if} \quad i = j$$
Suppose $\{ \overrightarrow{q}_1 , \overrightarrow{q}_2 , \cdots, \overrightarrow{q}_r \}$ are already found, then we can find $\overrightarrow{q}_{r+1}$ by the following
$$\overrightarrow{q}_{r+1} = \overrightarrow{a}_{r+1} - <\overrightarrow{a}_{r+1},\overrightarrow{q}_1>\overrightarrow{q}_1 - <\overrightarrow{a}_{r+1},\overrightarrow{q}_2>\overrightarrow{q}_2 - \cdots - <\overrightarrow{a}_{r+1},\overrightarrow{q}_r>\overrightarrow{q}_r$$
In other words, the above is the generic step for computing $\overrightarrow{q}_{r+1}$; the resulting vector still has to be normalized (divided by its $2$-norm), as we do in the example below.
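As a rough illustration (not part of the original answer), here is a minimal NumPy sketch of this classical procedure; the function name and structure are my own, and it assumes the columns of $A$ are linearly independent:

```python
import numpy as np

def classical_gram_schmidt(A):
    """Classical Gram-Schmidt QR factorization of a matrix A
    whose columns are assumed linearly independent."""
    m, n = A.shape
    Q = np.zeros((m, n))
    R = np.zeros((n, n))
    for j in range(n):
        v = A[:, j].astype(float)
        for i in range(j):
            # Classical GS: every inner product uses the ORIGINAL column a_j.
            R[i, j] = Q[:, i] @ A[:, j]
            v = v - R[i, j] * Q[:, i]
        # Normalize the result, exactly as in the hand computation below.
        R[j, j] = np.linalg.norm(v)
        Q[:, j] = v / R[j, j]
    return Q, R
```

Calling it on the $3 \times 3$ example below reproduces the $Q$ and $R$ found by hand.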
Let's look at an example:
$$ \overrightarrow{a}_1 = \begin{pmatrix} 1 \\ 2 \\ 2 \end{pmatrix}$$
$$ \overrightarrow{a}_2 = \begin{pmatrix} -1 \\ 0 \\ 2 \end{pmatrix}$$
$$ \overrightarrow{a}_3 = \begin{pmatrix} 0 \\ 0 \\ 1 \end{pmatrix}$$
It is important to note that this process only works if our columns are linearly independent. For a square matrix $A$ you can check this with the determinant: if $\det A \neq 0$, the columns are linearly independent.
$$\det \begin{pmatrix} 1 & -1 & 0 \\ 2 & 0 & 0 \\ 2 & 2 & 1 \end{pmatrix} = 2$$
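If you want to double-check this step numerically (a one-off illustration, not from the original answer):

```python
import numpy as np

A = np.array([[1., -1., 0.],
              [2.,  0., 0.],
              [2.,  2., 1.]])
print(np.linalg.det(A))  # ~2.0, nonzero, so the columns are linearly independent
```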
We are good, so let's continue.
$$r_{11} = ||\overrightarrow{a}_1 ||_2 = (1^2 + 2^2 +2^2)^{\frac{1}{2}} = 3$$
$$\overrightarrow{q}_1 = \frac{\overrightarrow{a}_1}{r_{11}} = \frac{1}{3} \begin{pmatrix} 1 \\ 2\\ 2 \end{pmatrix}$$
$$\overrightarrow{q}_2 = \overrightarrow{a}_2 - <\overrightarrow{a}_2, \overrightarrow{q}_1> \overrightarrow{q}_1$$
$$\overrightarrow{q}_2 = \begin{pmatrix} -1 \\ 0 \\ 2 \end{pmatrix} - \left(\frac{-1}{3} + (2) \left( \frac{2}{3} \right) \right) \frac{1}{3} \begin{pmatrix} 1 \\ 2 \\ 2 \end{pmatrix}$$
$$\overrightarrow{q}_2 = \begin{pmatrix} \frac{-4}{3} \\ \frac{-2}{3} \\ \frac{4}{3} \end{pmatrix}$$
Now $\overrightarrow{q}_2$ needs to be normalized.
$$\overrightarrow{q}_2 = \frac{\overrightarrow{q}_2}{||\overrightarrow{q}_2||_2} = \begin{pmatrix} \frac{-2}{3} \\ \frac{-1}{3} \\ \frac{2}{3} \end{pmatrix}$$
The $\overrightarrow{q}_2$ on the left-hand side is our new (normalized) $\overrightarrow{q}_2$; it is the one used from here on.
Now we need to solve for $\overrightarrow{q}_3$.
$$\overrightarrow{q}_3 = \overrightarrow{a}_3 - < \overrightarrow{a}_3 , \overrightarrow{q}_1 > \overrightarrow{q}_1 - <\overrightarrow{a}_3 , \overrightarrow{q}_2> \overrightarrow{q}_2$$
$$\overrightarrow{q}_3 = \begin{pmatrix} 0 \\ 0\\ 1 \end{pmatrix} - \frac{2}{3} \begin{pmatrix} \frac{1}{3} \\ \frac{2}{3} \\ \frac{2}{3} \end{pmatrix} - \frac{2}{3} \begin{pmatrix} \frac{-2}{3} \\ \frac{-1}{3} \\ \frac{2}{3} \end{pmatrix} $$
$$\overrightarrow{q}_3 = \begin{pmatrix} \frac{2}{9} \\ \frac{-2}{9} \\ \frac{1}{9} \end{pmatrix}$$
Once again, it needs to be normalized. $\overrightarrow{q}_3$ becomes
$$\overrightarrow{q}_3 = \frac{\overrightarrow{q}_3}{||\overrightarrow{q}_3||_2} = \begin{pmatrix} \frac{2}{3} \\ \frac{-2}{3} \\ \frac{1}{3} \end{pmatrix}$$
Now $r_{ij} = < \overrightarrow{a}_j , \overrightarrow{q}_i>$ when $i < j$ (the entries below the diagonal are zero), and $r_{ii}$ is the $2$-norm of the $i$-th vector just before it is normalized (for $i = 1$ this is $||\overrightarrow{a}_1||_2$).
With this knowledge we find $r_{12} = 1$, $r_{22} = 2$, $r_{13} = \frac{2}{3}$, $r_{23} = \frac{2}{3}$, and $r_{33} = \frac{1}{3}$
Now we have all the pieces and parts.
$$ Q = \begin{pmatrix} \frac{1}{3} & \frac{-2}{3} & \frac{2}{3} \\ \frac{2}{3} & \frac{-1}{3} & \frac{-2}{3} \\ \frac{2}{3} & \frac{2}{3} & \frac{1}{3} \end{pmatrix}$$
and
$$ R = \begin{pmatrix} 3 & 1 & \frac{2}{3} \\ 0 & 2 & \frac{2}{3} \\ 0 & 0 & \frac{1}{3} \end{pmatrix}$$
Remember that we wanted $A = QR$, so now we have
$$ A = \begin{pmatrix} \frac{1}{3} & \frac{-2}{3} & \frac{2}{3} \\ \frac{2}{3} & \frac{-1}{3} & \frac{-2}{3} \\ \frac{2}{3} & \frac{2}{3} & \frac{1}{3} \end{pmatrix} \begin{pmatrix} 3 & 1 & \frac{2}{3} \\ 0 & 2 & \frac{2}{3} \\ 0 & 0 & \frac{1}{3} \end{pmatrix}$$
It's a good idea to check that $QR$ actually equals $A$ (ours does). At this point you can use the QR factorization to solve the original system.
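A minimal sketch of that check, and of the final solve step, in NumPy; the right-hand side `b` below is an arbitrary illustrative choice, not one given in the question:

```python
import numpy as np

Q = np.array([[1/3, -2/3,  2/3],
              [2/3, -1/3, -2/3],
              [2/3,  2/3,  1/3]])
R = np.array([[3, 1, 2/3],
              [0, 2, 2/3],
              [0, 0, 1/3]])
A = np.array([[1., -1., 0.],
              [2.,  0., 0.],
              [2.,  2., 1.]])

print(np.allclose(Q @ R, A))             # True: QR reproduces A
print(np.allclose(Q.T @ Q, np.eye(3)))   # True: the columns of Q are orthonormal

b = np.array([1., 0., 1.])               # hypothetical right-hand side
x = np.linalg.solve(R, Q.T @ b)          # solve R x = Q^T b (back substitution would also work)
print(np.allclose(A @ x, b))             # True for this square, invertible A
```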
However, that's not the point of this solution.
Modified Version of Gram-Schmidt:
In classical GS we solve for $\overrightarrow{q}_j$ directly. In modified GS we take multiple steps to get to $\overrightarrow{q}_j$, subtracting one projection at a time from the partially reduced vector. This limits the build-up of floating-point rounding errors when the computation is done on a computer. We start with $$\overrightarrow{q}_j^0 = \overrightarrow{a}_j$$
$$\overrightarrow{q}_j^1 = \overrightarrow{q}_j^0 - <\overrightarrow{q}_j^0 , \overrightarrow{q}_1> \overrightarrow{q}_1$$
$$\overrightarrow{q}_j^2 = \overrightarrow{q}_j^1 - <\overrightarrow{q}_j^1 , \overrightarrow{q}_2> \overrightarrow{q}_2$$
$$ \cdots$$
$$\overrightarrow{q}_j^{j-1} = \overrightarrow{q}_j^{j-2} - <\overrightarrow{q}_j^{j-2} , \overrightarrow{q}_{j-1}> \overrightarrow{q}_{j-1}$$
And as with Classical GS, it needs to be normalized to give a final result of
$$\overrightarrow{q}_j = \frac{\overrightarrow{q}_j^{j-1}}{||\overrightarrow{q}_j^{j-1}||_2}$$
The key to modified GS is that every $\overrightarrow{q}_i$ must be computed this way. The final $\overrightarrow{q}_i$ (note: $\overrightarrow{q}$ with no superscript) plays the same role as in classical GS, and the entries of $R$ are the quantities actually used above, $r_{ij} = <\overrightarrow{q}_j^{i-1}, \overrightarrow{q}_i>$ for $i < j$ and $r_{jj} = ||\overrightarrow{q}_j^{j-1}||_2$; in exact arithmetic these agree with the classical values, so you obtain the same $Q$ and $R$. Again, form $Q$ and $R$ and then solve via the QR factorization.
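To make the difference concrete, here is a minimal NumPy sketch of the modified version, under the same assumptions as the classical sketch above (illustrative code, not from the original answer; the columns of $A$ are assumed linearly independent):

```python
import numpy as np

def modified_gram_schmidt(A):
    """Modified Gram-Schmidt QR factorization of a matrix A
    whose columns are assumed linearly independent."""
    m, n = A.shape
    Q = np.zeros((m, n))
    R = np.zeros((n, n))
    for j in range(n):
        v = A[:, j].astype(float)      # q_j^0 = a_j
        for i in range(j):
            # Modified GS: the inner product uses the CURRENT partially
            # reduced vector v, not the original column a_j.
            R[i, j] = Q[:, i] @ v
            v = v - R[i, j] * Q[:, i]  # q_j^i = q_j^{i-1} - <q_j^{i-1}, q_i> q_i
        R[j, j] = np.linalg.norm(v)
        Q[:, j] = v / R[j, j]          # normalize, as in the final step above
    return Q, R
```

On the well-conditioned $3 \times 3$ example above, both versions return the same $Q$ and $R$; they only differ noticeably in floating-point arithmetic when the columns are nearly dependent.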