Is there a faster way to calculate a pseudo-inverse of a matrix than using SVD that is as numerically stable as using SVD?
2026-03-26
1.9k Views — Asked by Bumbble Comm (https://math.techqa.club/user/bumbble-comm/detail)
There is 1 best solution below.
Since the dominant term in the pseudoinverse is the reciprocal of the smallest nonzero singular value of the matrix, computing it would require stably producing the singular values (or approximate singular values) of the matrix. While this can be done without computing the full SVD, it can never be as stable: the pseudoinverse inverts each nonzero singular value, so a small absolute error $\varepsilon$ in the smallest singular value $\sigma_{\min}$ perturbs $1/\sigma_{\min}$ by roughly $\varepsilon/\sigma_{\min}^2$, which is large when $\sigma_{\min}$ is small.
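A small NumPy sketch of this sensitivity (the matrix and perturbation size are made up for illustration): we build a matrix with a tiny smallest singular value, then perturb that singular value by $\varepsilon = 10^{-8}$ and measure the relative change in the pseudoinverse.

```python
import numpy as np

# Hypothetical 3x2 matrix with singular values 1 and 1e-6.
U, _ = np.linalg.qr(np.random.default_rng(0).normal(size=(3, 3)))
V, _ = np.linalg.qr(np.random.default_rng(1).normal(size=(2, 2)))
s = np.array([1.0, 1e-6])                      # smallest singular value is tiny
A = U[:, :2] @ np.diag(s) @ V.T

# The pseudoinverse inverts the nonzero singular values, so an absolute
# error eps in sigma_min changes 1/sigma_min by about eps/sigma_min^2.
eps = 1e-8
pinv_exact = V @ np.diag(1.0 / s) @ U[:, :2].T
pinv_pert = V @ np.diag(1.0 / (s + eps)) @ U[:, :2].T

rel_err = np.linalg.norm(pinv_pert - pinv_exact) / np.linalg.norm(pinv_exact)
print(rel_err)  # roughly eps/sigma_min = 1e-2, a million times larger than eps
```

A $10^{-8}$ perturbation of the smallest singular value produces a change in the pseudoinverse about $10^6$ times larger, which is why the smallest singular value must be resolved accurately.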
I should note that there are ways to calculate the pseudoinverse without the SVD if you know additional structure. For example, if $A$ is invertible, the pseudoinverse is just the inverse, so you can use an LU factorization. Or if $A$ has full column rank, then $A^+ = (A^\ast A)^{-1} A^\ast$, which you can compute from a QR factorization: with $A = QR$, this simplifies to $A^+ = R^{-1} Q^\ast$.
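A minimal sketch of the full-column-rank case, assuming a randomly generated matrix (which has full column rank with probability 1): from the reduced QR factorization $A = QR$, the pseudoinverse is $R^{-1}Q^T$, which only needs a triangular solve rather than an SVD.

```python
import numpy as np

rng = np.random.default_rng(42)
A = rng.normal(size=(5, 3))          # full column rank with probability 1

# Reduced QR: Q is 5x3 with orthonormal columns, R is 3x3 upper triangular.
Q, R = np.linalg.qr(A)

# A+ = (A^T A)^{-1} A^T = R^{-1} Q^T; solve the triangular system instead
# of forming an explicit inverse.
pinv_qr = np.linalg.solve(R, Q.T)

print(np.allclose(pinv_qr, np.linalg.pinv(A)))
```

This is faster than the SVD route, but it inherits QR's limitation: if $A$ is nearly rank-deficient, $R$ is nearly singular and the triangular solve amplifies errors, consistent with the stability caveat above.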