How to decompose a square symmetric matrix into two diagonalizable matrices provided that one of them is the transpose of the other?

Let $A$ be a square symmetric matrix. I wish to decompose it into two diagonalizable matrices, one of which is the transpose of the other: $$A = M^T M,$$ where $M$ is a diagonalizable matrix and $M^T$ is its transpose. First, I tried decomposing $A$ using PCA (principal component analysis): $$A = V \Sigma V^T.$$ This gives the eigenvector matrix $V$ and the corresponding eigenvalues on the diagonal of $\Sigma$. But then I cannot solve for $M$ via the singular value decomposition (SVD), because I do not have $U$: $$M = U \sqrt{\Sigma}\, V^T.$$

So what is the way out? Is there an exact method for obtaining $U$, and hence $M$, or failing that, a numerical approximation? I would also like to know whether such a decomposition can be applied to any square symmetric matrix, without exception.
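For concreteness, a minimal NumPy sketch of this setup (the test matrix and variable names are only illustrative assumptions): the eigendecomposition plays the role of the PCA step and recovers $V$ and $\Sigma$, but leaves $U$ undetermined.

```python
import numpy as np

# Illustrative symmetric (and positive semidefinite) test matrix.
B = np.random.randn(4, 4)
A = B.T @ B                    # A == A.T

# "PCA" step: eigendecomposition A = V @ diag(sigma) @ V.T
sigma, V = np.linalg.eigh(A)   # eigenvalues and orthonormal eigenvectors

print(np.allclose(A, V @ np.diag(sigma) @ V.T))   # True
# This yields V and sigma, but U in M = U @ sqrt(diag(sigma)) @ V.T is still unknown.
```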
I think I found the way out. As mentioned in my question, I could compute $M$ if I had $U$, because $$M = U \sqrt{\Sigma}\, V^T.$$ In fact, $U$ only needs to be an orthogonal matrix, so I can choose an arbitrary orthogonal matrix and plug it into the SVD-style equation above to obtain $M$. Two remarks: 1) $M$ is not unique. 2) $M^T M = A$, my original square symmetric matrix, because $U^T U = I$: $$M^T M = V \sqrt{\Sigma}\, U^T U \sqrt{\Sigma}\, V^T = V \Sigma V^T = A.$$

So the steps to decompose a square symmetric matrix $A$ into $M^T M$ are: 1) Decompose $A$ using principal component analysis (the eigendecomposition): $$A = V \Sigma V^T.$$ 2) Use the $V$ and $\sqrt{\Sigma}$ matrices to form $\sqrt{\Sigma}\, V^T$. 3) Multiply the matrix from step 2 on the left by an arbitrary orthogonal matrix $U$ to obtain $$M = U \sqrt{\Sigma}\, V^T.$$ Done.

Note that $\sqrt{\Sigma}$ is real only when the eigenvalues of $A$ are nonnegative, i.e. when $A$ is positive semidefinite; a symmetric matrix with a negative eigenvalue cannot be written as $M^T M$ with real $M$, since $M^T M$ is always positive semidefinite.
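Here is a minimal NumPy sketch of these three steps, assuming $A$ is positive semidefinite so that $\sqrt{\Sigma}$ is real; the helper name `decompose_symmetric` and the use of a QR factorization to draw a random orthogonal $U$ are my own choices, not part of the answer above.

```python
import numpy as np

def decompose_symmetric(A, seed=None):
    """Return M with M.T @ M ~= A, for a symmetric positive semidefinite A."""
    rng = np.random.default_rng(seed)

    # Step 1: eigendecomposition A = V @ diag(sigma) @ V.T
    sigma, V = np.linalg.eigh(A)

    # Step 2: form sqrt(Sigma) @ V.T; clip tiny negative round-off so sqrt stays real
    root = np.diag(np.sqrt(np.clip(sigma, 0.0, None))) @ V.T

    # Step 3: left-multiply by an arbitrary orthogonal U (here a random one via QR);
    # any orthogonal U works because U.T @ U = I.
    U, _ = np.linalg.qr(rng.standard_normal(A.shape))
    return U @ root

# Usage: M.T @ M reproduces A even though M itself is not unique.
B = np.random.randn(5, 5)
A = B.T @ B                     # symmetric positive semidefinite
M = decompose_symmetric(A)
print(np.allclose(M.T @ M, A))  # True
```

Any other orthogonal choice of $U$ works equally well: $U = I$ gives $M = \sqrt{\Sigma}\, V^T$, while $U = V$ gives the symmetric square root $M = V \sqrt{\Sigma}\, V^T$.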