Cholesky decomposition of tensor product

Asked by a06e (https://math.techqa.club/user/a06e/detail) · 874 views · 2025-01-13

Let $A\in\mathbb R^{n\times n}$ and $B\in\mathbb R^{m\times m}$ be symmetric positive definite matrices, and let $C = A\otimes B$ be their tensor (Kronecker) product. I want to compute the Cholesky decomposition of $C$. Suppose I have available the Cholesky decompositions of $A$ and $B$. Is there a way to exploit this information when computing the decomposition of $C$?

There is 1 answer below.
A Cholesky decomposition of a matrix $M$ is a factorization of the form $M=LL^T$, where $L$ is lower triangular and ${}^T$ denotes the transpose.

Given Cholesky decompositions $A=\alpha\alpha^T$ and $B=\beta\beta^T$, the mixed-product property of the Kronecker product yields
$$A\otimes B = (\alpha\alpha^T)\otimes(\beta\beta^T) = (\alpha\otimes\beta)(\alpha^T\otimes\beta^T) = (\alpha\otimes\beta)(\alpha\otimes\beta)^T.$$
To see that this actually is a Cholesky decomposition, it remains to check that $L=\alpha\otimes\beta$ is lower triangular. The entry of $L$ in row $m(i-1)+j$ and column $m(r-1)+s$ is $\alpha_{ir}\beta_{js}$, where $1\le i,r\le n$ and $1\le j,s\le m$. This entry lies strictly above the diagonal exactly when $m(i-1)+j < m(r-1)+s$, which forces either $i<r$, or $i=r$ and $j<s$. In the first case $\alpha_{ir}=0$ and in the second $\beta_{js}=0$, since $\alpha$ and $\beta$ are lower triangular; so every entry of $L$ above the diagonal vanishes. Moreover, the diagonal entries of $L$ are the products $\alpha_{ii}\beta_{jj}>0$, so $\alpha\otimes\beta$ is in fact *the* (unique) Cholesky factor of $A\otimes B$ with positive diagonal.
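The identity is easy to check numerically. Here is a small sketch in NumPy (the specific sizes and the random-SPD construction are just for illustration):

```python
import numpy as np

rng = np.random.default_rng(0)

def random_spd(n):
    """A random symmetric positive definite matrix: M M^T plus a diagonal shift."""
    M = rng.standard_normal((n, n))
    return M @ M.T + n * np.eye(n)

A = random_spd(3)   # n x n, symmetric positive definite
B = random_spd(2)   # m x m, symmetric positive definite

alpha = np.linalg.cholesky(A)   # A = alpha @ alpha.T, alpha lower triangular
beta = np.linalg.cholesky(B)    # B = beta @ beta.T,  beta  lower triangular

L = np.kron(alpha, beta)        # proposed Cholesky factor of A ⊗ B

# L is lower triangular and reproduces A ⊗ B ...
assert np.allclose(L, np.tril(L))
assert np.allclose(L @ L.T, np.kron(A, B))

# ... and it matches the factor computed directly, since the Cholesky
# factor with positive diagonal is unique.
assert np.allclose(L, np.linalg.cholesky(np.kron(A, B)))
print("kron(chol(A), chol(B)) == chol(kron(A, B))")
```

Note the cost implication: factoring $A\otimes B$ directly takes $O((nm)^3)$ work, while forming $\alpha\otimes\beta$ from the two small factors takes only $O(n^3 + m^3 + n^2m^2)$.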