System of equations - solution for a matrix


Can I find a $C$, explicitly, for given $A$ and $B$, satisfying $$ \sqrt{\pmb{t}^TAA^{T}\pmb{t}} + \sqrt{\pmb{t}^TBB^{T}\pmb{t}} = \sqrt{\pmb{t}^TCC^{T}\pmb{t}} $$ for all possible choices of $\pmb{t}$?

One can try plugging in particular choices of $\pmb{t}$ to pin down $C$, but I could not obtain $C$ explicitly in terms of $A$ and $B$ this way. Any help will be greatly appreciated.

Note: $A$, $B$, and $C$ are real square matrices and $A^{T}$ indicates the transpose of the matrix $A$. Furthermore $\pmb{t}$ denotes a conformable column-vector.




Let $M = CC^T$ and let $f(\pmb t) = \sqrt{\pmb{t}^TAA^{T}\pmb{t}} + \sqrt{\pmb{t}^TBB^{T}\pmb{t}}$. Our matrix $M$ is a symmetric matrix that is meant to satisfy $$ f(\pmb t)^2 = \pmb{t}^T M \pmb{t} $$ for all vectors $\pmb{t}$. Let $e_1,\dots,e_n$ denote the standard basis vectors. We note that $$ e_i^T M e_i = m_{ii},\qquad (e_i + e_j)^TM(e_i + e_j) = m_{ii} + 2m_{ij} + m_{jj}. $$ Thus, if some satisfactory $M$ exists, it must satisfy $$ m_{ii} = f(e_i)^2, \qquad m_{ij} = \frac 12 \left[f(e_i + e_j)^2 - f(e_i)^2 - f(e_j)^2\right] \quad (i \neq j). $$

You should find it straightforward to write these expressions of $f$ explicitly in terms of the entries of $A$ and $B$. For instance, since $e_i^TAA^Te_i$ is the squared Euclidean norm of the $i$-th row of $A$, $$ m_{ii} = f(e_i)^2 = \Big(\sqrt{\textstyle\sum_k a_{ik}^2} + \sqrt{\textstyle\sum_k b_{ik}^2}\Big)^2; $$ the expression for $m_{ij}$ can be computed similarly.
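A short numerical sketch of this construction (my own, not part of the answer): assemble the candidate $M$ entrywise from $f$ using the polarization identities above.

```python
import numpy as np

def f(t, A, B):
    """f(t) = sqrt(t^T A A^T t) + sqrt(t^T B B^T t)."""
    return np.sqrt(t @ A @ A.T @ t) + np.sqrt(t @ B @ B.T @ t)

def candidate_M(A, B):
    """The only symmetric M that could satisfy t^T M t = f(t)^2,
    built from the polarization identities in the answer."""
    n = A.shape[0]
    I = np.eye(n)
    M = np.empty((n, n))
    for i in range(n):
        M[i, i] = f(I[i], A, B) ** 2
    for i in range(n):
        for j in range(i + 1, n):
            M[i, j] = M[j, i] = 0.5 * (
                f(I[i] + I[j], A, B) ** 2 - M[i, i] - M[j, j]
            )
    return M
```

For example, with $B = 2A$ we have $f(\pmb t) = 3\sqrt{\pmb t^TAA^T\pmb t}$, and the construction returns $M = 9\,AA^T$, matching the exact solution $C = 3A$.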

Once you have this candidate $M$, it remains to be checked whether $\pmb{t}^TM \pmb{t} = f(\pmb{t})^2$ holds for all $\pmb{t}$: the construction only guarantees agreement on the probe vectors $e_i$ and $e_i + e_j$, and in general the identity need not hold elsewhere.
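A concrete toy illustration of this caveat (my own example, not from the answer): with $A = I_2$ and $B = \operatorname{diag}(1,2)$, the candidate $M$ matches $f^2$ on the probe vectors by construction, yet fails at $\pmb t = (1,-1)^T$.

```python
import numpy as np

def f(t, A, B):
    """f(t) = sqrt(t^T A A^T t) + sqrt(t^T B B^T t)."""
    return np.sqrt(t @ A @ A.T @ t) + np.sqrt(t @ B @ B.T @ t)

A = np.eye(2)
B = np.diag([1.0, 2.0])
e1, e2 = np.eye(2)

# Candidate M from the polarization identities.
m11 = f(e1, A, B) ** 2                        # (1 + 1)^2 = 4
m22 = f(e2, A, B) ** 2                        # (1 + 2)^2 = 9
m12 = 0.5 * (f(e1 + e2, A, B) ** 2 - m11 - m22)
M = np.array([[m11, m12], [m12, m22]])

# M reproduces f^2 on the probe vectors by construction...
assert np.isclose((e1 + e2) @ M @ (e1 + e2), f(e1 + e2, A, B) ** 2)
# ...but fails at t = (1, -1), so no C exists for this pair (A, B).
t = np.array([1.0, -1.0])
assert not np.isclose(t @ M @ t, f(t, A, B) ** 2)
```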

Provided this $M$ is positive semidefinite, $C$ can then be obtained as the lower-triangular factor of the Cholesky decomposition $M = CC^T$.
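A minimal sketch of this final step, assuming a valid positive definite $M$ is already in hand (here $B = 2A$, so $M = 9\,AA^T$):

```python
import numpy as np

# A case where a solution exists: B = 2A gives f(t) = 3*sqrt(t^T A A^T t),
# hence M = 9 A A^T.
A = np.array([[2.0, 1.0], [0.0, 3.0]])
M = 9 * A @ A.T

# Lower-triangular C with C C^T = M via Cholesky
# (np.linalg.cholesky requires M positive definite).
C = np.linalg.cholesky(M)
assert np.allclose(C @ C.T, M)
assert np.allclose(C, np.tril(C))
```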


Let $|A|=\sqrt{AA^T}$ denote the unique positive semidefinite square root of $AA^T$, and define $|B|$ and $|C|$ analogously. The equation in question is then equivalent to $\||A|u\|+\||B|u\|=\||C|u\|$ for all $u$. Squaring both sides, we get $$ \||C|u\|^2-\||A|u\|^2-\||B|u\|^2 = 2\||A|u\|\||B|u\| \ge 0.\tag{1} $$ The LHS of $(1)$ is a quadratic form in $u$. Therefore, the original equation is solvable if and only if $$ \||A|u\|\||B|u\| = \|Pu\|^2\tag{2} $$ for some positive semidefinite matrix $P$. Note that if $(2)$ holds, we must have $\ker|A|, \ker|B|\subseteq\ker P$. But then for any $u\in\ker|A|$ and $v\in\ker|B|$, we would have $$ \||A|v\|\||B|u\| = \||A|(u+v)\|\||B|(u+v)\| = \|P(u+v)\|^2 = 0. $$ Therefore one of the kernels of $|A|$ and $|B|$ must be a subset of the other.

Suppose $\ker|B|\subseteq\ker|A|$. Let $V=\ker|B|^\perp$. Then $|B|$ is positive definite on $V$ and $|A|$ is positive semidefinite on $V$. Thus $(2)$ holds if and only if there exists a positive semidefinite matrix $S$ such that $$ \|Hv\|\|v\| = \|Sv\|^2\tag{3} $$ for every $v\in V$, where $H=\sqrt{|B|^{-1}|A|^2|B|^{-1}}$ and $S=\sqrt{|B|^{-1}P^2|B|^{-1}}$ on $V$.

When $v$ is a unit eigenvector corresponding to the largest eigenvalue of the restriction of $S$ to $V$, the RHS of $(3)$ is maximised over unit vectors in $V$. Hence the LHS must be maximised too, and in turn $H$ and $S$ share a common eigenvector for their largest eigenvalues. Applying a similar argument on $\left(\ker|B|+\operatorname{span}\{v\}\right)^\perp$, we see that $H$ and $S$ also share a common eigenvector for their respective second largest eigenvalues. Continuing in this way, we see that $H$ and $S$ share a common orthonormal eigenbasis.

Let $\lambda_1\ge\lambda_2\ge\cdots\ge\lambda_k$ be the eigenvalues of the restriction of $S$ to $V$. Equation $(3)$ implies that $Hv=\lambda_i^2v$ whenever $(\lambda_i,v)$ is an eigenpair of $S$ on $V$. Now, if $u$ and $w$ are two orthonormal eigenvectors of $S$ corresponding to the eigenvalues $\lambda_i$ and $\lambda_j$ respectively, then by putting $v=\cos(t)u+\sin(t)w$ in $(3)$, we get $$ \sqrt{\lambda_i^4\cos^2(t)+\lambda_j^4\sin^2(t)}=\lambda_i^2\cos^2(t)+\lambda_j^2\sin^2(t)\tag{4} $$ for every $t$. Squaring both sides of $(4)$ and simplifying gives $(\lambda_i^2-\lambda_j^2)^2\cos^2(t)\sin^2(t)=0$ for all $t$, so $\lambda_i=\lambda_j$. Hence $H$ is a scalar multiple of the identity matrix on $V$. But that means $|A|^2$ is a scalar multiple of $|B|^2$ on $V$. In turn, $|A|^2$ is a scalar multiple of $|B|^2$ on the whole Euclidean space, because $|A|=|B|=0$ on $V^\perp$.

In short, the original equation is solvable for $C$ only in the trivial case that one of $AA^T$ and $BB^T$ is a scalar multiple of the other.
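A quick numerical sanity check of the trivial case (my own sketch): if $BB^T = c^2\,AA^T$, then $f(\pmb t) = (1+c)\sqrt{\pmb t^TAA^T\pmb t}$, so $C = (1+c)A$ solves the equation exactly.

```python
import numpy as np

rng = np.random.default_rng(1)
A = rng.standard_normal((3, 3))
c = 2.0
B = c * A          # B B^T = c^2 A A^T: the trivial, solvable case
C = (1 + c) * A    # then C C^T = (1 + c)^2 A A^T

# Verify the original identity on random vectors t.
for t in rng.standard_normal((200, 3)):
    lhs = np.sqrt(t @ A @ A.T @ t) + np.sqrt(t @ B @ B.T @ t)
    rhs = np.sqrt(t @ C @ C.T @ t)
    assert np.isclose(lhs, rhs)
```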