Unitarily invariant matrix norms that are symmetric under singular value inversion (taking the Moore–Penrose inverse)


Let $\sigma_1(A) \geq \sigma_2(A) \geq \cdots \geq \sigma_{\min\{n,m\}}(A) \geq 0$ be the singular values of $A \in \mathbb{F}^{n \times m}$ (where $\mathbb{F}$ is either $\mathbb{R}$ or $\mathbb{C}$), and let $$\| \cdot \| : \mathbb{F}^{n \times m} \to \mathbb{R}$$ be a unitarily invariant norm. Any unitarily invariant norm can be written as a function of the singular values of $A$ [*]. So there exists some function $$f: \mathbb{R}^{\min\{n,m\}} \to \mathbb{R},$$ such that for any $A \in \mathbb{F}^{n \times m}$ we have $$ \| A \| = f(\sigma_1(A), \ldots, \sigma_{\min\{n,m\}}(A)). $$ Next, we call a unitarily invariant norm $\| \cdot \|$ symmetric under singular value inversion if the norm does not change under the transformation $$ \sigma_i(A) \longleftrightarrow \frac{1}{\sigma_{\min\{n,m\}+1-i}(A)}, \qquad \sigma_i(A) \neq 0, $$ i.e. each nonzero singular value is inverted and the descending order is restored by reversing the list. So if $f$ is the function that realizes the norm, we have:

$$ f\bigl(\sigma_1(A), \ldots, \sigma_{\min\{n,m\}}(A)\bigr) = f\bigl(\sigma_{\min\{n,m\}}(A)^{-1}, \ldots, \sigma_1(A)^{-1}\bigr).$$
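As a quick numerical sanity check of the inversion map itself (a sketch, not part of the question): for a full-rank matrix, the singular values of the Moore–Penrose pseudoinverse are exactly the reciprocals of those of $A$, listed in reverse (descending) order.

```python
import numpy as np

rng = np.random.default_rng(0)
A = rng.standard_normal((4, 3))  # full column rank almost surely

# Singular values come back in descending order: sigma_1 >= ... >= sigma_3
s = np.linalg.svd(A, compute_uv=False)
s_pinv = np.linalg.svd(np.linalg.pinv(A), compute_uv=False)

# sigma_i(A^+) = 1 / sigma_{min(n,m)+1-i}(A) for nonzero singular values
assert np.allclose(s_pinv, 1.0 / s[::-1])
```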


Using the Moore–Penrose inverse, the symmetry under singular value inversion can be stated equivalently as

$$ \|A\|= \|A^\dagger\| $$
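To illustrate the property $\|A\| = \|A^\dagger\|$ numerically, here is a sketch using a function of the singular values that is symmetric under inversion, namely $f(\sigma) = \sum_i |\log \sigma_i|$ over the nonzero singular values. Note this $f$ is only an example of an inversion-symmetric function; it is not homogeneous, so it is not itself a norm.

```python
import numpy as np

def f(svals, tol=1e-12):
    # Inversion-symmetric function of the singular values:
    # |log(1/sigma)| = |log(sigma)|, so f is unchanged by sigma -> 1/sigma.
    # (Example only -- f is not homogeneous, hence not an actual norm.)
    s = svals[svals > tol]  # keep only the nonzero singular values
    return np.sum(np.abs(np.log(s)))

rng = np.random.default_rng(1)
A = rng.standard_normal((3, 3))

svals = lambda M: np.linalg.svd(M, compute_uv=False)
assert np.isclose(f(svals(A)), f(svals(np.linalg.pinv(A))))
```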


[*] Given a matrix $A$ with SVD $A = U \Lambda V^*$, unitary invariance gives $\|A\| = \|U^* A V\| = \|\Lambda\|$, so the norm depends only on the singular values in $\Lambda$.
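The footnote's argument can be checked with the Frobenius norm as a concrete unitarily invariant example: multiplying by the SVD factors strips $A$ down to $\Lambda$ without changing the norm, which then equals $\sqrt{\sum_i \sigma_i^2}$.

```python
import numpy as np

rng = np.random.default_rng(2)
A = rng.standard_normal((4, 3))
U, s, Vh = np.linalg.svd(A)  # full SVD: A = U @ Lam @ Vh

# Build the 4x3 rectangular singular value matrix Lambda
Lam = np.zeros_like(A)
np.fill_diagonal(Lam, s)

# ||A||_F = ||U^* A V||_F = ||Lambda||_F = sqrt(sum sigma_i^2)
assert np.isclose(np.linalg.norm(A), np.linalg.norm(U.conj().T @ A @ Vh.conj().T))
assert np.isclose(np.linalg.norm(A), np.sqrt(np.sum(s**2)))
```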