Orthogonal matrices and matrix norms

I have seen some disagreement online and was wondering if anyone could clarify for me:

If $X$ is an arbitrary $n \times n$ matrix and $A$ is an arbitrary orthogonal $n \times n$ matrix, is it true that $$\| AX \|_p = \|X\|_p$$

for all $p \in \mathbb{Z}_+ \cup \{\infty\}$? Here $\|\cdot\|_p$ is the matrix $p$-norm defined as

$$\| A \|_p = \sup_{x \neq 0} \frac{|Ax|_p}{|x|_p}$$

where $|\cdot|_p$ is the vector $p$-norm.
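One way to get a feel for the answer is a quick numerical experiment. Here is a minimal NumPy sketch (the orthogonal matrix is taken as the $Q$ factor of a QR factorization; the seed and dimension are arbitrary choices):

```python
import numpy as np

rng = np.random.default_rng(0)
n = 4

# Random orthogonal A: the Q factor of the QR factorization of a Gaussian matrix.
A, _ = np.linalg.qr(rng.standard_normal((n, n)))
X = rng.standard_normal((n, n))

for p in (1, 2, np.inf):
    print(p, np.linalg.norm(A @ X, p), np.linalg.norm(X, p))
# Typically only p = 2 produces matching values.
```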

Best answer:

Proposition: $\lVert UAW\rVert = \lVert A\rVert$ if $U$ and $W$ are orthogonal or unitary, both for the Frobenius norm and for the operator norm induced by the vector norm $\lVert\cdot\rVert_2$.

Let's start with the Frobenius norm. Using the trace characterization $\lVert A\rVert_F^2 = \operatorname{tr}(AA^T)$,

\begin{align*} \lVert UAW\rVert_F^2 &= \operatorname{tr}\bigl(UAW(UAW)^T\bigr) = \operatorname{tr}(UAWW^TA^TU^T)\\ &= \operatorname{tr}(UAA^TU^T) = \operatorname{tr}(AA^TU^TU) = \operatorname{tr}(AA^T)\\ &= \lVert A\rVert_F^2, \end{align*}

using $WW^T = U^TU = I$ and the cyclic property of the trace.
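A quick numerical spot-check of this identity, sketched in NumPy (random orthogonal factors again taken as $Q$ factors of QR factorizations):

```python
import numpy as np

rng = np.random.default_rng(1)
n = 3
U, _ = np.linalg.qr(rng.standard_normal((n, n)))
W, _ = np.linalg.qr(rng.standard_normal((n, n)))
A = rng.standard_normal((n, n))

# tr(UAW (UAW)^T) collapses to tr(A A^T) = ||A||_F^2 since W W^T = U^T U = I.
lhs = np.trace(U @ A @ W @ W.T @ A.T @ U.T)
print(np.sqrt(lhs), np.linalg.norm(A, 'fro'))  # agree up to rounding
```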

Next, for the operator norm induced by the $2$-norm:

\begin{align*} \lVert UAW\rVert_2 &= \max_{x\neq 0}\frac{\lVert UAWx \rVert_2}{\lVert x\rVert_2} = \max_{x\neq 0}\frac{\sqrt{x^TW^TA^TU^TUAWx}}{\sqrt{x^Tx}}\\ &= \max_{z\neq 0}\frac{\sqrt{z^TA^TAz}}{\sqrt{z^Tz}} = \max_{z\neq 0}\frac{\lVert Az\rVert_2}{\lVert z\rVert_2} = \lVert A\rVert_2, \end{align*} where we used $U^TU = I$ and the substitution $z = Wx$, which ranges over all nonzero vectors since $W$ is invertible, with $x^Tx = x^TW^TWx = z^Tz$.
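The $2$-norm case can be spot-checked the same way; the sketch below also prints the singular values, since the induced $2$-norm is the largest singular value and the whole singular spectrum is unchanged by orthogonal factors on either side:

```python
import numpy as np

rng = np.random.default_rng(2)
n = 3
U, _ = np.linalg.qr(rng.standard_normal((n, n)))
W, _ = np.linalg.qr(rng.standard_normal((n, n)))
A = rng.standard_normal((n, n))

# Singular values of UAW and of A match, hence so do the 2-norms.
print(np.linalg.svd(U @ A @ W, compute_uv=False))
print(np.linalg.svd(A, compute_uv=False))          # same values up to rounding
print(np.linalg.norm(U @ A @ W, 2), np.linalg.norm(A, 2))
```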

For an example showing that the $\infty$- and $1$-norms do not work, take $U$ to be the rotation matrix that rotates counterclockwise by $60$ degrees, $A=\begin{bmatrix} 1 & 2 \\ 3 & 4\end{bmatrix},$ and $W=I$. Then $$UAW=\begin{bmatrix} \frac{1}{2}-\frac{3\sqrt{3}}{2} & 1-2\sqrt{3} \\ \frac{3}{2}+\frac{\sqrt{3}}{2} & 2+\sqrt{3}\end{bmatrix},$$ and one checks that $\lVert UAW\rVert_1 \approx 6.196 \neq 6 = \lVert A\rVert_1$ and $\lVert UAW\rVert_\infty \approx 6.098 \neq 7 = \lVert A\rVert_\infty$.
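A short NumPy sketch of these computations:

```python
import numpy as np

theta = np.pi / 3  # rotation by 60 degrees
U = np.array([[np.cos(theta), -np.sin(theta)],
              [np.sin(theta),  np.cos(theta)]])
A = np.array([[1.0, 2.0],
              [3.0, 4.0]])

for p in (1, np.inf):
    print(p, np.linalg.norm(U @ A, p), np.linalg.norm(A, p))
# 1-norm: ~6.196 vs 6;  inf-norm: ~6.098 vs 7.
```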

Second answer:

Why would that be true for any matrices $A, X$ in general? It is true that

$$ \| AX \|_{p} \leq \|A\|_{p} \|X\|_{p}. $$

When $p=2$ and $A$ is orthogonal, it is also true that

$$ \| AX \|_{2} = \|X\|_{2}; $$

the $2$-norm is invariant under orthogonal matrices. If you're looking for a proof, I'm pretty sure this has been asked before.

Here is a counterexample. Suppose that $A=X$ and

$$ A = \begin{bmatrix} \cos(\theta) & -\sin(\theta) \\ \sin(\theta) & \cos(\theta) \end{bmatrix}$$

and let $\theta = \frac{\pi}{4}$, so that $$ A = \begin{bmatrix} \frac{1}{\sqrt{2}} & -\frac{1}{\sqrt{2}} \\ \frac{1}{\sqrt{2}} & \frac{1}{\sqrt{2}} \end{bmatrix}$$

$$ A \cdot A = \begin{bmatrix} 0 & -1 \\ 1 & 0 \end{bmatrix}$$

The $1$-norm is the maximum absolute column sum,

$$ \|A\|_{1} = \max_{1 \leq j \leq n} \sum_{i=1}^{n} |a_{ij}|, $$

which gives

$$ \|A\|_{1} = \frac{2}{\sqrt{2}} = \sqrt{2}, \qquad \|A \cdot A\|_{1} = 1, $$

so $\|AX\|_{1} \neq \|X\|_{1}$ here even though $A$ is orthogonal.
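A quick NumPy check of these two values (here `np.linalg.norm(M, 1)` is the max column sum):

```python
import numpy as np

# Rotation by pi/4, as in the counterexample above.
A = np.array([[1.0, -1.0],
              [1.0,  1.0]]) / np.sqrt(2)

print(np.linalg.norm(A, 1))      # sqrt(2) = 1.4142...
print(np.linalg.norm(A @ A, 1))  # 1.0
```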