Consider the mapping $T:\mathbb{R}^n\to\mathbb{R}^n$ defined by $T(\vec{x})=A\vec{x}$ where $A$ is an $n\times n$ matrix. Find the necessary and sufficient conditions on $A$ such that $\|T(\vec{x})\|=|\det A|\cdot\|\vec{x}\|$ for all $\vec{x}$. Here $\|\cdot\|$ denotes the Euclidean norm.
Hint: If $\det A=0$, then everything is obvious. Otherwise consider $S=|\det A|^{-1}T$. It is norm preserving, hence it is an orthogonal (unitary) operator, so the following equalities hold:
$$SS^T=S^TS=I$$
and you can rewrite them in terms of $T$.
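The hint can be checked numerically. A minimal sketch with NumPy, assuming (for illustration) an $A$ that satisfies the norm identity; any rotation works, since then $|\det A| = 1$:

```python
import numpy as np

# Illustrative choice: a 2x2 rotation satisfies ||A x|| = |det A| ||x||
# with |det A| = 1.  (The angle 0.7 is arbitrary.)
theta = 0.7
A = np.array([[np.cos(theta), -np.sin(theta)],
              [np.sin(theta),  np.cos(theta)]])

# The hint's operator S = |det A|^{-1} A
S = A / abs(np.linalg.det(A))

# S is norm preserving, hence S S^T = S^T S = I
print(np.allclose(S @ S.T, np.eye(2)))  # True
print(np.allclose(S.T @ S, np.eye(2)))  # True
```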
The condition is satisfied iff $T=0$ (i.e., $A=0$) or $T$ is unitary (i.e., $A$ is unitary). Assume here that $n>1$; for $n=1$ every $A$ satisfies the condition.
If $T=0$ or $T$ is unitary, it is obvious that the condition is satisfied.
If $\det A = 0$, then clearly $T=0$, so suppose $\det A \neq 0$. Then $U = \frac{1}{\det A} A$ is a unitary operator. If $\lambda_1,\dots,\lambda_n$ are the eigenvalues of $A$, then the eigenvalues of $U$ are $\frac{\lambda_k}{\lambda_1 \cdots \lambda_n}$, and all have modulus $1$; that is, $|\lambda_k| = |\lambda_1 \cdots \lambda_n| = |\det A|$ for every $k$. Taking the product over $k$ gives $|\det A| = |\det A|^n$, hence $|\det A| = 1$ when $n > 1$. It follows that $A = (\det A) U$ is unitary.
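The eigenvalue claim can be spot-checked numerically. A sketch, assuming (for illustration) an orthogonal $A$ obtained from a QR factorization of a random matrix:

```python
import numpy as np

rng = np.random.default_rng(2)
# Orthogonal A for illustration (QR of a random matrix yields orthogonal Q)
A, _ = np.linalg.qr(rng.standard_normal((3, 3)))

# The answer's U = A / det(A); here det(A) = +/- 1, so U is also orthogonal
U = A / np.linalg.det(A)

# All eigenvalues of U have modulus 1
print(np.allclose(np.abs(np.linalg.eigvals(U)), 1.0))  # True
```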
Restating the (Euclidean) norm identity:
$$ \|Ax\|^2 = \det(A)^2\, \|x\|^2 \qquad (*) $$
for all $x \in \mathbb{R}^n$. Since $\det(A)^2$ is constant (independent of $x$), familiarity with Rayleigh quotients might lead one to conclude that $A^T A$ has eigenvalue $\det(A)^2$ of (geometric) multiplicity $n$.
But we can prove $A^T A = \det(A)^2 I$ with a couple of brief computations.
Let $e_i$ be the standard basis vector of $\mathbb{R}^n$ whose $i^{\text{th}}$ component is $1$. The diagonal entry $(A^T A)_{ii} = e_i^T (A^T A)e_i$ is then $\det(A)^2$:
$$ \|Ae_i\|^2 = \det(A)^2 \|e_i\|^2 = \det(A)^2 $$
It remains to show the off-diagonal entries of $A^T A$ are zero. Suppose $i \neq j$. On one hand:
$$ \|A(e_i+e_j)\|^2 = \det(A)^2 \|e_i+e_j\|^2 = 2 \det(A)^2 $$
On the other hand, expanding the inner-product form (using the symmetry of $A^T A$):
$$ (A(e_i+e_j))^T A(e_i+e_j) = e_i^T(A^T A)e_i + 2\, e_i^T(A^T A)e_j + e_j^T(A^T A)e_j $$
so
$$ \|A(e_i+e_j)\|^2 = 2 \det(A)^2 + 2\, (A^T A)_{ij} $$
implying that $(A^T A)_{ij}$ is zero. $\; \therefore \; A^T A = \det(A)^2 I$.
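The expansion step holds for any matrix, not just one satisfying $(*)$, so it can be verified directly. A minimal sketch with an arbitrary random $A$ (the size and indices are illustrative choices):

```python
import numpy as np

rng = np.random.default_rng(0)
A = rng.standard_normal((4, 4))  # arbitrary matrix; the expansion needs no norm identity
G = A.T @ A

i, j = 1, 3
e = np.eye(4)
lhs = np.linalg.norm(A @ (e[i] + e[j]))**2
rhs = G[i, i] + 2 * G[i, j] + G[j, j]  # the inner-product expansion above
print(np.isclose(lhs, rhs))  # True
```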
Taking determinants of both sides:
$$ \det(A)^2 = \det(A)^{2n} $$
If dimension $n=1$ this doesn't place any restriction on $A$, and indeed every "linear transformation" $T: \mathbb{R} \to \mathbb{R}$ satisfies the norm identity $(*)$.
But if $n \gt 1$ this implies $\det(A)^2 = 0$ or $\det(A)^2 = 1$, i.e., that $A$ is zero or orthogonal (since then $A^T A = I$).
The converse is easy to see: the norm identity $(*)$ holds if $A$ is zero or orthogonal.
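Both the converse and the $n=1$ exception can be checked numerically. A sketch, where the orthogonal matrix comes from a QR factorization and the scalars are arbitrary illustrative values:

```python
import numpy as np

rng = np.random.default_rng(1)

# Orthogonal case: ||Qx|| = ||x|| = |det Q| ||x||, since |det Q| = 1
Q, _ = np.linalg.qr(rng.standard_normal((3, 3)))
x = rng.standard_normal(3)
print(np.isclose(np.linalg.norm(Q @ x),
                 abs(np.linalg.det(Q)) * np.linalg.norm(x)))  # True

# Zero case: both sides vanish
Z = np.zeros((3, 3))
print(np.isclose(np.linalg.norm(Z @ x),
                 abs(np.linalg.det(Z)) * np.linalg.norm(x)))  # True

# n = 1: every linear map works, since ||a x|| = |a| ||x|| = |det a| ||x||
a = np.array([[2.5]])
x1 = np.array([1.7])
print(np.isclose(np.linalg.norm(a @ x1),
                 abs(np.linalg.det(a)) * np.linalg.norm(x1)))  # True
```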