the positive square root of $I$?


Find operators $T \colon \mathbb{R}^2 \to \mathbb{R}^2$ such that $T^2 = I$. Which one is the positive square root of $I$? Is there an operator $T$ such that $T(T(x)) = I(x)$ for all $x$, where $I(x) = x$?


There are 3 best solutions below


Try $T(x,y) = (-x,-y)$, i.e. $T = -I$, the reflection through the origin; more generally, any reflection satisfies $T^2 = I$.
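A quick numerical sanity check of this map, sketched with NumPy (the sample vector below is an arbitrary choice):

```python
import numpy as np

# T(x, y) = (-x, -y) as a matrix: T = -I
T = -np.eye(2)

# Applying T twice returns the identity: T @ T == I
assert np.allclose(T @ T, np.eye(2))

# Spot-check on a sample vector
v = np.array([3.0, -7.0])
assert np.allclose(T @ (T @ v), v)
```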


There are plenty of solutions to $T\circ T=I$ for linear operators $T$ on a real vector space of dimension $n$. You always have $T=I$ and $T=-I$ as solutions, as well as any diagonalisable operator whose eigenvalues are $1$ and $-1$ (and these are in fact all the solutions). For $n=2$ the latter are reflections with respect to some line through the origin, along the direction of another line through the origin (not necessarily perpendicular to the first).
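To illustrate, conjugating $\mathrm{diag}(1,-1)$ by any invertible matrix produces such a (generally oblique) reflection; a sketch with NumPy, where the change-of-basis matrix $P$ below is an arbitrary choice:

```python
import numpy as np

# Any diagonalisable operator with eigenvalues 1 and -1 squares to I.
D = np.diag([1.0, -1.0])

# An arbitrary invertible change of basis (chosen for illustration)
P = np.array([[2.0, 1.0],
              [1.0, 1.0]])

# T = P D P^{-1} is an oblique reflection, generally not symmetric
T = P @ D @ np.linalg.inv(P)
assert np.allclose(T @ T, np.eye(2))
```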

There is no general notion of positivity defined for all linear operators. There is one for symmetric operators (with respect to a given inner product), and that notion clearly excludes the eigenvalue $-1$, so the only remaining (symmetric) positive solution is $T=I$.
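A small sketch of why the eigenvalue $-1$ rules out positivity, using the reflection about the $x$-axis as an example of a symmetric root of $I$ other than $I$ itself:

```python
import numpy as np

# A symmetric root of I other than I: reflection about the x-axis
T = np.array([[1.0, 0.0],
              [0.0, -1.0]])
assert np.allclose(T @ T, np.eye(2))

# Its eigenvalues are 1 and -1; the eigenvalue -1 rules out positivity
eigvals = np.linalg.eigvalsh(T)
assert np.isclose(min(eigvals), -1.0)

# Witness vector along the -1 eigenspace: x^T T x < 0
x = np.array([0.0, 1.0])
assert x @ T @ x < 0
```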


In this answer, I work in terms of $2 \times 2$ real matrices.

Set

$T = \begin{bmatrix} t_{11} & t_{12} \\ t_{21} & t_{22} \end{bmatrix}; \tag{1}$

then

$T^2 = \begin{bmatrix} t_{11} & t_{12} \\ t_{21} & t_{22} \end{bmatrix} \begin{bmatrix} t_{11} & t_{12} \\ t_{21} & t_{22} \end{bmatrix}$ $ = \begin{bmatrix} t_{11}^2 + t_{12}t_{21} & (t_{11} + t_{22}) t_{12} \\ (t_{11} + t_{22}) t_{21} & t_{22}^2 + t_{12}t_{21} \end{bmatrix}; \tag{2}$

setting

$T^2 = I, \tag{3}$

we see that

$t_{11}^2 + t_{12}t_{21} = t_{22}^2 + t_{12}t_{21} = 1 \tag{4}$

and

$ (t_{11} + t_{22}) t_{12} =  (t_{11} + t_{22}) t_{21} = 0. \tag{5}$

It appears from (5) that it might be straightforward to classify solutions to $T^2 = I$ according to $\text{Tr}(T) = t_{11} + t_{22}$; indeed, if $\text{Tr}(T) \ne 0$, then from (5) we have

$t_{12} = t_{21} = 0, \tag{6}$

leaving us by (4) with

$t_{11}^2 = t_{22}^2 = 1. \tag{7}$

From (7),

$t_{11}, t_{22} \in \{1, -1 \}, \tag{8}$

and since $\text{Tr}(T) \ne 0$, we must have

$t_{11} = t_{22} = \pm 1 \tag{9}$

in this case; thus

$T = \pm I. \tag{10}$

In the event that $\text{Tr}(T) = 0$, we have $t_{22} = - t_{11}$; thus we may write

$-t_{22} = t_{11} = \alpha \in \Bbb R; \tag{11}$

this in turn implies, using (4), that

$t_{12}t_{21} = 1 - \alpha^2; \tag{12}$

if $\alpha^2 \ne 1$, (12) shows that $t_{12} \ne 0 \ne t_{21}$,  so if we set

$t_{12} = \beta \in \Bbb R, \;\; \beta \ne 0, \tag{13}$

we have

$t_{21} = \dfrac{1 - \alpha^2}{\beta}; \tag{14}$

now the matrix $T$ looks like this:

$T = \begin{bmatrix} \alpha & \beta \\ \dfrac{1 - \alpha^2}{\beta} & -\alpha \end{bmatrix}; \tag{15}$

when $\alpha^2 = 1$, at least one of $t_{12}$, $t_{21}$ must vanish, leaving us to freely select the other.  In this case, $T$ may be written in one of the forms

$T = \begin{bmatrix} \alpha & \beta \\ 0 & -\alpha \end{bmatrix} \tag{16}$

or

$T = \begin{bmatrix} \alpha & 0 \\ \beta & -\alpha \end{bmatrix}, \tag{17}$

where $\alpha = \pm 1$ and $\beta \in \Bbb R$ is arbitrary.  It is easily checked that $T^2 = I$ for matrices of the form (15)-(17) when the parameters take on the specified values.
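That check can be carried out symbolically; a sketch with SymPy, verifying that the forms (15)-(17) square to the identity for arbitrary parameter values (taking $\alpha = 1$ in (16), (17) for concreteness):

```python
import sympy as sp

a, b = sp.symbols('alpha beta', real=True)
I2 = sp.eye(2)

# Form (15): trace zero, t21 = (1 - alpha**2)/beta
T15 = sp.Matrix([[a, b],
                 [(1 - a**2) / b, -a]])
assert sp.simplify(T15**2 - I2) == sp.zeros(2, 2)

# Forms (16) and (17) with alpha = 1, beta free
T16 = sp.Matrix([[1, b], [0, -1]])
T17 = sp.Matrix([[1, 0], [b, -1]])
assert sp.simplify(T16**2 - I2) == sp.zeros(2, 2)
assert sp.simplify(T17**2 - I2) == sp.zeros(2, 2)
```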

Thus, 'twixt (10), (15)-(17), every possible $2 \times 2$ real matrix $T$ with $T^2 = I$ is parametrically presented.

The same method, viz., analysis classified by the trace, may be used to parametrically determine certain other families of $2 \times 2$ matrices whose square is constrained in some manner; for example, one can set $T^2 = -I$ or $T^2 = T$ and conduct a parallel analysis.  I know of no way to extend this technique to matrices of larger size, however; the trace per se does not occur as a factor of the entries of $T^2$ when $\text{size}(T) \ge 3$.

The notion of positivity strictly applies only to symmetric operators, but we can examine $x^T T x$ when $T$ takes one of the forms given here and see what we get.  For example, with $T$ as in (15) we find, taking $x = (x_1, x_2)^T$,

$x^T Tx = (x_1, x_2) \begin{bmatrix} \alpha & \beta \\ \dfrac{1 - \alpha^2}{\beta} & -\alpha \end{bmatrix} (x_1, x_2)^T$ $= (x_1, x_2)(\alpha x_1 + \beta x_2, \dfrac{1 - \alpha^2}{\beta} x_1 - \alpha x_2)^T$$ = \alpha x_1^2 + \beta x_1 x_2 + \dfrac{1 - \alpha^2}{\beta} x_1 x_2 - \alpha x_2^2$ $ = \alpha(x_1^2 - x_2^2) + \dfrac{1 - \alpha^2 + \beta^2}{\beta} x_1 x_2. \tag{18}$

From (18) we see that $T$ of the form (15) cannot be positive for any allowed values of our parameters $\alpha, \beta \in \Bbb R$.  For if $\alpha > 0$ we may select $x_1 = 0$, $x_2 \ne 0$ and obtain $x^T T x = -\alpha x_2^2 < 0$; for $\alpha < 0$, we reverse the roles of $x_1$, $x_2$ and set $x_1 \ne 0$, $x_2 = 0$; then $x^T T x = \alpha x_1^2 < 0$, the same result.  When $\alpha = 0$, we have

$x^T T x = \dfrac{1 + \beta^2}{\beta} x_1 x_2; \tag{19}$

since $1 + \beta^2 > 0$, choosing $x_1$, $x_2$ such that $(x_1 x_2)/\beta < 0$ yields $x^T Tx < 0$.  For $T$ of the form (16) (so that $\alpha^2 = 1$) we find

$x^T T x = (x_1, x_2) \begin{pmatrix} \alpha x_1 + \beta x_2 \\ -\alpha x_2 \end{pmatrix} = \alpha(x_1^2 - x_2^2) + \beta x_1 x_2, \tag{20}$

from which we similarly see that such a $T$ is not positive; likewise when $T$ is as in (17).  It thus follows that $T$ can only be positive when $\text{trace}(T) \ne 0$; from what we have seen, this implies $T = I$.
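A numeric sanity check of this argument, sketched with NumPy (the parameter values $\alpha = 0.5$, $\beta = 2$ below are arbitrary examples):

```python
import numpy as np

def quad_form(T, x):
    """Evaluate the quadratic form x^T T x."""
    return x @ T @ x

# A trace-zero root of I of form (15)
alpha, beta = 0.5, 2.0
T = np.array([[alpha, beta],
              [(1 - alpha**2) / beta, -alpha]])
assert np.allclose(T @ T, np.eye(2))

# alpha > 0: take x1 = 0, x2 != 0, giving x^T T x = -alpha * x2**2 < 0
x = np.array([0.0, 1.0])
assert quad_form(T, x) < 0   # so this T is not positive
```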

Finally, when may $T$ be symmetric, $T^T = T$?  If $\text{trace}(T) \ne 0$, then $T = \pm I$, which is always symmetric; when $T$ is as in (16), (17), we see that $T^T = T$ forces $\beta = 0$; for the case (15), where we recall $\alpha^2 \ne 1$, symmetry of $T$ forces

$\dfrac{1 - \alpha^2}{\beta} = \beta \tag{21}$

or

$\alpha^2 + \beta^2 = 1. \tag{22}$

(22) indicates that we may further reduce the number of parameters of $T$ to one via the trigonometric substitution

$\alpha = \cos \theta, \;\; \beta = \sin \theta, \tag{23}$

with $\theta \in [0, 2\pi)$; note that we may allow $\alpha = \cos \theta$ to take on the values $\pm 1$ in this formulation, since we are dealing with the case $t_{12} = t_{21} = \beta$; cases (16), (17) with $\beta \ne 0$ are eliminated by the assumption that $T^T = T$. As a function of $\theta$, $T$ looks like:

$T(\theta) = \begin{bmatrix} \cos \theta & \sin \theta \\ \sin \theta & -\cos \theta \end{bmatrix}; \tag{24}$

Since $T^T = T$ and $T^2 = I$, we have $T^T T = I$; $T$ is in fact an orthogonal matrix; it is easy to see that $\det(T) = -\cos^2 \theta - \sin^2 \theta = -1$; thus $T(\theta) \in O(2)$, but $T(\theta) \notin SO(2)$. We can also write such $T(\theta)$ as

$T(\theta) = \begin{bmatrix} \cos \theta & \sin \theta \\ \sin \theta & -\cos \theta \end{bmatrix} = R S(\theta), \tag{25}$

where

$R = \begin{bmatrix} 1 & 0 \\ 0 & -1 \end{bmatrix} \in O(2) \setminus SO(2) \tag{26}$

and

$S(\theta) = \begin{bmatrix} \cos \theta & \sin \theta \\ -\sin \theta & \cos \theta \end{bmatrix} \in SO(2); \tag{27}$

$S(\theta)$ is the standard rotation through an angle $\theta$ in the clockwise direction; $R$ is the reflection about the $x$-axis; thus we have characterized the symmetric $T$ such that $T^2 = I$.
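The decomposition (25)-(27) is easy to verify numerically; a sketch with NumPy at an arbitrary sample angle:

```python
import numpy as np

theta = 0.7  # arbitrary sample angle

c, s = np.cos(theta), np.sin(theta)
T = np.array([[c, s], [s, -c]])          # symmetric root of I, eq. (24)
R = np.array([[1.0, 0.0], [0.0, -1.0]])  # reflection about the x-axis, eq. (26)
S = np.array([[c, s], [-s, c]])          # clockwise rotation by theta, eq. (27)

assert np.allclose(T, R @ S)               # the factorization (25)
assert np.allclose(T @ T, np.eye(2))       # T^2 = I
assert np.isclose(np.linalg.det(T), -1.0)  # T in O(2) but not SO(2)
```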

I believe the above presents a complete classification of real $2 \times 2$ matrices $T$ with $T^2 = I$.