How do operators of operators work?


$u,v \in V$

$M,N \in O\equiv$ {Linear operators on $V$}

$A,B \in O^2\equiv$ {Linear operators on $O$}

  1. Does $O^2$ add anything new? Is it just isomorphic to $O$?

$O$ and $O^2$ are both associative algebras, but are they still associative when “combined”?

To combine them, you need $Av$ to make sense, but $A$ only acts on $O$. To fix this, note that $v=I_O v$, and define $Av\equiv A(I_O)v$. So $A$ acting on $v$ is shorthand for $A$ acting on $I_O$, which then acts on $v$.
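A minimal sketch of this convention, with $V=\mathbb R^2$, operators as $2\times 2$ matrices, and a meta-operator given as a function on matrices (the operator `scale_diag` below is an illustrative choice, not from the question):

```python
# Sketch of the convention Av := A(I_O) v, with O = 2x2 real matrices.

def mat_vec(m, v):
    # Apply a 2x2 matrix to a 2-vector.
    return [m[0][0]*v[0] + m[0][1]*v[1],
            m[1][0]*v[0] + m[1][1]*v[1]]

I2 = [[1.0, 0.0], [0.0, 1.0]]  # the identity I_O

def scale_diag(m):
    # An illustrative linear operator on O: double the (0,0) entry, zero off-diagonal.
    return [[2*m[0][0], 0.0], [0.0, m[1][1]]]

def apply_meta_to_vector(A, v):
    # The proposed shorthand: A acting on v means A(I_O) acting on v.
    return mat_vec(A(I2), v)

print(apply_meta_to_vector(scale_diag, [1.0, 1.0]))  # A(I_2) = diag(2, 1), so -> [2.0, 1.0]
```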

  1. Associativity (in part) now asks whether $(AB)v=A(Bv)$, which is equivalent to asking whether $(AB)(I_O)=A(I_O)B(I_O)$.

  2. The second part asks whether $A(MNv)=A(M)Nv$: does altering the operator before applying it have the same effect as just going in order? This is equivalent to asking whether $A(M)v=A(I_O)Mv$.

  3. Are there any interesting facts about this combination beyond what I’ve thought to ask about?

  4. Is there anything new when considering combinations of $O^n\equiv $ {Linear operators on $O^{n-1}$}?
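The condition in item 1 can be tested numerically. The sketch below uses two illustrative meta-operators on $2\times 2$ matrices (not from the question): $A$ swaps the diagonal entries and kills the off-diagonal, $B$ triples the $(0,0)$ entry; for these, $(AB)(I_O)\ne A(I_O)B(I_O)$, so the condition can fail:

```python
# Check whether (AB)(I_O) = A(I_O) B(I_O) for two sample meta-operators.

def mat_mul(x, y):
    # 2x2 matrix product.
    return [[sum(x[i][k]*y[k][j] for k in range(2)) for j in range(2)] for i in range(2)]

I2 = [[1.0, 0.0], [0.0, 1.0]]

def A(m):
    # Swap the diagonal entries, zero off-diagonal.
    return [[m[1][1], 0.0], [0.0, m[0][0]]]

def B(m):
    # Triple the (0,0) entry, zero off-diagonal.
    return [[3*m[0][0], 0.0], [0.0, m[1][1]]]

AB_of_I = A(B(I2))                   # (AB)(I_O): compose in O^2, then apply to I_O
AI_times_BI = mat_mul(A(I2), B(I2))  # A(I_O) B(I_O): apply separately, multiply in O

print(AB_of_I)      # [[1.0, 0.0], [0.0, 3.0]]
print(AI_times_BI)  # [[3.0, 0.0], [0.0, 1.0]]
```

So this part of associativity depends on the particular meta-operators chosen; it is not automatic.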

Feel free to tell me if any of this is confusing and to ask for clarification.

Best answer:

Linear operators on linear operators are a common feature in Operator Theory/Operator Algebras. But you want to force these meta-operators to also act on the base space, which is not natural at all, as far as I can tell: defining the action through a fixed operator ($I_O$) is restrictive, as you can see by applying the idea to common linear maps on operators.

Some examples of common linear maps on operators:

  • Ever present in Matrix Analysis, branches of Linear Algebra, and Quantum Information is the trace: given $A\in M_n(\mathbb C)$, $$ \operatorname{Tr}(A)=\sum_j\lambda_{j}, $$ where $\lambda_1,\ldots,\lambda_n$ are the eigenvalues of $A$. Equivalently, sum the diagonal entries of $A$.
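A quick sketch of the trace as a linear map on operators, using the diagonal-sum definition (for a $2\times 2$ matrix the eigenvalue sum equals $a+d$ by the characteristic polynomial, which is exactly the diagonal sum):

```python
# The trace as a linear map from operators to scalars.

def trace(m):
    # Sum the diagonal entries.
    return sum(m[i][i] for i in range(len(m)))

M = [[1, 2], [3, 4]]
N = [[5, 6], [7, 8]]

# Linearity: Tr(2M + 3N) = 2 Tr(M) + 3 Tr(N)
lhs = trace([[2*M[i][j] + 3*N[i][j] for j in range(2)] for i in range(2)])
rhs = 2*trace(M) + 3*trace(N)
print(lhs, rhs)  # -> 49 49
```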

  • In Matrix Analysis it is common to use pinchings, where you map $$ M\longmapsto \begin{bmatrix} M_{11}\\ &\ddots\\ && M_{nn}\end{bmatrix}, $$ with zeroes off-diagonal. One can do a weighted version of this, too.

  • Very common in Operator Theory, Operator Algebras, and Quantum Information are completely positive maps: in general, maps $\varphi:S\to T$ between operator systems $S\subset B(H)$ and $T\subset B(K)$ (in particular, C$^*$-algebras) that are positive (i.e., map positive semidefinite operators to positive semidefinite operators) not only from $S$ to $T$ but also from $M_n(S)$ to $M_n(T)$ for all $n\in \mathbb N$. The notion of positivity on $M_n(S)$ is obtained by viewing $M_n(S)$ as operators on $H^n$ via the usual matrix multiplication.

Using these examples it is easy to see that your idea does not work in general. For instance, consider $V=\mathbb C^2$, $$ M=\begin{bmatrix} 1&2\\3&4\end{bmatrix},\qquad\qquad N=\begin{bmatrix} 5&6\\7&8\end{bmatrix}, $$ and $A, B$ the pinchings $$ A:\begin{bmatrix} a&b\\ c&d\end{bmatrix} \longmapsto\begin{bmatrix} 2a&0\\0&d\end{bmatrix},\qquad\qquad B:\begin{bmatrix} a&b\\ c&d\end{bmatrix} \longmapsto\begin{bmatrix} 3a&0\\0&d\end{bmatrix}. $$ Then $$ A(M)=\begin{bmatrix} 2&0\\ 0&4\end{bmatrix},\qquad\qquad A(I_2)M=\begin{bmatrix} 2&0\\0 &1\end{bmatrix}\begin{bmatrix} 1&2\\3&4\end{bmatrix}=\begin{bmatrix} 2&4\\3&4\end{bmatrix}, $$ so $A(M)\neq A(I_2)M$.
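The computation above can be checked numerically; a small sketch with the same $M$ and pinching $A$:

```python
# Verify the counterexample: for the pinching A([[a,b],[c,d]]) = [[2a,0],[0,d]],
# A(M) differs from A(I_2) M, so "apply A to I_O first" does not reproduce A.

def mat_mul(x, y):
    # 2x2 matrix product.
    return [[sum(x[i][k]*y[k][j] for k in range(2)) for j in range(2)] for i in range(2)]

def A(m):
    # Pinching: scale the (0,0) entry by 2, keep (1,1), zero off-diagonal.
    return [[2*m[0][0], 0], [0, m[1][1]]]

M = [[1, 2], [3, 4]]
I2 = [[1, 0], [0, 1]]

print(A(M))               # [[2, 0], [0, 4]]
print(mat_mul(A(I2), M))  # [[2, 4], [3, 4]]
```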