A question on tensor product of $C^{*}$ algebras

Let $A$ and $B$ be two $C^*$-algebras. Assume that every element of the minimal tensor product $A\otimes_{min} B$ is a finite linear combination of simple tensors $a\otimes b$. Can we say that $A$ or $B$ is a finite-dimensional $C^*$-algebra?

Best answer:

Yes we can. Sketch of proof: take faithful representations of $A$ and $B$ on Hilbert spaces $H_1$ and $H_2$, thus identifying $A$ and $B$ with subalgebras of $\mathcal{B}(H_1)$ and $\mathcal{B}(H_2)$, respectively. Then $A\otimes_{min}B$ is naturally a $C^*$-subalgebra of $\mathcal{B}(H_1\otimes H_2)$, and the norm on the minimal (spatial) tensor product is just the restriction to $A\otimes_{min}B$ of the operator norm on $\mathcal{B}(H_1\otimes H_2)$.
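In particular, the minimal norm is multiplicative on elementary tensors, $$ \|a\otimes b\|=\|a\|\,\|b\|,\qquad a\in A,\ b\in B, $$ which is what "converges to zero sufficiently rapidly" means in the argument below: taking, say, $\lambda_j\le 2^{-j}/(\|a_j\|\,\|b_j\|)$ makes the series $\sum_j\lambda_j a_j\otimes b_j$ absolutely convergent in $A\otimes_{min}B$.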

Assume that both $A$ and $B$ are infinite-dimensional. Then there exist a vector $x\in H_1$ and a sequence of elements $a_j\in A$, $j=1,2,\dots$, such that the vectors $e_j=a_jx$ are linearly independent (why? see the end of the answer); applying the Gram--Schmidt process (i.e., replacing each $a_j$ by a suitable linear combination of $a_1,\dots,a_j$) we can assume that they are pairwise orthogonal. In the same way, there exist a vector $y\in H_2$ and a sequence of elements $b_j\in B$, $j=1,2,\dots$, such that the vectors $f_j=b_jy$ are pairwise orthogonal. If a sequence $\lambda_j$ of positive numbers converges to zero sufficiently rapidly, then $$ C:=\sum_{j=1}^\infty \lambda_ja_j\otimes b_j\in A\otimes_{min}B. $$

Assume that $C$ can be represented as a finite sum $$ C=\sum_{k=1}^N c_k\otimes d_k. $$ Consider the vector $$ z=C(x\otimes y)\in H_1\otimes H_2, \qquad z=\sum_{j=1}^\infty\lambda_j e_j\otimes f_j=\sum_{k=1}^N(c_kx)\otimes (d_ky), $$ and the antilinear mapping $Z:H_2\to H_1$ defined by the formula $\langle Zu,v\rangle:=\langle z,v\otimes u\rangle$. From the finite-sum expression for $z$, the range of $Z$ is contained in the subspace spanned by the $N$ vectors $c_kx$. On the other hand, $Zf_k=\lambda_k||f_k||^2e_k$, so the range contains the infinitely many pairwise orthogonal nonzero vectors $e_k$. This is a contradiction, which proves the claim.
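At bottom this is a rank (Schmidt rank) argument, and it can be illustrated numerically in finite dimensions. The sketch below is not part of the answer's infinite-dimensional setting: matrices stand in for the operators, the dimensions and the number of terms are hypothetical, and vectors in $H_1\otimes H_2$ are viewed as matrices. A sum of $N$ simple tensors has rank at most $N$, while $\sum_j\lambda_j e_j\otimes f_j$ with orthogonal $e_j,f_j$ and $M$ nonzero coefficients has rank exactly $M$, so the two expressions for $z$ cannot agree when $M>N$.

```python
import numpy as np

# Finite-dimensional sketch: a vector z = sum_{k=1}^N u_k (x) v_k in
# C^dim1 (x) C^dim2, viewed as a dim1 x dim2 matrix, has rank at most N.
rng = np.random.default_rng(0)
dim1, dim2, N = 20, 30, 3

# Sum of N simple tensors, each u_k (x) v_k realized as an outer product.
Z = sum(np.outer(rng.standard_normal(dim1), rng.standard_normal(dim2))
        for _ in range(N))
rank = np.linalg.matrix_rank(Z)
print(rank)  # at most N (generically exactly N)

# By contrast, sum_j lambda_j e_j (x) f_j with orthonormal e_j, f_j and
# M = 10 nonzero (rapidly decreasing) coefficients has rank exactly M,
# so it cannot equal any sum of N < M simple tensors.
lam = 0.5 ** np.arange(10)             # positive, rapidly decreasing
D = np.zeros((dim1, dim2))
D[np.arange(10), np.arange(10)] = lam  # e_j, f_j = standard basis vectors
print(np.linalg.matrix_rank(D))        # 10
```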

Edit: Let us now answer the "why" question.

The simplest (but by no means the only) way to guarantee the existence of $x$ is as follows. (The method given below is fairly standard in the $K$-theory of $C^*$-algebras.) We will not bother proving that the claim holds for any faithful representation---we just do not need that; a representation of a special form will do. Namely, take $H_1$ to be the sum of countably many copies of a faithful representation of $A$, $H_1=\oplus_{j=1}^\infty H$. Thus the elements of $H_1$ are infinite sequences $x=(x_1,\dots,x_n,\dots)$ of elements of $H$ such that $\sum_j||x_j||^2<\infty$, and $ax=(ax_1,\dots,ax_n,\dots)$. This representation is again faithful.

Take a linearly independent sequence $\{a_j\}$ of elements of $A$; this is possible because $A$ is infinite-dimensional by assumption. We construct $x$ by determining its components $x_j$ successively. First, take $x_1\in H$ such that $a_1x_1\ne0$. Next, if $a_1x_1$ and $a_2x_1$ are linearly independent, set $x_2=0$. If they are linearly dependent, then necessarily $a_2x_1=\mu a_1x_1$ for some $\mu\in\mathbb{C}$ (because $a_1x_1\ne0$). There exists a nonzero $x_2$ such that $a_2x_2\ne\mu a_1x_2$ (otherwise we would have $a_2=\mu a_1$). Then the vectors $a_1(x_1,x_2,0,0,\dots)=(a_1x_1,a_1x_2,0,0,\dots)$ and $a_2(x_1,x_2,0,0,\dots)=(a_2x_1,a_2x_2,0,0,\dots)$ are linearly independent. Continuing this process by induction (I am sure you can continue it yourself) and normalizing the $x_j$ so that $||x_j||\le 2^{-j}$, we obtain the desired vector $x=(x_1,x_2,\dots,x_n,\dots)$.
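The greedy construction above is concrete enough to run. Here is a minimal finite-dimensional sketch, under assumptions not in the original answer: $A$ is replaced by finitely many linearly independent $d\times d$ matrices, $H_1$ by a direct sum of finitely many copies of $\mathbb{C}^d$, and the helper `find_separating_vector` is a name invented for this illustration. At each step it checks whether the current image vectors are already independent and, if not, uses the linear independence of the matrices to pick a new component breaking the dependence, exactly as in the $x_2$ step above.

```python
import numpy as np

def find_separating_vector(mats):
    """Greedy construction: given linearly independent d x d matrices
    a_1, ..., a_m, build components x_1, ..., x_m of a vector x in the
    direct sum of m copies of C^d such that a_1 x, ..., a_m x (acting
    componentwise) are linearly independent."""
    m, d = len(mats), mats[0].shape[0]
    basis = [np.eye(d)[:, k] for k in range(d)]
    # Step 1: pick x_1 with a_1 x_1 != 0 (a basis vector hitting a
    # nonzero column of a_1).
    xs = [next(e for e in basis if np.linalg.norm(mats[0] @ e) > 1e-12)]
    for n in range(1, m):
        # Image vectors v_j = (a_j x_1, ..., a_j x_n), stacked as rows.
        V = np.array([np.concatenate([a @ x for x in xs])
                      for a in mats[: n + 1]])
        mu = np.linalg.lstsq(V[:n].T, V[n], rcond=None)[0]
        if np.linalg.norm(V[n] - V[:n].T @ mu) > 1e-9:
            xs.append(np.zeros(d))   # already independent: pad with 0
            continue
        # Dependent: v_{n+1} = sum_j mu_j v_j.  Since the matrices are
        # linearly independent, b = a_{n+1} - sum_j mu_j a_j is nonzero,
        # so some basis vector e has b e != 0; appending that component
        # breaks the dependence.
        b = mats[n] - sum(mu[j] * mats[j] for j in range(n))
        xs.append(next(e for e in basis
                       if np.linalg.norm(b @ e) > 1e-12))
    return xs

# Example: the four matrix units of M_2(C), which are linearly independent.
mats = [np.zeros((2, 2)) for _ in range(4)]
for k, (i, j) in enumerate([(0, 0), (0, 1), (1, 0), (1, 1)]):
    mats[k][i, j] = 1.0
xs = find_separating_vector(mats)
images = np.array([np.concatenate([a @ x for x in xs]) for a in mats])
print(np.linalg.matrix_rank(images))  # 4: the vectors a_j x are independent
```

The normalization $||x_j||\le 2^{-j}$ from the answer is omitted here, since in finite dimensions convergence of the direct sum is not an issue.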