Let $A$ and $B$ be two $C^*$-algebras. Assume that every element of the minimal tensor product $A\otimes_{min} B$ is a finite linear combination of simple tensors $a\otimes b$. Can we conclude that $A$ or $B$ is a finite-dimensional $C^*$-algebra?
2026-03-29 07:38:55
A question on tensor product of $C^{*}$ algebras
216 views. Asked by Bumbble Comm (https://math.techqa.club/user/bumbble-comm/detail)
There is 1 best solution below.
Yes, we can. Sketch of proof: take faithful representations of $A$ and $B$ on Hilbert spaces $H_1$ and $H_2$, thus identifying $A$ and $B$ with subalgebras of $\mathcal{B}(H_1)$ and $\mathcal{B}(H_2)$, respectively. Then $A\otimes_{min}B$ is naturally a $C^*$-subalgebra of $\mathcal{B}(H_1\otimes H_2)$, and the norm on the minimal (spatial) tensor product is just the restriction to $A\otimes_{min}B$ of the operator norm on $\mathcal{B}(H_1\otimes H_2)$.
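As a sanity check on this setup, here is a minimal finite-dimensional sketch (the matrices are hypothetical stand-ins, not the infinite-dimensional algebras of the answer): on $H_1\otimes H_2$ a simple tensor $a\otimes b$ acts as the Kronecker product, the spatial norm is multiplicative on simple tensors, and $(a\otimes b)(x\otimes y)=(ax)\otimes(by)$.

```python
import numpy as np

rng = np.random.default_rng(0)
a = rng.standard_normal((3, 3))  # an operator on H1 = C^3 (hypothetical)
b = rng.standard_normal((4, 4))  # an operator on H2 = C^4 (hypothetical)

op_norm = lambda m: np.linalg.norm(m, 2)  # operator norm = largest singular value

# The spatial norm is multiplicative on simple tensors a ⊗ b = np.kron(a, b):
assert np.isclose(op_norm(np.kron(a, b)), op_norm(a) * op_norm(b))

# (a ⊗ b)(x ⊗ y) = (a x) ⊗ (b y) on simple tensors of vectors:
x = rng.standard_normal(3)
y = rng.standard_normal(4)
assert np.allclose(np.kron(a, b) @ np.kron(x, y), np.kron(a @ x, b @ y))
```

Both identities hold exactly for Kronecker products, which is what makes the vector $C(x\otimes y)$ computed below so easy to analyze.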
Assume that both $A$ and $B$ are infinite-dimensional. Then there exist a vector $x\in H_1$ and a sequence of elements $a_j\in A$, $j=1,2,\dots$, such that the vectors $e_j=a_jx$ are linearly independent (why? see the end of the answer); using the Gram--Schmidt process we can assume that they are pairwise orthogonal. In the same way, there exist a vector $y\in H_2$ and a sequence of elements $b_j\in B$, $j=1,2,\dots$, such that the vectors $f_j=b_jy$ are pairwise orthogonal.

If a sequence $\lambda_j$ of positive numbers converges to zero sufficiently rapidly (say, $\sum_j\lambda_j\|a_j\|\,\|b_j\|<\infty$, so that the series converges absolutely in norm), then $$ C:=\sum_{j=1}^\infty \lambda_ja_j\otimes b_j\in A\otimes_{min}B. $$ Assume that $C$ can be represented as a finite sum $$ C=\sum_{k=1}^N c_k\otimes d_k. $$ Consider the element $$ z=C(x\otimes y)\in H_1\otimes H_2, \qquad z=\sum_{j=1}^\infty\lambda_j e_j\otimes f_j=\sum_{k=1}^N(c_kx)\otimes (d_ky). $$

Consider the antilinear mapping $Z:H_2\to H_1$ defined by the formula $\langle Zu,v\rangle:=\langle z,v\otimes u\rangle$. Its range is contained in the subspace spanned by the $N$ vectors $c_kx$, and at the same time it contains nonzero multiples of the infinitely many pairwise orthogonal vectors $e_j$ (apply $Z$ to the vectors $f_j$). This is a contradiction, which proves the theorem.
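The final step is a tensor-rank argument, which can be illustrated numerically (a hypothetical finite-dimensional sketch): under matricization, a vector $z\in H_1\otimes H_2$ becomes a matrix $Z$, a sum of $N$ simple tensors $c_k\otimes d_k$ becomes a sum of $N$ rank-one matrices (so $\operatorname{rank}Z\le N$), while $\sum_j\lambda_j e_j\otimes f_j$ with orthogonal $e_j,f_j$ and $\lambda_j>0$ has rank equal to the number of terms.

```python
import numpy as np

rng = np.random.default_rng(1)
d1, d2, N = 8, 8, 3  # hypothetical dimensions and number of simple tensors

# z = sum_k c_k ⊗ d_k  corresponds to  Z = sum_k outer(c_k, d_k): rank(Z) <= N.
Z = sum(np.outer(rng.standard_normal(d1), rng.standard_normal(d2))
        for _ in range(N))
assert np.linalg.matrix_rank(Z) <= N

# z = sum_j lambda_j e_j ⊗ f_j with orthonormal e_j, f_j and lambda_j > 0
# corresponds to a diagonal matrix: its rank equals the number of terms.
lam = 0.5 ** np.arange(d1)   # rapidly decreasing, all positive
Zdiag = np.diag(lam)         # e_j, f_j taken as standard basis vectors
assert np.linalg.matrix_rank(Zdiag) == d1
```

In the answer the second sum is infinite, so $Z$ would have infinite rank while being confined to an $N$-dimensional range: the contradiction.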
Edited: let us now answer the "why" question.
The simplest (but by no means the only) way to guarantee the existence of $x$ is as follows. (The method given below is fairly standard in the $K$-theory of $C^*$-algebras.) We will not bother proving the claim for an arbitrary faithful representation---we just do not need that; a representation of a special form will do. Namely, take $H_1$ to be the sum of countably many copies of a faithful representation of $A$ on $H$, $H_1=\oplus_{j=1}^\infty H$. Thus the elements of $H_1$ are infinite sequences $x=(x_1,\dots,x_n,\dots)$ of elements of $H$ such that $\sum_j\|x_j\|^2<\infty$, and $ax=(ax_1,\dots,ax_n,\dots)$. This representation is again faithful.

Take a linearly independent sequence $\{a_j\}$ of elements of $A$; this is possible because $A$ is infinite-dimensional by assumption. We will construct $x$ by successively determining the components $x_j$. First, take $x_1\in H$ such that $a_1x_1\ne0$. Next, if $a_1x_1$ and $a_2x_1$ are linearly independent, set $x_2=0$. If they are linearly dependent, then necessarily $a_2x_1=\mu a_1x_1$ for some $\mu\in\mathbb{C}$ (because $a_1x_1\ne0$). There exists an $x_2$ such that $a_2x_2\ne\mu a_1x_2$ (otherwise we would have $a_2=\mu a_1$). Then the vectors $a_1(x_1,x_2,0,0,\dots)=(a_1x_1,a_1x_2,0,0,\dots)$ and $a_2(x_1,x_2,0,0,\dots)=(a_2x_1,a_2x_2,0,0,\dots)$ are linearly independent. Continuing this process by induction (I am sure you can continue it yourself) and normalizing the $x_j$ so that $\|x_j\|\le 2^{-j}$, we obtain the desired vector $x=(x_1,x_2,\dots,x_n,\dots)$.
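The direct-sum trick above has a concrete matrix instance (a hypothetical sketch, with $A\subset M_n$): on $H_1=H\oplus\dots\oplus H$ ($n$ copies of $H=\mathbb{C}^n$), the vector $x=(e_1,\dots,e_n,0,\dots)$ built from the standard basis satisfies $a x=(ae_1,\dots,ae_n)$, which is just the matrix $a$ read out column by column; linearly independent $a_j$ therefore give linearly independent vectors $a_jx$ in one step.

```python
import numpy as np

n = 3
# Three linearly independent matrices in M_3 (hypothetical choices):
a_list = [np.eye(n), np.diag([1.0, 2.0, 3.0]), np.triu(np.ones((n, n)))]

def act_on_x(a):
    # a acts componentwise on x = (e_1, ..., e_n); flattening the result
    # reads the matrix a out column by column (its vectorization).
    return np.concatenate([a @ e for e in np.eye(n)])

vectors = np.stack([act_on_x(a) for a in a_list])
# Linear independence of the matrices a_j gives linear independence of a_j x:
assert np.linalg.matrix_rank(vectors) == len(a_list)
```

In infinite dimensions one cannot take all of $x=(e_1,e_2,\dots)$ at once, which is exactly why the answer builds the components $x_j$ inductively with $\|x_j\|\le 2^{-j}$.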