I am looking for a concrete example of expressions like $$ V_A\otimes V_B = V_C\oplus V_D $$ that shows explicitly what the basis elements actually look like.

My attempt was the following: let $V_A$ be a subspace of $\mathbb R^3$, namely the 1-dimensional space spanning the "x-line", with basis $B_A=\{e_x\}$. Let $V_B$ be the y-z plane with basis $B_B=\{e_y,e_z\}$. Let the new product space be $$ V_{AB} = V_A\otimes V_B $$ The basis for this new space should be two-dimensional and is obtained by taking all possible tensor products of the basis vectors of the composing spaces: $$ B_{AB} = \{e_x\otimes e_y, \ e_x\otimes e_z \} $$ I hope everything is correct up to this point.

Now comes the part that I don't know how to do: the decomposition of this two-dimensional space into two 1D spaces. Can we write $$ V_{AB} = V_{I} \oplus V_{II} $$ where $V_I, V_{II}$ are each 1D spaces? If so, what do the bases look like?
Addition and Tensor product of Vector spaces for beginners: Concrete example

124 views, asked by Bumbble Comm (https://math.techqa.club/user/bumbble-comm/detail) on 2026-03-28.

There is 1 solution below.
This is a response to the OP's comment; it is too long to post as a comment itself.
Conventions
The two objects denoted by $\oplus$ are sometimes called the "internal" and "external" direct sums. External direct sums always make sense for any two vector spaces, and their elements are literally ordered pairs whose components come from the summands. Internal direct sums require that the summands both live in a common ambient vector space, and their elements are literally vectors in that ambient space. When the internal sum makes sense, one can prove that the map $V\oplus_{ext} W \to V\oplus_{int} W$ sending $(v,w)$ to $v+w$ is an isomorphism (and is the "best" kind of isomorphism in any sense you might mean that, e.g. functorial), so they are for all intents and purposes the same object.
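A minimal numpy sketch of this identification, using a hypothetical example (not from the question itself) where $V$ is the x-axis and $W$ is the y-z plane inside $\Bbb R^3$: the external sum stores ordered pairs $(v,w)$, the internal sum stores the single vector $v+w$, and because $V\cap W=\{0\}$ the map is invertible.

```python
import numpy as np

# Illustration: V = x-axis, W = y-z plane inside R^3.
# External direct sum: elements are ordered pairs (v, w).
# Internal direct sum: elements are the single vectors v + w.

def ext_to_int(v, w):
    """The canonical map V (+)_ext W -> V (+)_int W, (v, w) |-> v + w."""
    return v + w

v = np.array([2.0, 0.0, 0.0])   # element of V (x-axis)
w = np.array([0.0, 3.0, 1.0])   # element of W (y-z plane)

s = ext_to_int(v, w)            # the internal-sum vector [2., 3., 1.]

# Because V and W intersect only in 0, the pair (v, w) can be
# recovered from s, so the map is a bijection (in fact linear iso).
v_rec = np.array([s[0], 0.0, 0.0])
w_rec = np.array([0.0, s[1], s[2]])
assert np.allclose(v_rec, v) and np.allclose(w_rec, w)
```

The recovery step is exactly why trivial intersection matters: without it, the same internal-sum vector could come from two different pairs.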
In vector space decompositions such as yours, it is extremely common to use the symbol "=" to mean "isomorphic (in the best needed way)". This abuse of notation is very well-justified in practice, e.g. I may want to construct the tensor product as a set of matrices instead of writing down an abstract basis as you've done, and it's silly to let this "linguistic" difference get in the way.
But for the purposes of this question, it's clear that you mean we should both agree that $V\otimes W$ means $\text{span}_{\Bbb R} \{v\otimes w:v\in V, w\in W\}$, and that you mean "=" to mean "literally equal as sets". In this case, we must use the internal direct sum, since the left-hand side is not constructed set-theoretically as a direct sum (unless we have a very strange construction of the external direct sum).
Since you have constructed the ambient vector space $V_{AB}$, in which both $V_I$ and $V_{I\!I}$ live, this is not a problem. We simply need to find two subspaces of $V_{AB}$ with trivial intersection that span the space.
Construction
Literally speaking, $$ V_{AB} = \left\{ a\left(\begin{bmatrix}1\\0\\0\end{bmatrix}\otimes\begin{bmatrix}0\\1\\0\end{bmatrix}\right) + b\left(\begin{bmatrix}1\\0\\0\end{bmatrix}\otimes \begin{bmatrix}0\\0\\1\end{bmatrix}\right) : a,b\in \Bbb{R}\right\}$$
Thus, one possible choice for $V_I$ and $V_{I\!I}$ would be $$ V_{I} = \left\{ a\left(\begin{bmatrix}1\\0\\0\end{bmatrix}\otimes\begin{bmatrix}0\\1\\0\end{bmatrix}\right) : a\in \Bbb{R}\right\}$$ $$ V_{I\!I} = \left\{ b\left(\begin{bmatrix}1\\0\\0\end{bmatrix}\otimes\begin{bmatrix}0\\0\\1\end{bmatrix}\right) : b\in \Bbb{R}\right\}.$$
The natural bases for these spaces are the obvious ones: just remove the coefficients, giving $B_I = \{e_x\otimes e_y\}$ and $B_{I\!I} = \{e_x\otimes e_z\}$.
This is of course not the only choice*, but to address the other question in your comment, it is not even necessary that $V_I$ has dimension 1. It could just as easily be the zero subspace or the full $V_{AB}$ (leaving $V_{I\!I}$ to be the other one). However, because this is "boring", it is sometimes called the trivial direct sum decomposition. So in that sense, the answer to your question is yes: in your example, all nontrivial decompositions will have both summands of dimension 1.
* I say "of course" in the sense that there is the usual freedom that one has in (direct) sum constructions. For instance, a different choice would be $V_{I\!I}$ as before, but $$ V_{I} = \left\{ 3a\left(\begin{bmatrix}1\\0\\0\end{bmatrix}\otimes\begin{bmatrix}0\\1\\0\end{bmatrix}\right) - 2a\left(\begin{bmatrix}1\\0\\0\end{bmatrix}\otimes\begin{bmatrix}0\\0\\1\end{bmatrix}\right) : a\in \Bbb{R}\right\},$$ and other such things.
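As a quick sanity check of this alternative choice, one can verify numerically (again using the Kronecker-product model of the tensors, an assumption of this sketch) that the spanning vector $3(e_x\otimes e_y) - 2(e_x\otimes e_z)$ of the new $V_I$ is independent of $e_x\otimes e_z$, which is all that a direct-sum decomposition requires here:

```python
import numpy as np

e_x = np.array([1.0, 0.0, 0.0])
e_y = np.array([0.0, 1.0, 0.0])
e_z = np.array([0.0, 0.0, 1.0])
b1, b2 = np.kron(e_x, e_y), np.kron(e_x, e_z)

u = 3 * b1 - 2 * b2      # spans the alternative V_I
M = np.stack([u, b2])     # rows: spanning vectors of V_I and V_II

# Independence <=> trivial intersection + the two lines span V_AB.
assert np.linalg.matrix_rank(M) == 2
```

Any pair of independent vectors in $V_{AB}$ works the same way, which is the "usual freedom" mentioned above.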