I’m curious about whether every tensor of second order can be represented as a matrix. The only possibilities are (to my understanding; correct me if I’m wrong): $$\mathcal{T}^{(2;0)}= \mathbf{V\otimes V}$$ $$\mathcal{T}^{(1;1)}= \mathbf{V\otimes V^*}$$ $$\mathcal{T}^{(0;2)}= \mathbf{V^* \otimes V^*}$$ Can all of these be represented as $n\times n$ matrices? If so, I would like to know whether there is a difference between all these matrices.
Can all rank two tensors be represented as matrices?
At risk of reading too much into the question:
If $V$ is a finite-dimensional real vector space, then upon fixing a basis of $V$ (which determines a unique dual basis of $V^{*}$) every real-valued $2$-tensor $T$ is uniquely determined by a doubly-indexed set of real coefficients, namely the values of $T$ on appropriate basis and/or dual basis vectors. Since any such collection may be written as a matrix, "yes (in this sense)."
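A minimal numerical sketch of this point, assuming a made-up $(0;2)$-tensor on $\mathbb{R}^2$ given by the bilinear form $B(u, v) = u^{\mathsf T} M v$: evaluating $B$ on pairs of standard basis vectors recovers exactly the matrix of components.

```python
import numpy as np

# A hypothetical (0;2)-tensor on R^2: the bilinear form B(u, v) = u . (M v)
M = np.array([[2.0, 1.0],
              [1.0, 3.0]])

def B(u, v):
    """Evaluate the bilinear form on two vectors."""
    return u @ M @ v

# Standard basis of R^2 (rows of the identity matrix)
e = np.eye(2)

# Components T_ij = B(e_i, e_j), assembled into a matrix
components = np.array([[B(e[i], e[j]) for j in range(2)]
                       for i in range(2)])

print(components)  # recovers M exactly
```

The same construction works for the other two tensor types; only which basis ($V$ or $V^{*}$) feeds each slot changes.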
That said, a $2$-tensor is not just a set of components in one specific basis: it is a bilinear function, or equivalently a coherent family of components across all bases. A matrix alone does not capture that. Concretely, a bare matrix of components does not tell us how those components transform under a change of basis, and that transformation law is an essential datum.
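The difference in transformation laws can be seen numerically. Below is a sketch (with arbitrary random matrices, not from the original post): starting from the same component matrix $A$ and change-of-basis matrix $P$, a $(1;1)$-tensor transforms by similarity, $P^{-1}AP$, while a $(0;2)$-tensor transforms by congruence, $P^{\mathsf T}AP$, and the results genuinely differ.

```python
import numpy as np

rng = np.random.default_rng(0)
A = rng.standard_normal((3, 3))  # components of some 2-tensor in the old basis
P = rng.standard_normal((3, 3))  # change-of-basis matrix (columns = new basis vectors)

# (1;1)-tensor (linear map): transforms by similarity
A_11 = np.linalg.inv(P) @ A @ P

# (0;2)-tensor (bilinear form): transforms by congruence
A_02 = P.T @ A @ P

# (2;0)-tensor would transform by P^{-1} A P^{-T}, different again.

# Same starting matrix, different components in the new basis:
print(np.allclose(A_11, A_02))   # False in general

# A similarity-invariant like the trace survives for the (1;1) case only:
print(np.isclose(np.trace(A_11), np.trace(A)))  # True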
Tangential anecdote: Some years back there was a question on Math.SE about the intersection form of a particular simply-connected $4$-manifold, and an apparent contradiction between using two particular bases for two-dimensional homology. The resolution was that an intersection form, contrary to the OP's understandable matrix-based habit, does not transform by similarity like a linear transformation (tensor in $V^{*} \otimes V$), but like a quadratic form (tensor in $V^{*} \otimes V^{*}$).