Find elements $f,g$ of the orthogonal Lie algebra with $f \circ g \neq 0$

Asked by Bumbble Comm (https://math.techqa.club/user/bumbble-comm/detail)

Let $K$ be a field with $\operatorname{char}(K) \neq 2$, let $V \neq 0$ be a finite-dimensional $K$-vector space, and let $B: V \times V \to K$ be a nondegenerate bilinear form. Consider the subspace $W = \{f \in \operatorname{End}(V) : B(f(x),y) + B(x,f(y)) = 0 \text{ for all } x,y \in V\} \subset \operatorname{End}(V)$.

My question is whether there always exist $f,g \in W$ with $f \circ g \neq 0$ (a nonzero idempotent $f \in W$ would also suffice). This property seems so general that it should be obvious, but apart from $0$ I do not know what a single element of $W$ looks like.
I assume we are talking about a symmetric bilinear form $B$ here.
Let $n := \dim(V)$. For $n = 1$ the space $W$ is reduced to $\{0\}$, but for $n \ge 2$ there does indeed exist $f \in W$ with $f \circ f \neq 0$. Much stronger statements can be made using scalar extension and the classification of semisimple Lie algebras (at least if $\operatorname{char}(K) = 0$), but here is a rather elementary proof of the claim:
First, it is well known that a symmetric bilinear form can be "diagonalised" (see, e.g., the questions Bilinear Form Diagonalisation and How to diagonalize $f(x,y,z)=xy+yz+xz$, or any linear algebra text treating the reduction of symmetric matrices). That means there is a basis $v_1, \dots, v_n$ of $V$ such that $B(v_i, v_j) = 0$ for $i \neq j$; and since the form is nondegenerate, all $a_i := B(v_i, v_i) \neq 0$.
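For concreteness, here is a small numerical sketch of this diagonalisation over $\mathbb{R}$ (the function name and tolerance are my own choices; the algorithm is the standard congruence reduction for symmetric Gram matrices, not an eigendecomposition):

```python
import numpy as np

def diagonalize_symmetric_form(G):
    """Return P such that P.T @ G @ P is diagonal.

    G is the (symmetric) Gram matrix of the bilinear form; the columns
    of P are the coordinates of the new basis v_1, ..., v_n.
    """
    G = np.array(G, dtype=float)
    n = G.shape[0]
    P = np.eye(n)
    for k in range(n):
        if abs(G[k, k]) < 1e-12:
            # Zero pivot: try to swap in a later basis vector with
            # B(v_j, v_j) != 0.
            for j in range(k + 1, n):
                if abs(G[j, j]) > 1e-12:
                    T = np.eye(n)
                    T[[k, j]] = T[[j, k]]
                    G = T.T @ G @ T
                    P = P @ T
                    break
            else:
                # All later diagonal entries vanish; if some G[k, j] != 0,
                # replace v_k by v_k + v_j, giving B(v_k, v_k) = 2 G[k, j]
                # (this is where char(K) != 2 is used).
                for j in range(k + 1, n):
                    if abs(G[k, j]) > 1e-12:
                        T = np.eye(n)
                        T[j, k] = 1.0
                        G = T.T @ G @ T
                        P = P @ T
                        break
                else:
                    continue  # row/column k is entirely zero
        # Clear row/column k using the nonzero pivot G[k, k].
        T = np.eye(n)
        T[k, k + 1:] = -G[k, k + 1:] / G[k, k]
        G = T.T @ G @ T
        P = P @ T
    return P
```

For a nondegenerate form the resulting diagonal matrix $P^\top G P$ has all diagonal entries nonzero, which is exactly the basis used below.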
Now define $f\in \mathrm{End}(V)$ as follows:
$$f(v_i) := \begin{cases} v_2 & \text{if } i = 1 \\ -\dfrac{a_2}{a_1} v_1 & \text{if } i = 2 \\ 0 & \text{if } i \ge 3 \end{cases}$$
One checks directly that $f \in W$: by orthogonality of the basis, the only condition that is not immediate is $B(f(v_1), v_2) + B(v_1, f(v_2)) = B(v_2, v_2) - \frac{a_2}{a_1} B(v_1, v_1) = a_2 - a_2 = 0$. And obviously $(f \circ f)(v_1) = -\frac{a_2}{a_1} v_1 \neq 0$.
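As a sanity check, the membership condition can be verified in matrix form: writing $B(x,y) = x^\top G y$ with $G = \mathrm{diag}(a_1, \dots, a_n)$ and $f(x) = Ax$, the condition $f \in W$ becomes $A^\top G + G A = 0$. A quick numerical check (the values of the $a_i$ are arbitrary nonzero choices for illustration):

```python
import numpy as np

# Gram matrix of B in the diagonalising basis v_1, ..., v_n;
# the a_i are arbitrary nonzero values chosen for illustration.
a = np.array([3.0, -5.0, 7.0])
G = np.diag(a)

n = len(a)
A = np.zeros((n, n))      # matrix of f in the basis v_1, ..., v_n
A[1, 0] = 1.0             # f(v_1) = v_2
A[0, 1] = -a[1] / a[0]    # f(v_2) = -(a_2/a_1) v_1

# f in W  <=>  B(f(x), y) + B(x, f(y)) = 0 for all x, y  <=>  A^T G + G A = 0
assert np.allclose(A.T @ G + G @ A, 0)

# (f o f)(v_1) = -(a_2/a_1) v_1 != 0
e1 = np.eye(n)[:, 0]
assert np.allclose(A @ (A @ e1), -(a[1] / a[0]) * e1)
```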
Note that in the special case where $B$ is the standard scalar product and the $v_i$ form an orthonormal basis, $W$ consists of all skew-symmetric matrices, and the above $f$ corresponds to the matrix
$$\pmatrix{0 & -1 & \\ 1 & 0 & \\ & & \Large{0}}$$
which was my motivating example for the general construction.