I have read in Integrable Hamiltonian Systems on Complex Lie Groups by Velimir Jurdjevic p. 61 that the complex orthogonal Lie algebra of dimension 4, $\mathfrak o_4(\mathbb C)$ or $\mathfrak{so}_4(\mathbb C)$, (by which I mean the space of skew-symmetric matrices of size 4) is isomorphic to the Cartesian product of two copies of the special linear Lie algebra of size 2, $\mathfrak{sl}_2(\mathbb C)\times\mathfrak{sl}_2(\mathbb C)$, but I cannot come up with an explicit isomorphism. Can someone please help me find one?
Explicit isomorphism between the orthogonal Lie algebra in dimension four and the direct sum of two copies of the three-dimensional special linear Lie algebra.
Asked by Bumbble Comm (https://math.techqa.club/user/bumbble-comm/detail)
I guess the composite formula at the end of this answer (edit: corrected) is as explicit as you can get. It works over any field whose characteristic is $\neq 2$ and which contains a square root of $-1$, called $i$ in the formulas below.
However, assuming it is correct (I leave it to you to check that it is a homomorphism and to write down the inverse, hoping that all my signs are correct), this shows mainly that such an explicit matrix formula is virtually useless; one should rather understand the theory behind it. That goes like this:
Step 1: Assuming a good split form for $\mathfrak{so}_4$, construct an explicit isomorphism. Let's assume we can show that over our field we have an isomorphic representation of $\mathfrak{so}_4$ not as skew-symmetric matrices, but as matrices
$$A = \pmatrix{a&c&d&0\\e&b&0&-d\\f&0&-b&-c\\0&-f&-e&-a}.$$
Nice thing: The diagonal makes up a Cartan subalgebra. We can see two positive roots operating here, $\alpha_1$ which sends the above $A$ to $a-b$ and whose root space is
$$\pmatrix{0&*&0&0\\0&0&0&0\\0&0&0&*\\0&0&0&0},$$
and $\alpha_2$ which sends the above $A$ to $a+b$ and whose root space is
$$\pmatrix{0&0&*&0\\0&0&0&*\\0&0&0&0\\0&0&0&0}.$$
Knowing what we want and that these two roots are orthogonal to each other, we take the diagonal apart via $\pmatrix{a&0\\0&b}=\dfrac12 \pmatrix{a-b&0\\0&b-a} + \dfrac12 \pmatrix{a+b&0\\0&a+b}$ and get the isomorphism
$$ \pmatrix{a&c&d&0\\e&b&0&-d\\f&0&-b&-c\\0&-f&-e&-a} \mapsto \pmatrix{\dfrac12(a-b)&c\\e&\dfrac12(b-a)} \oplus \pmatrix{\dfrac12(a+b)&d\\f&-\dfrac12(a+b)}$$
onto $\mathfrak{sl}_2 \oplus \mathfrak{sl}_2$ almost for free. Or: Note that the triples $$H_1=\pmatrix{1&0&0&0\\0&-1&0&0\\0&0&1&0\\0&0&0&-1}, X_1=\pmatrix{0&1&0&0\\0&0&0&0\\0&0&0&-1\\0&0&0&0} , Y_1=\pmatrix{0&0&0&0\\1&0&0&0\\0&0&0&0\\0&0&-1&0}$$
resp. $$H_2=\pmatrix{1&0&0&0\\0&1&0&0\\0&0&-1&0\\0&0&0&-1}, X_2=\pmatrix{0&0&1&0\\0&0&0&-1\\0&0&0&0\\0&0&0&0} , Y_2=\pmatrix{0&0&0&0\\0&0&0&0\\1&0&0&0\\0&-1&0&0}$$
satisfy the same relations as the standard basis of $\mathfrak{sl}_2$, $$H=\pmatrix{1&0\\0&-1}, X=\pmatrix{0&1\\0&0}, Y=\pmatrix{0&0\\1&0},$$ namely $[H,X]=2X, [H,Y]=-2Y, [X,Y]=H$, and are orthogonal to each other, i.e. $[\ast_1, \ast_2]=0$ for $\ast =H,X,Y$.
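The relations claimed for these two triples can be checked mechanically. The following is a quick numerical sketch (the helper name `comm` is mine) that verifies the $\mathfrak{sl}_2$ relations for both triples and that the two copies commute:

```python
import numpy as np

def comm(A, B):
    """Lie bracket [A, B] = AB - BA."""
    return A @ B - B @ A

# The two triples from the answer, entered entry by entry
H1 = np.diag([1., -1., 1., -1.])
X1 = np.zeros((4, 4)); X1[0, 1] = 1; X1[2, 3] = -1
Y1 = np.zeros((4, 4)); Y1[1, 0] = 1; Y1[3, 2] = -1

H2 = np.diag([1., 1., -1., -1.])
X2 = np.zeros((4, 4)); X2[0, 2] = 1; X2[1, 3] = -1
Y2 = np.zeros((4, 4)); Y2[2, 0] = 1; Y2[3, 1] = -1

# Each triple satisfies the standard sl_2 relations
for H, X, Y in [(H1, X1, Y1), (H2, X2, Y2)]:
    assert np.array_equal(comm(H, X), 2 * X)    # [H, X] = 2X
    assert np.array_equal(comm(H, Y), -2 * Y)   # [H, Y] = -2Y
    assert np.array_equal(comm(X, Y), H)        # [X, Y] = H

# The two copies commute: [*_1, *_2] = 0 for * = H, X, Y
for A in (H1, X1, Y1):
    for B in (H2, X2, Y2):
        assert np.array_equal(comm(A, B), np.zeros((4, 4)))
print("all sl_2 relations verified")
```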
Step 2: Base change to that standard split form. Cf. https://math.stackexchange.com/a/3489788/96384. Remember that a quadratic form (= symmetric bilinear form, since our characteristic is $\neq 2$) is given by a symmetric $n \times n$-matrix $S$. One can in general define $\mathfrak{so}_S = \{A \in M_n(K): \, ^tA=-SAS^{-1} \}$ and check that this is a Lie algebra. Now in general two matrices $S_1, S_2$ might actually describe the same bilinear form, just with respect to different coordinates, i.e. after a change of basis. Remember that base change for such forms works by "congruence", i.e. existence of a base change matrix $P$ such that
$$^tP S_1 P=S_2.$$
Now check that if such a congruence exists, then the usual "equivalence" (conjugation) will define an isomorphism
$$\mathfrak{so}_{S_1} \simeq \mathfrak{so}_{S_2}$$ $$A \mapsto P^{-1}AP$$
(Note: Now it's really the inverse, not the transpose!).
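This claim is easy to test numerically. Here is a sketch with a randomly chosen basis change (the helper name `in_so` is mine), checking that conjugation carries $\mathfrak{so}_{S_1}$ into $\mathfrak{so}_{S_2}$ when $^tP S_1 P = S_2$:

```python
import numpy as np

rng = np.random.default_rng(0)

def in_so(A, S):
    """Check the defining condition  A^T = -S A S^{-1}  of so_S."""
    return np.allclose(A.T, -S @ A @ np.linalg.inv(S))

n = 4
S1 = np.eye(n)                       # the skew-symmetric base case
P = rng.standard_normal((n, n))      # a random (a.s. invertible) base change
S2 = P.T @ S1 @ P                    # the congruent form

# A random element of so_{S1}, i.e. a skew-symmetric matrix
M = rng.standard_normal((n, n))
A = M - M.T
assert in_so(A, S1)

# Conjugation (really the inverse, not the transpose!) lands in so_{S2}
B = np.linalg.inv(P) @ A @ P
assert in_so(B, S2)
print("conjugation maps so_{S1} into so_{S2}")
```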
Now you started with the Lie algebra of skew-symmetric matrices which is the base case $S=I_n$. Turns out that written like that, one has a hard time "seeing" a Cartan subalgebra and root spaces in the matrices. So I perform a change of basis. Or rather two: First I want to get from
$S_1 = \pmatrix{1&0&0&0\\0&1&0&0\\0&0&1&0\\0&0&0&1} \mapsto S_2 = \pmatrix{1&0&0&0\\0&1&0&0\\0&0&-1&0\\0&0&0&-1}$
i.e. from the quadratic form $w^2+x^2+y^2+z^2$ to $w'^2+x'^2-y'^2-z'^2$. This works in any field with a square root of $-1$ called $i$, namely $w':=w, x':=x, y':=iy, z':=iz$ i.e.
$P_1= \pmatrix{1&0&0&0\\0&1&0&0\\0&0&i&0\\0&0&0&i}.$
Now further I want to go
$S_2 = \pmatrix{1&0&0&0\\0&1&0&0\\0&0&-1&0\\0&0&0&-1} \mapsto S_3 = \frac{1}{2}\pmatrix{0&0&0&1\\0&0&1&0\\0&1&0&0\\1&0&0&0}$
i.e. express $w'^2+x'^2-y'^2-z'^2$ as $w''z''+x''y''$. (Originally I tried to remove that factor of $1/2$, but either then it pops up elsewhere, or one has to scale with ugly numbers like $\sqrt 2$, which would not work over $\mathbb Q$, so I just left it there.) This is a standard base change for hyperbolic space, on the coefficients we have
$$w'':= (w'+z'), x'':=(x'+y'), y'':=(x'-y'), z'':=(w'-z')$$
corresponding to
$$e_1 \mapsto \frac12 (e_1+e_4), e_2 \mapsto \frac12 (e_2+e_3), e_3 \mapsto \frac12 (e_2-e_3), e_4 \mapsto \frac12 (e_1-e_4)$$
or
$P_2= \frac12\pmatrix{1&0&0&1\\0&1&1&0\\0&1&-1&0\\1&0&0&-1}$.
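Both congruences above can be confirmed directly; the following sketch just enters $S_1, S_2, S_3, P_1, P_2$ as given and checks $^tP_1 S_1 P_1 = S_2$ and $^tP_2 S_2 P_2 = S_3$:

```python
import numpy as np

i = 1j  # a square root of -1

S1 = np.eye(4, dtype=complex)
S2 = np.diag([1, 1, -1, -1]).astype(complex)
S3 = 0.5 * np.array([[0, 0, 0, 1],
                     [0, 0, 1, 0],
                     [0, 1, 0, 0],
                     [1, 0, 0, 0]], dtype=complex)

P1 = np.diag([1, 1, i, i])
P2 = 0.5 * np.array([[1, 0, 0, 1],
                     [0, 1, 1, 0],
                     [0, 1, -1, 0],
                     [1, 0, 0, -1]], dtype=complex)

assert np.allclose(P1.T @ S1 @ P1, S2)   # first base change: S1 -> S2
assert np.allclose(P2.T @ S2 @ P2, S3)   # second base change: S2 -> S3
print("both congruences check out")
```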
Putting all this together one gets
$$ \pmatrix{0&a&b&c\\ -a&0&d&e\\ -b&-d&0&f\\ -c&-e&-f&0\\ } \xrightarrow{P_1^{-1} (\cdot) P_1} \pmatrix{0&a&ib&ic\\ -a&0&id&ie\\ ib&id&0&f\\ ic&ie&-f&0\\ } \xrightarrow{P_2^{-1} (\cdot) P_2} \frac12\pmatrix{2ic&a+ie+ib-f&a+ie-ib+f&0\\ -a+ie+ib+f&2id&0&-a-ie+ib-f\\ -a+ie-ib-f&0&-2id&-a-ie-ib+f\\ 0&a-ie+ib+f&a-ie-ib-f&-2ic\\ }$$
(note the overall factor $\frac12$ in front of the last matrix: we have $P_2^{-1} = 2P_2$, so conjugating by $P_2$ produces it).
Step 3: Combine steps 1 and 2: compose the conjugation from step 2 with the map from step 1 to get the explicit isomorphism from the skew-symmetric matrices you started with onto $\mathfrak{sl}_2 \oplus \mathfrak{sl}_2$.
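The whole composite can be tested end to end. The sketch below (helper names `step1`, `step2`, `phi`, `comm`, `random_skew` are mine) conjugates a random skew-symmetric matrix by $P_1P_2$, reads off the Step 1 map, and checks that the composite respects Lie brackets:

```python
import numpy as np

i = 1j
P1 = np.diag([1, 1, i, i])
P2 = 0.5 * np.array([[1, 0, 0, 1],
                     [0, 1, 1, 0],
                     [0, 1, -1, 0],
                     [1, 0, 0, -1]], dtype=complex)
P = P1 @ P2                        # total base change
Pinv = np.linalg.inv(P)

def step2(A):
    """so_4 (skew-symmetric) -> the split form of Step 1, by conjugation."""
    return Pinv @ A @ P

def step1(C):
    """Split form -> sl_2 + sl_2, via the Step 1 formula."""
    assert np.allclose(C[0, 3], 0) and np.allclose(C[1, 2], 0)  # split pattern
    a, b = C[0, 0], C[1, 1]
    c, e = C[0, 1], C[1, 0]
    d, f = C[0, 2], C[2, 0]
    L = np.array([[(a - b) / 2, c], [e, (b - a) / 2]])
    R = np.array([[(a + b) / 2, d], [f, -(a + b) / 2]])
    return L, R

def phi(A):
    return step1(step2(A))

def comm(X, Y):
    return X @ Y - Y @ X

rng = np.random.default_rng(1)
def random_skew():
    M = rng.standard_normal((4, 4))
    return M - M.T

A, B = random_skew(), random_skew()
LA, RA = phi(A)
LB, RB = phi(B)
Lc, Rc = phi(comm(A, B))
assert np.allclose(Lc, comm(LA, LB))   # phi respects brackets, first factor
assert np.allclose(Rc, comm(RA, RB))   # ... and second factor
assert np.allclose(np.trace(Lc), 0) and np.allclose(np.trace(Rc), 0)
print("explicit isomorphism verified on random elements")
```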