How can I solve the following exercise?
Let $\textrm{o}\left ( n \right )=\left \{ A\in M\left ( n,\mathbb{R} \right ); A^{t}=-A\right \}$. For each $A\in \textrm{o}\left ( n \right )$ define the vector field on $M\left ( n,\mathbb{R} \right )$ expressed in the coordinates $x=\left ( x_{ij} \right )$ by
$X_{A}\left ( x \right )=\sum_{ij}^{\, }\left ( xA \right )_{ij}\frac{\partial }{\partial x_{ij}}$
Show that $\left [ X_{A} , X_{B}\right ]=X_{\left [ A,B \right ]}$ for all $A,B\in \textrm{o}\left ( n \right )$. Here $\left [ A,B \right ]=AB-BA$ is the matrix commutator.
I have some questions:
A vector field, as I know it, is a map $X:U\subset \mathbb{R}^{n}\rightarrow \mathbb{R}^{n}$. How is a vector field defined on $M\left ( n,\mathbb{R} \right )$?
I cannot see how the definition of $X_{A}\left ( x \right )$ fits this definition of a vector field.
How does the antisymmetry of the matrices enter into the solution of the exercise?
I would appreciate some explanation of these points in addition to the solution of the exercise.
Thanks for your help.
(1) $M(n, \Bbb R) \equiv \Bbb R^{n^2}$, so $X$ is a vector field $\Bbb R^{n^2} \to \Bbb R^{n^2}$. Or you can think of it as $X : M(n, \Bbb R) \to M(n, \Bbb R)$. That is, the vectors can also be thought of as matrices.
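As a concrete illustration of this identification, here is a minimal sketch in NumPy (the particular matrix is an arbitrary choice of mine):

```python
import numpy as np

# A point of M(2, R) and the same point viewed in R^4: the coordinates
# x_ij are just the matrix entries read off row by row.
x = np.array([[1.0, 2.0],
              [3.0, 4.0]])
v = x.reshape(-1)          # the vector (x_11, x_12, x_21, x_22) in R^{n^2}
print(v)                   # [1. 2. 3. 4.]
x_back = v.reshape(2, 2)   # reversing the identification recovers the matrix
```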
(2) How to answer this really depends on your background. In particular, you said that a vector field is a map $U \subset \Bbb R^n \to \Bbb R^n$. But for a differential geometer (where things "Lie" normally live), this is not the definition. Instead, a vector field is a map from a manifold $Q$ to $TQ$, the tangent bundle of $Q$. The definition you gave comes from taking $Q$ to be an open subset of $\Bbb R^n$ and using the natural identification of the tangent bundle $T\Bbb R^n$ with $\Bbb R^n \times \Bbb R^n$.
If you are not familiar with manifolds and tangent bundles, a full explanation would take far too long and is best left to textbooks, not internet forums. The short version is: if you have a basis $v_i$ for $\Bbb R^n$, you can define a coordinate system $(x_i)$ representing a point $x$ by $x = \sum x_iv_i$. This coordinate system also defines the differential operators $\left\{\frac{\partial}{\partial x_i}\right\}$. In differential geometry, the operator $\frac{\partial}{\partial x_i}$ is identified with the basis vector $v_i$; for reasons I won't get into, the two are considered to be the same object.
This definition is most easily understood if you consider the vectors to be matrices. The coordinate system is simply the list of matrix entries. The basis vectors $e_{ij}$ are the matrices with $1$ as the $ij$ entry and all other entries $0$. So $x$ is a matrix in $M(n,\Bbb R)$, and by the identification of $e_{ij}$ with $\frac{\partial}{\partial x_{ij}}$, the definition of $X_A(x)$ says that it is the matrix whose $ij$ entry is the $ij$ entry of the matrix product $xA$. I.e. $$X_A(x) = xA$$ This may look like mathematical obfuscation, but it is actually very important to understand $X_A$ as a field of differential operators, not just as a product of matrices, because the Lie bracket is defined by composing those operators. For a function $f$, $$[X_A, X_B](f) = X_A(X_B(f)) - X_B(X_A(f))$$ That is, $X_B(f)$ is a new function obtained by applying the differential operator $X_B$ to $f$; then $X_A(X_B(f))$ is the function obtained by applying $X_A$ to $X_B(f)$, and similarly for $X_B(X_A(f))$. Expanding, we see that
$$X_B(f) = \sum_{i,j} (xB)_{ij}\frac{\partial f}{\partial x_{ij}}= \sum_{i,j,k} x_{ik}B_{kj}\frac{\partial f}{\partial x_{ij}}$$ and $$\begin{align}X_A(X_B(f)) &= \sum_{p,q,r} x_{pr}A_{rq}\frac{\partial}{\partial x_{pq}}\left(\sum_{i,j,k} x_{ik}B_{kj}\frac{\partial f}{\partial x_{ij}}\right)\\ &=\sum_{p,q,r}\sum_{i,j,k} x_{pr}A_{rq}B_{kj}\left(\frac{\partial x_{ik}}{\partial x_{pq}}\frac{\partial f}{\partial x_{ij}}+ x_{ik}\frac{\partial^2 f}{\partial x_{pq}\partial x_{ij}}\right)\\&= \sum_{i,j,k,r} x_{ir}A_{rk}B_{kj}\frac{\partial f}{\partial x_{ij}}+\sum_{p,q,r}\sum_{i,j,k} x_{pr}x_{ik}A_{rq}B_{kj}\frac{\partial^2 f}{\partial x_{pq}\partial x_{ij}}\end{align}$$
Can you take it from there?
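(As a sanity check, the identity can also be verified numerically. Here is a sketch using finite differences; the test function $f$, the step size, and the random seed are arbitrary choices of mine, not part of the exercise.)

```python
import numpy as np

rng = np.random.default_rng(0)
n, h = 3, 1e-4

# An arbitrary smooth test function f : M(n, R) -> R (hypothetical choice).
def f(x):
    return np.sin(x[0, 1]) + (x @ x)[2, 0] ** 2

# (X_A g)(x) is the directional derivative of g at x in the direction xA,
# approximated here by a central difference.
def X(A, g):
    return lambda x: (g(x + h * (x @ A)) - g(x - h * (x @ A))) / (2 * h)

# Random antisymmetric A, B in o(n), and a random base point x.
M1, M2 = rng.standard_normal((n, n)), rng.standard_normal((n, n))
A, B = M1 - M1.T, M2 - M2.T
x = rng.standard_normal((n, n))

lhs = X(A, X(B, f))(x) - X(B, X(A, f))(x)   # [X_A, X_B] applied to f at x
rhs = X(A @ B - B @ A, f)(x)                # X_{[A,B]} applied to f at x
print(abs(lhs - rhs))                       # small, up to discretization error
```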
(3) I don't see that it is required at all: the identity $[X_A, X_B] = X_{[A,B]}$ holds for arbitrary matrices $A, B \in M(n,\Bbb R)$. Where antisymmetry does matter is in making $\mathrm{o}(n)$ a Lie algebra in the first place: if $A^t = -A$ and $B^t = -B$, then $[A,B]^t = B^tA^t - A^tB^t = BA - AB = -[A,B]$, so $\mathrm{o}(n)$ is closed under the commutator. It is also possible that some later problem develops from this one and makes further use of the antisymmetry.
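Here is a quick numerical check that the commutator of two antisymmetric matrices is again antisymmetric (random matrices, arbitrary seed):

```python
import numpy as np

rng = np.random.default_rng(1)
n = 4

# M - M^t is always antisymmetric, so A and B below lie in o(n).
M1, M2 = rng.standard_normal((n, n)), rng.standard_normal((n, n))
A, B = M1 - M1.T, M2 - M2.T

C = A @ B - B @ A               # the commutator [A, B]
print(np.allclose(C.T, -C))    # True: [A, B] lies in o(n) as well
```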