I am currently following a course on nonlinear algebra (topics include varieties, elimination, linear spaces, Grassmannians, etc.). Especially in the exercises we work a lot with skew-symmetric matrices; however, I do not yet understand why they are of such importance.
So my question is: How do skew-symmetric matrices tie in with the topics mentioned above? And where else in mathematics are they of interest, and why?
This is not the area of math you're interested in, but here's an example I might as well write down. In convex optimization we are interested in the canonical form problem $$ \text{minimize} \quad f(x) + g(Ax) $$ where $f$ and $g$ are closed convex proper functions and $A$ is a real $m \times n$ matrix. The optimization variable is $x \in \mathbb R^n$. This canonical form problem is the starting point for the Fenchel-Rockafellar approach to duality.
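As a concrete illustration, here is a minimal numpy sketch of one instance of the canonical form. The specific choice of $f$ and $g$ (a quadratic loss and a scaled $\ell_1$ norm, i.e. a lasso-type problem) is my own example, not something implied by the general setup:

```python
import numpy as np

# A hypothetical instance of the canonical form  minimize f(x) + g(A x):
#   f(x) = (1/2) ||x - b||^2   (closed, convex, proper)
#   g(y) = lam * ||y||_1       (closed, convex, proper)
rng = np.random.default_rng(0)
m, n = 4, 3
A = rng.standard_normal((m, n))   # real m x n matrix
b = rng.standard_normal(n)
lam = 0.1

f = lambda x: 0.5 * np.sum((x - b) ** 2)
g = lambda y: lam * np.sum(np.abs(y))

# Evaluate the canonical-form objective at some point x in R^n.
x = rng.standard_normal(n)
objective = f(x) + g(A @ x)
```

Both terms are nonnegative here, so the objective is bounded below by $0$; the point is only to show how a familiar problem splits into the $f(x) + g(Ax)$ shape.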
The KKT optimality conditions for this optimization problem can be written as $$ \tag{$\spadesuit$} 0 \in \begin{bmatrix} 0 & A^T \\ -A & 0 \end{bmatrix} \begin{bmatrix} x \\ z \end{bmatrix} + \begin{bmatrix} \partial f(x) \\ \partial g^*(z) \end{bmatrix}, $$ where $g^*$ is the convex conjugate of $g$, $\partial f(x)$ is the subdifferential of $f$ at $x$, and $\partial g^*(z)$ is the subdifferential of $g^*$ at $z$. The notation $\begin{bmatrix} \partial f(x) \\ \partial g^*(z) \end{bmatrix}$ denotes the Cartesian product $\partial f(x) \times \partial g^*(z)$.
The condition $(\spadesuit)$ is a great example of a "monotone inclusion problem", a type of problem that generalizes convex optimization problems. The subdifferential $\partial f$ is the motivating example of a "monotone operator", but the operator $$ \begin{bmatrix} x \\ z \end{bmatrix} \mapsto \begin{bmatrix} 0 & A^T \\ -A & 0 \end{bmatrix}\begin{bmatrix} x \\ z \end{bmatrix} $$ is a good example of a monotone operator which is not the subdifferential of a convex function. Its monotonicity comes precisely from skew-symmetry: if $M^T = -M$, then $\langle Mw, w \rangle = 0$ for every $w$, so $\langle Mu - Mv,\, u - v \rangle = 0 \ge 0$.
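The skew-symmetry argument above is easy to check numerically. Here is a small numpy sketch (the dimensions and random data are arbitrary) that builds the block matrix from $(\spadesuit)$, confirms it is skew-symmetric, and verifies that the monotonicity inner product vanishes:

```python
import numpy as np

# Build M = [[0, A^T], [-A, 0]] and check the monotonicity of v -> M v.
# Since M^T = -M, we have <M w, w> = 0 for all w; taking w = u - v gives
# <M u - M v, u - v> = 0 >= 0, which is monotonicity (with equality).
rng = np.random.default_rng(1)
m, n = 3, 2
A = rng.standard_normal((m, n))

M = np.block([[np.zeros((n, n)), A.T],
              [-A,               np.zeros((m, m))]])

assert np.allclose(M.T, -M)            # skew-symmetry

u = rng.standard_normal(m + n)
v = rng.standard_normal(m + n)
inner = (M @ u - M @ v) @ (u - v)      # <M u - M v, u - v>, should be 0
```

This also shows why such an operator cannot be a subdifferential of a convex function unless it is zero: subdifferentials of (differentiable) convex functions correspond to symmetric positive semidefinite Hessians, while $M$ here is skew-symmetric.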