I tried proving that given a real square matrix $\mathbf{A}$ such that $\mathbf{A}^\intercal = -\mathbf{A}$, $\mathbf{A}$ is not invertible.
I know that because the matrix is real and its transpose equals minus itself, all the entries on the diagonal must be zero. I suspect one could use the fact that a square matrix $\mathbf{A}$ is invertible iff its row space is $n$-dimensional. Also, for all $i, j \in \{1,2,\dots,n\}$ it must be the case that $a_{ij} = -a_{ji}$. So one could probably give a computational proof by writing one row as a linear combination of the others, concluding that not all $n$ rows are linearly independent. Therefore the row space of $\mathbf{A}$ is not $n$-dimensional, and hence $\mathbf{A}$ is not invertible.
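For concreteness (the entries $a, b, c$ below are just placeholder names I chose), in the $3\times 3$ case the conditions above force the matrix to have the form:

```latex
% Generic 3x3 real skew-symmetric matrix:
% zero diagonal, and a_{ij} = -a_{ji} off the diagonal.
A = \begin{pmatrix}
      0 &  a &  b \\
     -a &  0 &  c \\
     -b & -c &  0
    \end{pmatrix}
```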
But I don't really like that approach, because it doesn't give me any conceptual understanding of why the statement is true. Is there another way of proving this? A proof, or even better some good hints, would be very welcome!
Let $A\in M_n(\Bbb R)$; now $$ A^{T}=-A\Longrightarrow\;\;\det(A^T)=\det(-A) $$ but the following two relations \begin{align*} &\bullet\;\;\det(A^T)=\det A\\ &\bullet\;\;\det(-A)=(-1)^{n}\det A \end{align*} are always true. (The first follows from this: once you write a generic square matrix $A=(a_{i,j})_{i,j=1}^n$, its transpose is $A^T=(a_{j,i})_{i,j=1}^n$, with the indices swapped; the result then follows by applying the definition of the determinant via Laplace expansion;
the second comes from the fact that the determinant $\det:M_n(\Bbb R)\to\Bbb R$ is $n$-linear in the rows, so pulling the factor $-1$ out of each of the $n$ rows of $-A$ yields $(-1)^n$.) We get immediately $$ \det A=(-1)^n\det A $$ which is trivially satisfied for any $A$ if $n$ is even; if instead $n$ is odd, the relation reads $\det A = -\det A$, which holds iff $\det A=0$, i.e. iff $A$ is not invertible.
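To make the parity dependence concrete, here is a sketch of both cases; the specific matrices are my own illustrative choices, not the only ones possible:

```latex
% Odd case, n = 3: the argument above forces det A = 0.
% For instance:
A = \begin{pmatrix}
      0 &  1 &  2 \\
     -1 &  0 &  3 \\
     -2 & -3 &  0
    \end{pmatrix},
\qquad \det A = 0.

% Even case, n = 2: no contradiction arises, and indeed a
% skew-symmetric matrix can be invertible:
J = \begin{pmatrix}
      0 & 1 \\
     -1 & 0
    \end{pmatrix},
\qquad \det J = 1 \neq 0.
```

So the claim in the question is really a statement about odd-dimensional skew-symmetric matrices; in even dimension it can fail.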