Let $n \ge 3$, and let $C$ be the convex cone generated by the squares of all real $n \times n$ skew-symmetric matrices. Is $C$ closed in $\mathbb{R}^{n^2}$? What is its boundary?
$C$ is strictly contained in the cone of negative semidefinite matrices.
I know that the set of squares of all real $n \times n$ skew-symmetric matrices is closed, but in general the convex cone generated by a closed set of matrices does not have to be closed.
Edit:
Let's focus on the case where $n=3$, and try to describe $C$ more explicitly.
Let $T:\mathbb{R}^3 \to \mathbb{R}^3$ be a skew-symmetric operator on $\mathbb{R}^3$. Then $T$ has the form $T(x) = v \times x$ for some $v \in \mathbb{R}^3$, where $\times$ denotes the cross product.
The vector triple product identity implies that
$$ T^2(x)=v \times (v \times x)=\langle v,x \rangle v-\langle v,v \rangle x. $$
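As a quick numerical sanity check (not part of the argument), here is a NumPy sketch verifying the identity $T^2 = vv^\top - \langle v,v\rangle I$ for a random $v$; the matrix `T` is the usual cross-product matrix of $v$:

```python
import numpy as np

rng = np.random.default_rng(0)
v = rng.standard_normal(3)

# Cross-product matrix: T @ x equals np.cross(v, x)
T = np.array([[0.0, -v[2], v[1]],
              [v[2], 0.0, -v[0]],
              [-v[1], v[0], 0.0]])

# Triple product identity: T^2 = v v^T - <v,v> I
lhs = T @ T
rhs = np.outer(v, v) - np.dot(v, v) * np.eye(3)
assert np.allclose(lhs, rhs)
```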
Thus, $C$ is the set of all operators $\mathbb{R}^3 \to \mathbb{R}^3$ of the form
$$ x \mapsto \sum_i \langle v_i,x \rangle v_i-|v_i|^2 x.$$
I am not sure whether this observation really advances us...
I will prove below that $C$ is identical to $$ L=\left\{S: S\preceq0,\ 2\lambda_\min(S)\ge\operatorname{tr}(S)\right\}. $$ Note that the conditions $S\preceq0$ and $2\lambda_\min(S)\ge\operatorname{tr}(S)$ together imply that no $S\in L$ can be a rank one matrix, and when $S$ has rank two, its two nonzero eigenvalues must be equal. It follows that if $S\in L$ has rank $\le2$, it must be the square of some skew-symmetric matrix.
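The defining conditions of $L$ are easy to check numerically. The following NumPy sketch (an illustration, not a proof) draws a random skew-symmetric $K$ and verifies that $S=K^2$ satisfies $S\preceq0$ and $2\lambda_\min(S)\ge\operatorname{tr}(S)$:

```python
import numpy as np

rng = np.random.default_rng(1)
n = 5
A = rng.standard_normal((n, n))
K = A - A.T          # random skew-symmetric matrix
S = K @ K            # its square: symmetric, candidate element of L

# eigvalsh returns eigenvalues in ascending order for symmetric input
eig = np.linalg.eigvalsh(S)
assert np.all(eig <= 1e-12)                  # S is negative semidefinite
assert 2 * eig[0] >= np.trace(S) - 1e-9      # 2*lambda_min(S) >= tr(S)
```

The inequality holds because the eigenvalues of $K^2$ come in pairs $-\lambda_j^2$ (plus zeros), so $2\lambda_\min(K^2)=-2\max_j\lambda_j^2\ge-2\sum_j\lambda_j^2=\operatorname{tr}(K^2)$.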
If $C$ is indeed equal to $L$, then $C$ is closed in the cone of negative semidefinite matrices, in the set $\mathcal S$ of all symmetric matrices, and in $M_n(\mathbb R)$.
The inclusion $C\subseteq L$ is easy to prove. The square of every (possibly zero) skew-symmetric matrix is clearly a member of $L$. It follows that every sum of squares of skew-symmetric matrices $S=K_1^2+\cdots+K_m^2$ is a member of $L$ too, because $$ 2\lambda_\min(S) =2\lambda_\min\left(\sum_iK_i^2\right) \ge2\sum_i\lambda_\min\left(K_i^2\right) \ge\sum_i\operatorname{tr}\left(K_i^2\right) =\operatorname{tr}(S). $$
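The key step in this chain is the superadditivity of $\lambda_\min$ over sums of symmetric matrices (Weyl's inequality). A small NumPy sketch, again purely illustrative, checks both that step and the resulting membership of a sum of squares in $L$:

```python
import numpy as np

rng = np.random.default_rng(2)
n = 4
# Three random skew-symmetric matrices and the sum of their squares
Ks = [M - M.T for M in rng.standard_normal((3, n, n))]
S = sum(K @ K for K in Ks)

lam_min = lambda X: np.linalg.eigvalsh(X)[0]  # smallest eigenvalue

# Weyl: lambda_min(sum) >= sum of lambda_min's
assert lam_min(S) >= sum(lam_min(K @ K) for K in Ks) - 1e-9
# Each summand satisfies 2*lambda_min(K^2) >= tr(K^2), hence so does S
assert 2 * lam_min(S) >= np.trace(S) - 1e-9
```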
To prove $L\subseteq C$, we prove by induction that $$ L^{(r)}=\left\{S\in L: \operatorname{rank}(S)\le r\right\}\subseteq C $$ for $r=2,3,\ldots$. The base case $r=2$ is true because every $S\in L^{(2)}$ is necessarily the square of some (possibly zero) skew-symmetric matrix. For the induction step, suppose $r\ge3$ and $L^{(r-1)}\subseteq C$. Pick any $S\in L^{(r)}$. By orthogonal diagonalisation, we may assume that $S=\operatorname{diag}(s_1,s_2,\ldots,s_r,0,\ldots,0)$ where $s_1\le s_2\le\cdots\le s_r<0$. The condition $2\lambda_\min(S)\ge\operatorname{tr}(S)$ is then equivalent to $s_1\ge\sum_{i>1}s_i$. There are three cases, and we will prove in each case that $S\in C$: