I'm trying to prove that the set $$S=\{A \in \mathscr{L}(\mathbb{R}^N,\mathbb{R}^N) \mid A^*=A\},$$ i.e. the symmetric linear maps on $\mathbb{R}^N$, is a vector subspace of $\mathscr{L}(\mathbb{R}^N,\mathbb{R}^N)$ of dimension $d=\frac{N(N+1)}{2}$. I am aware of the intuition that a symmetric matrix is determined by its upper triangle, and that triangle has $d$ entries. But I was hoping to arrive at that result in a "matrix-free" way, and was wondering if anyone had some insight on how to think in those terms.
So, taking inspiration from the matrix case, I chose as a basis for $\mathscr{L}(\mathbb{R}^N,\mathbb{R}^N)$ the linear maps that send the $i$-th basis vector to the $j$-th basis vector and every other basis vector to zero, that is $$\pi_{ij}: \mathbb{R}^N \longrightarrow \mathbb{R}^N, \\ \pi_{ij}(e_k)=\delta_k^i e_j.$$ But I'm unsure how to proceed. I've worked out the cases $N=2,3$ for matrices, and I thought maybe what I had to do was figure out how to build a symmetric basis from the $\pi_{ij}$, in a way that is systematic for all $N$. But something tells me that probably won't be fruitful. There must be a better way to do this.
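(As a numerical sanity check of this idea — not part of the question itself, and assuming NumPy — symmetrizing the $\pi_{ij}$ as $\pi_{ij}+\pi_{ji}$ for $i\ne j$, together with the $\pi_{ii}$, does produce $N(N+1)/2$ linearly independent symmetric maps:)

```python
import numpy as np

N = 3

def pi(i, j):
    # pi_ij sends e_i to e_j and all other basis vectors to 0;
    # as a matrix (columns = images of basis vectors) it has a single 1 at row j, column i.
    M = np.zeros((N, N))
    M[j, i] = 1.0
    return M

# Symmetrizing: pi_ii is already symmetric; pi_ij + pi_ji is symmetric for i != j.
sym_candidates = [pi(i, i) for i in range(N)] + \
                 [pi(i, j) + pi(j, i) for i in range(N) for j in range(i + 1, N)]

# Flatten each map to a vector and check linear independence via the rank.
rank = np.linalg.matrix_rank(np.array([M.flatten() for M in sym_candidates]))
```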
To prove $S$ is a subspace, you need to show that it contains the zero map and is closed under addition and scalar multiplication.
Now I'm not sure of your matrix-free definition of "symmetric".
I personally like saying that a transformation $Z$ is symmetric if for all $x, y \in \mathbb R^n$, we have $$ Z(x) \cdot y = x \cdot Z(y) $$ where $\cdot$ is the inner product on your space (in this case, the standard "dot product" on $\mathbb R^n$).
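(A quick numerical illustration of this characterization, assuming NumPy: a matrix of the form $A + A^\top$ satisfies $Z(x)\cdot y = x\cdot Z(y)$ for any $x, y$, while a non-symmetric map generally fails it.)

```python
import numpy as np

rng = np.random.default_rng(0)
n = 4

A = rng.standard_normal((n, n))
Z = A + A.T                          # symmetric by construction
x, y = rng.standard_normal(n), rng.standard_normal(n)

# The inner-product characterization: Z(x)·y == x·Z(y).
lhs = (Z @ x) @ y
rhs = x @ (Z @ y)

# An explicit non-symmetric counterexample: B sends e_2 to e_1 and e_1 to 0.
B = np.array([[0.0, 1.0], [0.0, 0.0]])
e1, e2 = np.array([1.0, 0.0]), np.array([0.0, 1.0])
# (B e1)·e2 = 0 but e1·(B e2) = 1, so B fails the test.
```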
So suppose that $T$ and $U$ are in $S$. Then for any $x,y$, we have $$ T(x) \cdot y = x \cdot T(y)\\ U(x) \cdot y = x \cdot U(y) $$ and we'd like to show that $$ (T+U)x \cdot y = x \cdot (T+U)y $$ Fortunately, by the definition of addition of functions, we have \begin{align} (T+U)x \cdot y &= (T(x) + U(x)) \cdot y \\ &= T(x) \cdot y + U(x)\cdot y & \text{ by bilinearity of the inner product} \\ &= x \cdot T(y) + x\cdot U(y) & \text{ by our two hypotheses} \\ &= x \cdot (T(y) + U(y)) & \text{ bilinearity again} \\ &= x \cdot (T + U)(y) & \text{ def'n of addition of functions.} \\ \end{align}
The proof for closure under scalar multiplication is similar, and we're done.
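(Both closure properties are easy to spot-check numerically — a sketch assuming NumPy, with randomly generated symmetric maps:)

```python
import numpy as np

rng = np.random.default_rng(1)
n = 5

def random_symmetric(n):
    # A + A^T is always symmetric.
    A = rng.standard_normal((n, n))
    return A + A.T

def is_symmetric(M):
    # For matrices, M == M^T is equivalent to the inner-product definition above.
    return np.allclose(M, M.T)

T, U = random_symmetric(n), random_symmetric(n)
c = 2.5
```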
What about dimensions? For that, the easiest proof seems to be to select a basis $b_1, \ldots, b_n$ of $\mathbb R^n$ -- the standard basis works fine. Let $T_{ij}$, where $i \ne j$, send all $b_k$ to zero except for $k = i, j$, and let it swap $b_i$ and $b_j$. Let $S_{i}$ send all $b_k$ to zero except $b_i$, which is sent to itself. That's a list of $n(n-1)/2 + n$ elements of the space of symmetric transforms. It takes a little work to show that they're linearly independent, but don't bother doing that.
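(In fact the linear-independence claim is cheap to check numerically, if not to prove — a sketch assuming NumPy, building the $T_{ij}$ and $S_i$ as matrices in the standard basis:)

```python
import numpy as np

n = 4

def T(i, j):
    # Swap b_i and b_j, send every other basis vector to zero.
    M = np.zeros((n, n))
    M[j, i] = 1.0  # b_i -> b_j
    M[i, j] = 1.0  # b_j -> b_i
    return M

def S(i):
    # Fix b_i, send every other basis vector to zero.
    M = np.zeros((n, n))
    M[i, i] = 1.0
    return M

sym_list = [T(i, j) for i in range(n) for j in range(i + 1, n)] + \
           [S(i) for i in range(n)]

# n(n-1)/2 + n = n(n+1)/2 symmetric maps; independence = full row rank after flattening.
rank = np.linalg.matrix_rank(np.array([M.flatten() for M in sym_list]))
```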
Instead, consider the transformations $R_{ij}$ (for $i \ne j$) that send all $b_k$ to zero, except that $R_{ij} b_i = b_j$ and $R_{ij} b_j = -b_i$. That gives you another $n(n-1)/2$ elements of the space of (all) linear maps. That space has dimension $n^2$, with a basis $Q_{ij}$ that takes $b_k$ to zero for all $k \ne i$, and takes $b_i$ to $b_j$. (In this case, $i$ and $j$ may be equal.)
Now observe that for $i \ne j$, we have $$ Q_{ij} = \frac{1}{2} ( R_{ij} + T_{ij}), $$ and for $i = j$, we have $$ Q_{ii} = S_{i}. $$ Hence the span of the $Q$s lies in the span of the $R$s, $T$s, and $S$s, so the collection of all $R$s, $T$s, and $S$s is a spanning set for $L(\mathbb R^n, \mathbb R^n)$. But it's also a set with $n^2$ elements, so it's a basis. The $T$s and $S$s together span the set of symmetric transformations (this needs a little proof: any subset of a basis is linearly independent, and when a symmetric map is expanded in this basis, the coefficients of the skew-symmetric $R$s must vanish), so its dimension is $n(n-1)/2 + n = n(n+1)/2$.
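(The whole counting argument can be spot-checked numerically — a sketch assuming NumPy, taking the skew-symmetric sign convention $R_{ij} b_i = b_j$, $R_{ij} b_j = -b_i$, which is what makes the identity $Q_{ij} = \frac{1}{2}(R_{ij} + T_{ij})$ come out right:)

```python
import numpy as np

n = 4

def Q(i, j):
    # Q_ij sends b_i to b_j and all other basis vectors to zero.
    M = np.zeros((n, n))
    M[j, i] = 1.0
    return M

def T(i, j):
    return Q(i, j) + Q(j, i)   # symmetric: swaps b_i and b_j

def S(i):
    return Q(i, i)             # symmetric: fixes b_i

def R(i, j):
    return Q(i, j) - Q(j, i)   # skew-symmetric: b_i -> b_j, b_j -> -b_i

pairs = [(i, j) for i in range(n) for j in range(i + 1, n)]

# The two identities from the argument above.
ok_Q = all(np.allclose(Q(i, j), 0.5 * (R(i, j) + T(i, j))) for i, j in pairs)
ok_S = all(np.allclose(Q(i, i), S(i)) for i in range(n))

# All R's, T's, and S's together: n^2 maps spanning L(R^n, R^n).
full = [R(i, j) for i, j in pairs] + [T(i, j) for i, j in pairs] + [S(i) for i in range(n)]
full_rank = np.linalg.matrix_rank(np.array([M.flatten() for M in full]))
```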