If we let $\theta\in\mathbb{R}^n$ be a vector that contains $n$ arbitrary phases $\theta_i\in[0,2\pi)$ for $i\in[n]$, then we can define a matrix $X\in\mathbb{R}^{n\times n}$ by \begin{align*} X_{ij} = \theta_i - \theta_j. \end{align*} The matrices that I consider are the antisymmetric matrix $A=\sin(X)$ and the symmetric matrix $B=\cos(X)$, where the functions are applied entrywise. Through numerical experiments (randomly sampling the phase vector $\theta$) I find that the nuclear norms of $A$ and $B$ are always $n$, i.e. \begin{align*} \|A\|_* = \|B\|_* = n. \end{align*}
Moreover, performing an SVD on $A$ yields the two largest singular values $\sigma_1 = \sigma_2 = n/2$, with all the others vanishing, $\sigma_3 = \ldots = \sigma_n = 0$. Further, if we look at the Hadamard product $A\circ B$, where \begin{align*} (A\circ B)_{ij} = \sin(\theta_i - \theta_j)\cos(\theta_i - \theta_j) = \sin(2(\theta_i - \theta_j))/2, \end{align*} then \begin{align*} \|A\circ B\|_* = n/2, \end{align*} with $\sigma_1 = \sigma_2 = n/4$ and $\sigma_3 = \ldots = \sigma_n = 0$.
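For concreteness, the experiment can be reproduced with a short script (a numpy sketch; the sampling scheme is just one choice):

```python
import numpy as np

rng = np.random.default_rng(0)
n = 8
theta = rng.uniform(0.0, 2.0 * np.pi, size=n)  # random phase vector
X = theta[:, None] - theta[None, :]            # X_ij = theta_i - theta_j
A, B = np.sin(X), np.cos(X)                    # entrywise sin/cos

sA = np.linalg.svd(A, compute_uv=False)
sB = np.linalg.svd(B, compute_uv=False)
sAB = np.linalg.svd(A * B, compute_uv=False)   # A * B is the Hadamard product

print("nuclear norms:", sA.sum(), sB.sum(), sAB.sum())
print("singular values of A:", sA)             # two equal values, rest ~ 0
```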
Is there any way to see why $A$ and $B$ have these properties?
I will stick to your notation in which $f(X)$ refers to the matrix whose entries are $f(X_{ij})$.
Note that by Euler's formula, we have $$ \sin(X) = \frac 1{2i}[\exp(iX) - \exp(-iX)]. $$ To see that $\exp(iX)$ has rank $1$, note that it can be written as the outer product $$ \exp(iX) = \pmatrix{\exp(i\theta_1) \\ \vdots \\ \exp( i\theta_n)} \pmatrix{\exp(-i\theta_1) & \cdots & \exp( -i\theta_n)}. $$ Verify also that $\exp(iX)$ is Hermitian (and positive semidefinite), as is $\exp(-iX)$.
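This factorization is easy to check numerically (a numpy sketch, using the same entrywise convention):

```python
import numpy as np

rng = np.random.default_rng(1)
n = 5
theta = rng.uniform(0.0, 2.0 * np.pi, size=n)
X = theta[:, None] - theta[None, :]

v = np.exp(1j * theta)                        # the column vector above
E = np.exp(1j * X)                            # entrywise exp(iX)

assert np.allclose(E, np.outer(v, v.conj()))  # rank-one outer product
assert np.allclose(E, E.conj().T)             # Hermitian
assert np.linalg.eigvalsh(E).min() > -1e-9    # positive semidefinite
assert np.linalg.matrix_rank(E) == 1
```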
So far, we can conclude that $\sin(X)$ has rank at most equal to $2$.
Since $\exp(iX)$ is Hermitian positive semidefinite with rank $1$, we can quickly state that $$ \|\exp(iX)\|_* = |\operatorname{tr}(\exp(iX))| = n $$ So, your numerical evidence seems to confirm that equality holds in the triangle inequality: $$ \left\|\frac 1{2i}[\exp(iX) - \exp(-iX)]\right\|_* = \left\|\frac 1{2i}\exp(iX)\right\|_* + \left\|\frac 1{2i}\exp(-iX)\right\|_* $$
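The nuclear-norm identity for the rank-one Hermitian factor can also be verified directly (numpy sketch):

```python
import numpy as np

rng = np.random.default_rng(2)
n = 7
theta = rng.uniform(0.0, 2.0 * np.pi, size=n)
E = np.exp(1j * (theta[:, None] - theta[None, :]))   # exp(iX), entrywise

nuc = np.linalg.svd(E, compute_uv=False).sum()       # nuclear norm of exp(iX)
assert abs(nuc - abs(np.trace(E))) < 1e-9            # ||E||_* = |tr E|
assert abs(nuc - n) < 1e-9                           # ... which equals n
```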
From there, we note that $A = \sin(X)$ is real and antisymmetric, so $A^* = -A$ and $$ 4 A^*A = [\exp(iX) - \exp(-iX)]^2 = \\ n [\exp(iX) + \exp(-iX)] - \exp(iX)\exp(-iX) - \exp(-iX)\exp(iX) =\\ n [\exp(iX) + \exp(-iX)] - 2 \operatorname{Re}[\exp(iX)\exp(-iX)] $$ where the squares and products are matrix products, and we used $\exp(\pm iX)^2 = n\exp(\pm iX)$, which follows from the rank-one factorization above. Our goal is to compute $\|A\|_* = \operatorname{tr}(\sqrt{A^*A})$.
Potentially useful observations:
We note that $$ \exp(iX)\exp(-iX) = \pmatrix{\exp(i\theta_1) \\ \vdots \\ \exp( i\theta_n)}\pmatrix{\exp(i\theta_1) & \cdots & \exp( i\theta_n)} \sum_{k=1}^n \exp(-2i\theta_k) $$ And $\operatorname{tr}[\exp(iX)\exp(-iX)] = \left| \sum_{k=1}^n \exp(2i\theta_k) \right|^2$. This product is complex-symmetric but not Hermitian.
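Both of these identities check out numerically (numpy sketch):

```python
import numpy as np

rng = np.random.default_rng(4)
n = 5
theta = rng.uniform(0.0, 2.0 * np.pi, size=n)
v = np.exp(1j * theta)
E = np.outer(v, v.conj())                          # exp(iX)
Ebar = E.conj()                                    # exp(-iX)

s = np.exp(-2j * theta).sum()                      # sum_k exp(-2i theta_k)
assert np.allclose(E @ Ebar, np.outer(v, v) * s)   # = v v^T times that sum
assert abs(np.trace(E @ Ebar)
           - abs(np.exp(2j * theta).sum()) ** 2) < 1e-9
```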
The matrices $\exp(iX),\exp(-iX)$ will commute if and only if $\exp(iX)\exp(-iX)$ is purely real (i.e. has imaginary part 0).
These matrices will commute if and only if either $\sum_{k=1}^n \exp(2i\theta_k) = 0$ or all of the $\exp(2i\theta_k)$ coincide (i.e. the $\theta_k$ agree modulo $\pi$); neither condition holds for a generic choice of phases.
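This commutation condition can be probed numerically (numpy sketch; the equally spaced phases below are one concrete choice with $\sum_k \exp(2i\theta_k) = 0$):

```python
import numpy as np

n = 4
theta_zero_sum = np.pi * np.arange(n) / n       # sum_k exp(2i theta_k) = 0
theta_generic = np.arange(n, dtype=float)       # 0,1,2,3 rad: nonzero sum

def commutator_norm(theta):
    E = np.exp(1j * (theta[:, None] - theta[None, :]))   # exp(iX)
    return np.linalg.norm(E @ E.conj() - E.conj() @ E)

assert commutator_norm(theta_zero_sum) < 1e-9   # commuting
assert commutator_norm(theta_generic) > 1e-6    # generically not commuting
```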