Suppose that $X=(X_1,\ldots,X_n)'$ is a random vector taking values in $\mathbb C^n$, whose components $X_1,\ldots,X_n$ are weakly stationary, zero-mean, complex-valued random variables. The standard inner product on $\mathbb C^n$ is given by $\langle x,y\rangle=x'\overline y=\sum_{i=1}^nx_i\overline y_i$. Define the discrete Fourier transform of $X$ as the coefficients $\langle X,e_j\rangle$ for $-\lfloor(n-1)/2\rfloor\le j\le\lfloor n/2\rfloor$ and $n\ge1$, where the $e_j$ are the orthonormal vectors given by
$$
e_j=n^{-1/2}\left(\begin{array}{ccc}e^{i\omega_j}&e^{i2\omega_j}& \ldots & e^{in\omega_j}\end{array}\right)'
$$
with $\omega_j=2\pi j/n$.
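(As a quick numerical sanity check — a NumPy sketch with an arbitrary choice $n=8$ and the index range above — the $e_j$ do form an orthonormal set under this inner product:

```python
import numpy as np

def fourier_vector(j, n):
    # e_j = n^{-1/2} (e^{i*omega_j}, e^{i*2*omega_j}, ..., e^{i*n*omega_j})'
    # with omega_j = 2*pi*j/n
    t = np.arange(1, n + 1)
    return np.exp(1j * 2 * np.pi * j * t / n) / np.sqrt(n)

n = 8
# index range -floor((n-1)/2) <= j <= floor(n/2), here -3, ..., 4
js = range(-((n - 1) // 2), n // 2 + 1)
E = np.array([fourier_vector(j, n) for j in js])

# Gram matrix with entries <e_j, e_k> = e_j' conj(e_k); should be the identity
gram = E @ E.conj().T
print(np.allclose(gram, np.eye(n)))  # True
```

)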
Is $\operatorname{Cov}[\langle X,e_j\rangle,\langle X,e_k\rangle]=0$ for $j\ne k$? If not, is it at least true that $\operatorname{Cov}[\langle X,e_j\rangle,\langle X,e_k\rangle]\to0$ as $n\to\infty$ for fixed $j\ne k$, provided the autocovariances are absolutely summable?
Since $\operatorname E\langle X,e_j\rangle=0$, we have $$ \operatorname{Cov}[\langle X,e_j\rangle,\langle X,e_k\rangle]=\operatorname E[\langle X,e_j\rangle\langle e_k,X\rangle]=n^{-1}\sum_{s,t=1}^ne^{-i(t\omega_j-s\omega_k)}\operatorname E[X_t\overline X_s], $$ but this does not seem that useful. We also have, using the conjugate-linearity of $\langle e_k,\cdot\rangle$ and $\overline{\langle e_j,X\rangle}=\langle X,e_j\rangle$, that $$ \operatorname E[\langle X,e_j\rangle\langle e_k,X\rangle] =\operatorname E[\langle e_k,\langle e_j,X\rangle X\rangle] =\langle e_k,Ce_j\rangle, $$ where $C=\operatorname E[X\overline X']$ is the covariance matrix of $X$. The question is therefore equivalent to the following: are $e_k$ and $Ce_j$ orthogonal whenever $e_k$ and $e_j$ are? I suspect that in general the answer is negative, but what are the conditions on the matrix $C$ under which the vectors $Ce_j$ and $e_k$ remain orthogonal?
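For what it's worth, a small numerical experiment (a sketch only, using a hypothetical AR(1)-type autocovariance $\gamma(h)=\phi^{|h|}$ with $\phi=0.5$, which is absolutely summable) suggests that $\langle e_k,Ce_j\rangle$ is nonzero for finite $n$ but shrinks as $n$ grows:

```python
import numpy as np

def fourier_vector(j, n):
    # e_j = n^{-1/2}(e^{i*omega_j}, e^{i*2*omega_j}, ..., e^{i*n*omega_j})'
    t = np.arange(1, n + 1)
    return np.exp(1j * 2 * np.pi * j * t / n) / np.sqrt(n)

def dft_cross_cov(j, k, n, phi=0.5):
    # Hypothetical example: autocovariance gamma(h) = phi^{|h|}
    # (absolutely summable), so C[t, s] = gamma(t - s).
    C = phi ** np.abs(np.subtract.outer(np.arange(n), np.arange(n)))
    ej, ek = fourier_vector(j, n), fourier_vector(k, n)
    # Cov[<X,e_j>, <X,e_k>] = <e_k, C e_j> = e_k' conj(C e_j)
    return ek @ np.conj(C @ ej)

# magnitudes for fixed j = 1, k = 2 and growing n
mags = [abs(dft_cross_cov(1, 2, n)) for n in (16, 64, 256, 1024)]
print(mags)  # nonzero for each finite n, but shrinking as n grows
```

This is of course no proof, only an illustration of the asymptotic question above.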
Any help is much appreciated!