I have a signal sampled at $n$ points covering one period, and I determine the amplitude of the fundamental by correlating it with sampled cosine and sine waves:
$$C:=\sum_{k=0}^{n-1}x_k\cos\frac{2\pi k}{n},$$
$$S:=\sum_{k=0}^{n-1}x_k\sin\frac{2\pi k}{n}.$$
I need to know the maximum value of the modulus $\sqrt{C^2+S^2}$ when each $x_k$ is bounded in the range $[-M, M]$.
I can easily find the maximum of $C$ or $S$ separately, by setting $x_k=M$ where $\cos$ (resp. $\sin$) is positive and $x_k=-M$ otherwise. But I need to solve this for the sum of squares $C^2+S^2$. (An upper bound is $2M^2$, obviously, but it is not tight.)
Any hint?
Update:
As shown by @Adrian,
$$C^2+S^2=\sum_{k=0}^{n-1}\sum_{j=0}^{n-1}x_kx_j\left(\cos\frac{2\pi k}{n}\cos\frac{2\pi j}{n}+\sin\frac{2\pi k}{n}\sin\frac{2\pi j}{n}\right)=\sum_{k=0}^{n-1}\sum_{j=0}^{n-1}x_kx_j\cos\frac{2\pi(k-j)}{n}.$$
This is a quadratic expression, which we have to maximize under the constraints $|x_k|\le M$.
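The identity above is easy to confirm numerically. The following sketch (an illustration of mine, with an arbitrary random $x$) writes the double sum as the quadratic form $\mathbf{x}^T\mathbf{A}\mathbf{x}$ with $A_{ij}=\cos\frac{2\pi(i-j)}{n}$ and checks it against $C^2+S^2$ computed directly:

```python
import numpy as np

# Check C^2 + S^2 == x^T A x with A_ij = cos(2*pi*(i-j)/n),
# for an arbitrary x (random values, illustration only).
n = 7
rng = np.random.default_rng(0)
x = rng.uniform(-1, 1, n)

k = np.arange(n)
C = x @ np.cos(2 * np.pi * k / n)
S = x @ np.sin(2 * np.pi * k / n)

A = np.cos(2 * np.pi * np.subtract.outer(k, k) / n)
print(C**2 + S**2, x @ A @ x)   # the two values agree
```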
The key is this trig identity:
$$S=\sum_k x_k \sin(x+\theta_k)=\sqrt{\sum_{i,j}x_i x_j \cos(\theta_i-\theta_j)}\,\sin\left(x+\arctan\left(\frac{\sum_k x_k\sin(\theta_k)}{\sum_k x_k\cos(\theta_k)}\right)\right),$$
which you can produce by combining the trig identities here. Similarly, you can show that
$$C=\sum_k x_k \cos(x+\theta_k)=\sqrt{\sum_{i,j}x_i x_j \cos(\theta_i-\theta_j)}\,\cos\left(x+\arctan\left(\frac{\sum_k x_k\sin(\theta_k)}{\sum_k x_k\cos(\theta_k)}\right)\right).$$

Applying this to your problem with $x=0$ and $\theta_k=\frac{2\pi k}{n}$, we find
\begin{align*} S^2&=\sum_{i,j}x_i x_j \cos\left(\frac{2\pi(i-j)}{n}\right)\sin^2(\alpha), \\ C^2&=\sum_{i,j}x_i x_j \cos\left(\frac{2\pi(i-j)}{n}\right)\cos^2(\alpha), \end{align*}
where
$$\alpha=\arctan\left(\frac{\sum_k x_k \sin\left(\frac{2\pi k}{n}\right)}{\sum_k x_k \cos\left(\frac{2\pi k}{n}\right)}\right).$$
It follows that
$$C^2+S^2=\sum_{i,j}x_i x_j \cos\left(\frac{2\pi(i-j)}{n}\right),$$
a nearly miraculous simplification, in my opinion.

This is a quadratic form that we can write as $\mathbf{x}^T\mathbf{A}\mathbf{x},$ where
\begin{align*} \mathbf{x}&=\left[\begin{matrix}x_0 \\ x_1 \\ \vdots \\ x_{n-1}\end{matrix}\right],\\ A_{ij}&=\cos\left(\frac{2\pi(i-j)}{n}\right). \end{align*}
Then this question comes into play, and the problem becomes one of eigenvalues and eigenvectors. Because $\mathbf{A}$ is real and symmetric, it has an eigen-decomposition $\mathbf{SDS}^T,$ where $\mathbf{S}$ is orthogonal and $\mathbf{D}$ is diagonal. The crucial line of the answer there is that
$$\mathbf{x}^T\mathbf{Ax} = \mathbf{x}^T\mathbf{SDS}^T\mathbf{x} =\mathbf{y}^T\mathbf{Dy} = \sum\limits_{i=1}^{n} y_i^2 \lambda_i,$$
with $\mathbf{y}=\mathbf{S}^T\mathbf{x}$. Since $\mathbf{S}$ is orthogonal, $\|\mathbf{x}\|_{\infty}\le M$ gives $\|\mathbf{y}\|_{\infty}\le\|\mathbf{y}\|_2=\|\mathbf{x}\|_2\le\sqrt{n}\,M,$ so that
$$\max\!\left(C^2+S^2\right)\le nM^2\left[\sum_{\lambda>0}\lambda - \sum_{\lambda<0}\lambda\right]=nM^2\sum|\lambda|.$$
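The spectrum of this particular $\mathbf{A}$ is easy to pin down: from $C^2+S^2=(\mathbf{x}\cdot\mathbf{c})^2+(\mathbf{x}\cdot\mathbf{s})^2$ with $c_k=\cos\frac{2\pi k}{n}$, $s_k=\sin\frac{2\pi k}{n}$, we have $\mathbf{A}=\mathbf{c}\mathbf{c}^T+\mathbf{s}\mathbf{s}^T$, so for $n\ge 3$ the only nonzero eigenvalues are $n/2$ (twice) and $\sum|\lambda|=n$. A numerical sketch (my own check, with $n=12$, $M=1$ chosen for illustration):

```python
import numpy as np

# Check the spectrum of A = cos(2*pi*(i-j)/n) numerically: since
# A = c c^T + s s^T, its only nonzero eigenvalues (n >= 3) are n/2, n/2,
# so sum|lambda| = n and the bound above reads n*M^2*sum|lambda| = n^2*M^2.
n, M = 12, 1.0
k = np.arange(n)
A = np.cos(2 * np.pi * np.subtract.outer(k, k) / n)

lam = np.linalg.eigvalsh(A)              # eigenvalues of the real symmetric A
print(np.round(np.sort(lam)[-2:], 6))    # two eigenvalues equal to n/2 = 6
print(round(n * M**2 * np.abs(lam).sum(), 6))   # the bound, here n^2 * M^2
```

So the bound evaluates to $n^2M^2$; since only the top eigenvalue matters, one could even drop the lower copy of $n/2$ and the zero eigenvalues to tighten it further.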