For the sum of two sine functions, we have the formula:
$$A\sin \alpha + B\sin \beta = (A+B)\sin\left(\frac{\alpha+\beta}{2}\right)\cos\left(\frac{\alpha-\beta}{2}\right)+(A-B)\sin\left(\frac{\alpha-\beta}{2}\right)\cos\left(\frac{\alpha+\beta}{2}\right)$$
We can write this as:
$$A\sin \alpha + B\sin \beta = \sum_{k=1}^2\lambda_k(A,B)P_k^{(2)}(\alpha,\beta)$$
where the $\lambda_k(A,B)$ are linear functions of the coefficients $A,B$, and the $P_k^{(2)}(\alpha,\beta)$ are quadratic trigonometric polynomials, namely:
$$\lambda_1(A,B) = A+B, \qquad \lambda_2(A,B) = A-B$$ $$P_1^{(2)} = \sin\left(\frac{\alpha+\beta}{2}\right)\cos\left(\frac{\alpha-\beta}{2}\right), \qquad P_2^{(2)} = \sin\left(\frac{\alpha-\beta}{2}\right)\cos\left(\frac{\alpha+\beta}{2}\right)$$
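The two-term identity above can be spot-checked numerically; this is a quick sketch (the variable names are my own, not part of the question):

```python
# Check: A sin(a) + B sin(b)
#      = (A+B) sin((a+b)/2) cos((a-b)/2) + (A-B) sin((a-b)/2) cos((a+b)/2)
import math
import random

random.seed(0)
for _ in range(1000):
    A, B = random.uniform(-5, 5), random.uniform(-5, 5)
    a, b = random.uniform(-10, 10), random.uniform(-10, 10)
    lhs = A * math.sin(a) + B * math.sin(b)
    rhs = ((A + B) * math.sin((a + b) / 2) * math.cos((a - b) / 2)
           + (A - B) * math.sin((a - b) / 2) * math.cos((a + b) / 2))
    assert abs(lhs - rhs) < 1e-9
```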
For three sine functions, I conjecture that there is a formula of the form:
$$A\sin \alpha + B\sin \beta + C\sin \gamma = \sum_{k}\lambda_k(A,B,C)\,P_k^{(3)}(\alpha,\beta,\gamma)$$
with finitely many terms, where the $\lambda_k(A,B,C)$ are linear functions of $(A,B,C)$, and the $P_k^{(3)}(\alpha,\beta,\gamma)$ are cubic trigonometric polynomials.
My second attempt was to use the formula:
$$\prod_{k=1}^n \cos \theta_k = \frac{1}{2^n}\sum_{e\in S} \cos(\epsilon_1\theta_1+\cdots+\epsilon_n\theta_n), \qquad e = (\epsilon_1,\dots,\epsilon_n) \in S=\{1,-1\}^n,$$ so each $\epsilon_i = \pm 1$. This can be used to handle a sum of three cosines, which can then easily be converted into a sum of three sines.
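The product-to-sum formula can also be verified numerically; here is a sketch for $n=4$ with random angles (the names are my own):

```python
# Check: prod_k cos(theta_k) = 2^{-n} * sum over all sign vectors
# eps in {+1,-1}^n of cos(eps_1 theta_1 + ... + eps_n theta_n)
import itertools
import math
import random

random.seed(1)
n = 4
theta = [random.uniform(-3, 3) for _ in range(n)]
prod = math.prod(math.cos(t) for t in theta)
total = sum(math.cos(sum(e * t for e, t in zip(eps, theta)))
            for eps in itertools.product((1, -1), repeat=n))
assert abs(prod - total / 2**n) < 1e-12
```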
Note: The problem arises in wave mechanics, where it would be interesting to write a sum of an arbitrary number of complex exponentials as a prefactor times a single complex exponential:
$$\sum_{i=1}^n A_i e^{ik_ix} = B(A_1,\dots,A_n;k_1,\dots,k_n;x) e^{i(k_1+\dots+k_n)x/2}$$
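For $n=2$ such a prefactor is explicit, since $A_1 e^{ik_1x}+A_2 e^{ik_2x} = e^{i(k_1+k_2)x/2}\left(A_1 e^{i(k_1-k_2)x/2}+A_2 e^{-i(k_1-k_2)x/2}\right)$. A quick numerical sketch of this two-term case (the factorisation and names are my own, not from the question):

```python
# Check the n = 2 factorisation: pull out e^{i(k1+k2)x/2} and keep
# B = A1 e^{i(k1-k2)x/2} + A2 e^{-i(k1-k2)x/2} as the prefactor.
import cmath

A1, A2 = 2.0, -1.5
k1, k2 = 3.0, 1.0
x = 0.7
lhs = A1 * cmath.exp(1j * k1 * x) + A2 * cmath.exp(1j * k2 * x)
B = A1 * cmath.exp(1j * (k1 - k2) * x / 2) + A2 * cmath.exp(-1j * (k1 - k2) * x / 2)
rhs = B * cmath.exp(1j * (k1 + k2) * x / 2)
assert abs(lhs - rhs) < 1e-12
```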
I think you are on the right lines, and I think you can be more explicit about the shape of the formula. I'd like to give an explicit formula, but can't, so what follows is just a proof that the shape of the formula you conjecture is correct.
I find it slightly easier to work with cosines and then deduce the sine case later. So let us consider $\sum_{i=1}^{n} A_i\cos\alpha_i$.
It is easy to see that we can write $2^{n-1}\alpha_1$ as the sum of the $2^{n-1}$ linear combinations $\sum_i \epsilon_i \alpha_i$, where $\epsilon_1=1$ and $\epsilon_i=\pm 1$ for $i\ge 2$: the $\alpha_1$ terms add up, while for each $i\ge 2$ the $+\alpha_i$ and $-\alpha_i$ contributions cancel in pairs.
We now need to expand $\cos\alpha_1=\cos\left(\sum_{\epsilon}\frac{1}{2^{n-1}}\sum_i \epsilon_i \alpha_i\right)$. Recall that we can represent $\exp(i\theta)$ by the matrix $\begin{bmatrix}\cos\theta &\sin\theta\\-\sin\theta &\cos\theta\end{bmatrix}$, and use the fact that $$\exp(i\alpha_1)= \exp\Big(i\sum_{\epsilon}\frac{1}{2^{n-1}}\sum_r \epsilon_r \alpha_r\Big)= \prod_\epsilon\exp\Big(\frac{i}{2^{n-1}}\sum_r \epsilon_r \alpha_r\Big) $$ to see that $\cos\alpha_1$ is just the $(1,1)$ entry of the matrix $$\prod_\epsilon\begin{bmatrix}\cos\frac{1}{2^{n-1}}\sum_i\epsilon_i\alpha_i & \sin\frac{1}{2^{n-1}}\sum_i\epsilon_i\alpha_i\\-\sin\frac{1}{2^{n-1}}\sum_i\epsilon_i\alpha_i&\cos\frac{1}{2^{n-1}}\sum_i\epsilon_i\alpha_i \end{bmatrix}.$$ This entry is clearly a trigonometric polynomial of degree $2^{n-1}$, and the arguments of its factors are all of the form $\frac{1}{2^{n-1}}$ times a sum of $\pm\alpha_i$.
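The matrix argument can be spot-checked numerically. Below is a sketch for $n=3$: the product of the $2^{n-1}=4$ rotation matrices, one for each sign vector with $\epsilon_1=+1$, should have $(1,1)$ entry $\cos\alpha_1$ (and, as used later for the sine case, $(1,2)$ entry $\sin\alpha_1$). The names here are my own:

```python
# Product of rotation matrices R((eps . alpha) / 2^{n-1}) over all
# sign vectors eps with eps_1 = +1; the angles sum to alpha_1, so the
# product is the rotation by alpha_1.
import itertools
import numpy as np

def rot(t):
    return np.array([[np.cos(t), np.sin(t)],
                     [-np.sin(t), np.cos(t)]])

alpha = np.array([0.9, -1.3, 2.1])
n = len(alpha)
P = np.eye(2)
for rest in itertools.product((1, -1), repeat=n - 1):
    eps = (1,) + rest                      # eps_1 fixed to +1
    P = P @ rot(np.dot(eps, alpha) / 2**(n - 1))
assert abs(P[0, 0] - np.cos(alpha[0])) < 1e-12   # (1,1) entry: cos(alpha_1)
assert abs(P[0, 1] - np.sin(alpha[0])) < 1e-12   # (1,2) entry: sin(alpha_1)
```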
The same is true for each of the other $\alpha_i$, except that we write $2^{n-1}\alpha_i$ not as the sum of the $2^{n-1}$ elements $\sum_j\epsilon_j\alpha_j$, but as a linear combination of them with coefficients $\pm 1$. That does not alter the final shape.
Finally we just take the appropriate linear combinations of these to get the final result.
For the sine result we just need to read off the $(1,2)$ entry of the matrix; clearly all the properties we are looking for still hold. Indeed we could deal with a linear combination of sines and cosines and get a formula of the same sort.
[Comment: I can't produce an explicit formula in the general case. One needs to keep track of the signs $\epsilon_i$, and the sign changes needed to allow us to move from $\alpha_1$ to $\alpha_i$.]