I am trying to solve the following optimization problem:
$$\max_{x} \alpha^T x \text{ subject to } \sum_{n=1}^{N} x_n^2 \leq 1$$ Using the Cauchy-Schwarz inequality, I was able to obtain $$\max_{x} \langle\alpha, x\rangle \leq ||\alpha||_2 $$ which means $$\alpha^T x^* = \sqrt{\alpha^T \alpha}$$
My question is: how do you find the solution $x^*$ itself? Since $x$ is a vector, we can't just divide by $\alpha$ to solve for the optimal solution. I am still very new to optimization, but I think the general idea is to bound the objective function in terms of the constraint, which I was able to do using Cauchy-Schwarz. I am still unable to determine how to solve for $x$, though. Maybe I'm also a bit rusty on vector algebra. Thanks.
EDIT
I see that equality holds in the Cauchy-Schwarz inequality when $x$ is a scalar multiple of $\alpha$. Using this fact, I came to the following result: $$x^* = \frac{\alpha}{||\alpha||_2} $$
I believe this logic is correct, but could someone correct me if I am wrong?
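As a quick numerical sanity check of this claim (a sketch using NumPy, with an arbitrary randomly drawn $\alpha$): the candidate $x^* = \alpha/\|\alpha\|_2$ should attain the Cauchy-Schwarz bound $\|\alpha\|_2$ exactly, and no other feasible point should do better.

```python
import numpy as np

rng = np.random.default_rng(0)
alpha = rng.normal(size=5)  # arbitrary example vector

# Candidate maximizer from the Cauchy-Schwarz equality condition.
x_star = alpha / np.linalg.norm(alpha)

# x_star is feasible and attains the upper bound ||alpha||_2.
assert np.isclose(np.linalg.norm(x_star), 1.0)
assert np.isclose(alpha @ x_star, np.linalg.norm(alpha))

# No random feasible point (||x||_2 <= 1) exceeds the bound.
for _ in range(1000):
    x = rng.normal(size=5)
    x /= max(1.0, np.linalg.norm(x))  # project into the unit ball
    assert alpha @ x <= alpha @ x_star + 1e-12
```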
This can be solved with Lagrange multipliers, introducing a slack variable $s$ to convert the inequality constraint into an equality:
$$ L(x,\lambda,s) = \alpha'x+\lambda(\|x\|^2-1+s^2) $$
The stationary points obey
$$ \nabla L = 0 \quad\Rightarrow\quad \cases{\alpha+2\lambda x = 0\\ \|x\|^2-1+s^2 = 0 \\ 2\lambda s = 0} $$
Now set $s=0$ (the constraint is active); then
$$ \cases{\alpha+2\lambda x = 0\\ \|x\|^2-1 = 0}\ \ \Rightarrow x^* = \pm \frac{\alpha}{\|\alpha\|} $$
Of the two stationary points, the maximizer is $x^* = +\frac{\alpha}{\|\alpha\|}$, which attains $\alpha^T x^* = \|\alpha\|$; the minus sign gives the minimizer.
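The stationarity conditions above can also be verified numerically (a sketch using NumPy with the example vector $\alpha = (3, 4)$, so $\|\alpha\| = 5$ and $\lambda = -\|\alpha\|/2$ for the $+$ stationary point):

```python
import numpy as np

alpha = np.array([3.0, 4.0])        # example data; ||alpha|| = 5
lam = -np.linalg.norm(alpha) / 2.0  # multiplier for the '+' stationary point

x_star = alpha / np.linalg.norm(alpha)

# Stationarity: alpha + 2*lambda*x = 0
assert np.allclose(alpha + 2 * lam * x_star, 0.0)

# Active constraint: ||x||^2 - 1 = 0 (so s = 0)
assert np.isclose(x_star @ x_star, 1.0)

# Objective value equals sqrt(alpha^T alpha), matching the Cauchy-Schwarz bound.
assert np.isclose(alpha @ x_star, 5.0)
```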