I have a multidimensional unit vector $\vec n$. I need to find another unit vector $\vec r$ which maximizes the scalar product: \begin{equation} (\vec r, \vec n) = r_1 \cdot n_1 + r_2 \cdot n_2 + \dots + r_k \cdot n_k \end{equation} Obviously, without constraints $\vec r$ would simply equal $\vec n$. However, I have two types of constraints.
First, every component has a (component-specific) lower and upper bound: \begin{equation} l_i \leq r_i \leq u_i \end{equation}
Second, there are several (overlapping) groups of components whose components must sum to 1. For example:
\begin{align} r_1 + r_7 + r_9 + r_{11} = 1 \\ r_3 + r_7 + r_9 + r_{12} = 1 \end{align}
How can I find the vector $\vec r$? This is not homework. I need to solve this problem because I am running a gradient-based constrained optimization, and $\vec n$ is my gradient.
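For what it's worth, the problem as stated (linear objective, box bounds, group-sum equalities, plus the unit-norm condition on $\vec r$) can be handed directly to a general nonlinear solver. Here is a minimal sketch using SciPy's SLSQP method; the dimension, the vector `n`, the bounds `lo`/`hi`, and the group index sets are all made-up placeholder data, not part of the original question:

```python
import numpy as np
from scipy.optimize import minimize

rng = np.random.default_rng(0)
k = 12
n = rng.normal(size=k)
n /= np.linalg.norm(n)           # the given unit vector (the gradient)

lo = np.full(k, -1.0)            # hypothetical lower bounds l_i
hi = np.full(k, 1.0)             # hypothetical upper bounds u_i

# Overlapping groups whose components must sum to 1
# (0-based indices matching the example r_1+r_7+r_9+r_11 = 1, etc.)
groups = [[0, 6, 8, 10], [2, 6, 8, 11]]

cons = [{"type": "eq", "fun": lambda r, g=g: r[g].sum() - 1.0}
        for g in groups]
# unit-norm constraint on r itself
cons.append({"type": "eq", "fun": lambda r: r @ r - 1.0})

res = minimize(lambda r: -(r @ n),             # maximize (r, n)
               x0=np.full(k, 1 / np.sqrt(k)),  # unit-norm starting point
               jac=lambda r: -n,
               bounds=list(zip(lo, hi)),
               constraints=cons,
               method="SLSQP")
r = res.x
```

Note that the unit-norm equality makes the feasible set non-convex, so SLSQP only guarantees a local optimum; warm-starting from the projection of $\vec n$ onto the feasible set is a reasonable heuristic.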
Hint
The set of equations $r_{k_1}+r_{k_2} + \cdots = 1$, together with $\vec r \cdot \vec n = \cos \alpha$, represents a set of hyperplanes.
Their intersection lets you express each $r_k$, barring degeneracies, as a linear combination of $\lambda_1, \lambda_2, \dots, \lambda_m$ and $c = \cos \alpha$.
You are then left with a set of constraints on linear combinations of the $\lambda_k$ and $c$, including $-1 \le c \le 1$, over which to maximize $c$.
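The elimination step the hint describes can be sketched numerically: write the group-sum equations as $A\vec r = \vec b$, take any particular solution $\vec r_0$, and parameterize the whole solution set as $\vec r = \vec r_0 + N\vec\lambda$, where the columns of $N$ span the null space of $A$. The data below (dimension 12, the two example groups) is illustrative, not from the question:

```python
import numpy as np

# Group-sum equations A r = b for the two example groups (0-based indices)
k = 12
A = np.zeros((2, k))
A[0, [0, 6, 8, 10]] = 1.0
A[1, [2, 6, 8, 11]] = 1.0
b = np.ones(2)

# Particular solution r0 (minimum-norm) and an orthonormal null-space basis N,
# so that every solution of A r = b has the form r = r0 + N @ lam.
r0, *_ = np.linalg.lstsq(A, b, rcond=None)
_, _, vt = np.linalg.svd(A)
N = vt[2:].T          # A has rank 2, so the remaining right-singular vectors
                      # span its null space

# Any choice of the free parameters lam keeps all group sums equal to 1
lam = np.random.default_rng(1).normal(size=N.shape[1])
r = r0 + N @ lam
print(np.allclose(A @ r, b))   # True
```

The remaining task, per the hint, is to maximize $\vec n \cdot (\vec r_0 + N\vec\lambda)$ over $\vec\lambda$ subject to the box bounds and the unit-norm condition on $\vec r$.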