Finding unit vector (with sum of components also zero) with smallest cosine distance with another given vector


I came upon this problem while trying to find a Nash equilibrium of two cost functions, both of which depend adversarially on the same agents.

The following is the problem:

$\alpha^* = \operatorname{arg\,max}_{\alpha} \; \alpha \cdot \beta$

subject to

$\sum_i \alpha_i = 0$

$\sum_i \alpha_i^2 = 1$

Any pointers on how I can solve this?


Best answer:

I suppose that $\beta$ is the given vector. There are (at least) two solutions. The first, easiest one is geometrical and is based on the fact that $\alpha$ is restricted to a hyperplane through the origin with normal vector $n = (1,1,\ldots,1)$; the constraint $\sum_i \alpha_i = 0$ is exactly the scalar product $\alpha \cdot n = 0$.

First, starting at the endpoint of the vector $\beta$, we drop a perpendicular until we reach the hyperplane, i.e. we determine a point $\beta - rn$ such that $\sum_i{(\beta - rn)_i} = 0$. This gives us a value for $r$, namely $r = \sum_i{\beta_i}/d$, where $d$ is the dimension of the vector space.

Now $\beta - n\sum_i{\beta_i}/d$ is almost the desired vector $\alpha$, except that its length is not $1$, so all we have to do is divide this vector by its norm $\left\| \beta - n\sum_i{\beta_i}/d \right\| = \sqrt{\sum_j{\left(\beta_j - \sum_i{\beta_i}/d\right)^2}}$. Finally:
$$\alpha_i = \frac{\beta_i - \sum_j{\beta_j}/d}{\sqrt{\sum_j{\left(\beta_j - \sum_k{\beta_k}/d\right)^2}}}.$$
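A quick numerical sketch of this construction (the function name and example vector are my own, not from the answer): subtracting the mean of $\beta$ from each component implements $\beta - n\sum_i \beta_i / d$, and dividing by the norm gives the unit vector.

```python
import numpy as np

def closest_unit_zero_sum(beta):
    """Maximize alpha . beta over unit vectors alpha with sum(alpha) = 0.

    Construction from the answer: project beta onto the hyperplane
    sum(alpha) = 0 by subtracting its component along n = (1, ..., 1),
    i.e. subtract the mean, then normalize to unit length.
    """
    beta = np.asarray(beta, dtype=float)
    centered = beta - beta.mean()   # beta - n * sum(beta)/d
    norm = np.linalg.norm(centered)
    if norm == 0:
        # beta is a multiple of n: every feasible alpha gives alpha . beta = 0
        raise ValueError("beta is a multiple of (1,...,1); the maximizer is not unique")
    return centered / norm

beta = np.array([3.0, -1.0, 2.0])
alpha = closest_unit_zero_sum(beta)
print(alpha, alpha.sum(), np.linalg.norm(alpha))
```

As a sanity check, any other vector satisfying both constraints (zero sum, unit norm) should give a dot product with $\beta$ no larger than this $\alpha$ does.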