I'm trying to solve a simple linear algebra problem. Assume I have the vector space $\mathbb{R}^N$, and in that space I know two vectors $u$ and $v$. I want to find a vector $w$ that maximizes the inner product with $u$ while minimizing the inner product with $v$ (ideally with a null product: $\int v(r)\, w(r)\, dr = 0$).
As far as I understand, the vector $w$ lies in the hyperplane $D$ orthogonal to $v$, and is just the orthogonal projection $P$ of $u$ onto it. We can require $w$ to have unit norm so that the solution is unique. Is there any way to solve this problem for a generic pair of vectors $u$ and $v$? I suspect this is a well-known problem that was solved long ago, but I didn't manage to find its name. Does anyone know of documentation that formulates an algorithm to solve it?
Apparently I don’t have enough reputation to comment, but you may need to give a little more information. For example, if we can find a vector orthogonal to $v$, call it $w$, that is not orthogonal to $u$, we can scale it to make the inner product of $w$ and $u$ as large as you want while the inner product of $w$ and $v$ stays 0. So maybe you need more restrictions?
Also, by a generic set of vectors $u$ and $v$, do you mean multiple vectors $v_i$, and likewise $u_i$? If so, finding a vector orthogonal to all the $v_i$ is just solving a system of linear equations, right? Just take the dot product of an arbitrary $w$ with each $v_i$ and set it to zero. Don’t know if this helps, and sorry I can’t just comment this.
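To make the "system of linear equations" idea concrete, here is a small numpy sketch (the matrix sizes and random vectors are just hypothetical examples): stacking the $v_i$ as rows of a matrix $V$, any vector in the null space of $V$ is orthogonal to every $v_i$, and the SVD hands us an orthonormal basis for that null space.

```python
import numpy as np

# Hypothetical example: find vectors orthogonal to two constraint
# vectors v_1, v_2 in R^5.
rng = np.random.default_rng(0)
V = rng.standard_normal((2, 5))  # rows are v_1, v_2

# SVD: the right-singular vectors belonging to zero singular values
# span the null space of V.
_, _, Vt = np.linalg.svd(V)
null_basis = Vt[2:]  # last 3 rows span the 3-dim null space

# Any combination of these rows has zero dot product with every v_i:
w = null_basis[0]
print(np.allclose(V @ w, 0))  # True
```

Equivalently, `scipy.linalg.null_space(V)` returns the same basis directly if SciPy is available.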
Edit in response to comment:
If you are restricted to unit vectors, it’ll help to find an orthonormal basis for the subspace orthogonal to $v$. Something like Gram-Schmidt should do the trick: start with $v$ and any basis of the space. One vector will come out as zero (it was linearly dependent); throw it out, and the other $n-1$ orthonormal vectors span the hyperplane you need. Once you have them, you can project $u$ onto this subspace easily, and (after normalizing) I think that should be the answer you want.
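For a single $v$, the projection collapses to one Gram-Schmidt step, so no full basis is needed. A minimal numpy sketch (the vectors $u$, $v$ are made-up examples) of the closed form $w = u - \frac{\langle u, v\rangle}{\langle v, v\rangle} v$, normalized:

```python
import numpy as np

# Hypothetical u, v in R^4 just to illustrate.
u = np.array([1.0, 2.0, 0.0, -1.0])
v = np.array([0.0, 1.0, 1.0, 0.0])

# One Gram-Schmidt step: remove from u its component along v,
# leaving the projection of u onto the hyperplane orthogonal to v.
w = u - (u @ v) / (v @ v) * v

# Normalize so the solution is the unique unit vector.
w /= np.linalg.norm(w)

print(np.isclose(v @ w, 0.0))  # True: w is orthogonal to v
```

Among unit vectors with $\langle v, w\rangle = 0$, this $w$ maximizes $\langle u, w\rangle$, which is what the question asked for (it fails only in the degenerate case where $u$ is parallel to $v$, so the projection is zero).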