Suppose I have a set $V$ of $k$ linearly independent vectors in $d$-dimensional space $\mathbb{R}^d$, with $d > k$. I want to find a vector $x\in\mathbb{R}^d$ and a constant $c$ such that $\forall v\in V.\ x^Tv = c$. In other words, for each $v\in V$, the cosine of the angle between $x$ and $v$ is inversely proportional to the norm of $v$. In the special case where all vectors in $V$ have the same norm, this is equivalent to finding a vector $x$ that makes the same angle with all of them.
Obviously I could choose $x$ orthogonal to $V$, in which case $c = 0$, but I'm looking for a non-zero solution. Is there anything I can do to characterize the angle(s) or the value $c$ (up to the norm of $x$)? Or could they vary anywhere between $0^\circ$ and $180^\circ$?
Note that if $x^Tv = c$ for all $v \in V$ with $c \ne 0$, then $\left(\frac xc\right)^Tv = 1$ for all $v \in V$. Thus it suffices to solve the equation for $c = 1$: every other solution is just $c$ times a solution for $1$.
Let $W$ be the span of $V$ as embedded in $\Bbb R^d$. The set $W^\perp$ of all vectors perpendicular to $W$ is also a subspace of $\Bbb R^d$ and every $x \in \Bbb R^d$ has a unique decomposition $x = x_W + x_\perp$ into elements of $W$ and $W^\perp$. Since $V \subset W$, for all $v \in V$, $$x^Tv = x_W^Tv + x_\perp^Tv = x_W^Tv + 0$$ Hence any $x$ having the property you are after is the sum of a vector $x_W \in W$ with the identical property, and some vector $x_\perp$ orthogonal to $W$.
So it also suffices to solve the problem within $W \equiv \Bbb R^k$. All solutions in $W$ can then be combined with all orthogonal vectors to get all solutions in $\Bbb R^d$.
Since $V$ consists of $k$ linearly independent vectors in $k$-dimensional space, $V$ is a basis for $\Bbb R^k$. Index the elements of $V = \{V_i\}_{i=1}^k$ and define the constants $v_{ji} = V_i^T V_j$. We can write any vector $$x = \sum_{i=1}^k x_iV_i$$ for some $x_i \in \Bbb R$. Then for all $j$, $$1 = x^TV_j = \sum_{i=1}^k x_iV_i^TV_j = \sum_{i=1}^k v_{ji}x_i$$
The equations $$\sum_{i=1}^k v_{ji}x_i = 1$$ for all $j$ form a system of $k$ linear equations in the $k$ unknowns $x_i$. Because the $V_i$ are linearly independent, the Gram matrix $(v_{ji})$ is invertible, so the system has a unique solution, which you can find by the standard techniques - Gaussian elimination in particular.
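A minimal numpy sketch of this step, assuming the $V_i$ are stored as the columns of a matrix `B` (a name introduced here for illustration); solving the Gram system gives the coefficients $x_i$, and $x = \sum_i x_i V_i$ then satisfies $x^T V_j = 1$ for every $j$:

```python
import numpy as np

rng = np.random.default_rng(0)
d, k = 5, 3
# k linearly independent vectors in R^d, as columns of B
# (random Gaussian columns are independent with probability 1)
B = rng.standard_normal((d, k))

# Gram matrix G[j, i] = V_i^T V_j; invertible because the V_i are independent
G = B.T @ B
coeffs = np.linalg.solve(G, np.ones(k))  # solve sum_i v_ji x_i = 1 for all j

x = B @ coeffs          # x = sum_i x_i V_i lies in the span of V
print(np.allclose(B.T @ x, 1.0))  # x^T V_j = 1 for every j
```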
Once you have found the solution $x \in \Bbb R^k$ for $c=1$, all solutions in $\Bbb R^k$ are $cx$ for arbitrary $c$. And all solutions in $\Bbb R^d$ are $cx + y$, where $y$ is any vector in $W^\perp$, the orthogonal complement of the span of $V$.
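To illustrate the general solution $cx + y$, here is a hedged numpy sketch (names `B`, `z`, `y` are introduced for illustration): a vector in $W^\perp$ is obtained by subtracting from an arbitrary $z$ its orthogonal projection onto the span of $V$, and adding it to $cx$ leaves every inner product $x^Tv$ unchanged:

```python
import numpy as np

rng = np.random.default_rng(1)
d, k = 5, 3
B = rng.standard_normal((d, k))                    # columns are the V_i
x = B @ np.linalg.solve(B.T @ B, np.ones(k))       # particular solution, c = 1

c = 2.5
z = rng.standard_normal(d)
# project z onto W^perp: subtract its projection onto span(V)
y = z - B @ np.linalg.solve(B.T @ B, B.T @ z)

general = c * x + y
print(np.allclose(B.T @ general, c))  # (cx + y)^T V_j = c for every j
```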