Specific values for scalar product: generalizing orthogonality.


Suppose that I have $k$ scalars $s_{1},\dots,s_{k}\in K$ and $k$ vectors $u_{1},\dots,u_{k}\in K^{r}$, where $K$ is a field (not necessarily algebraically closed) and $k<r$. Under which fairly general circumstances can I find a vector $w\in K^{r}$ such that $w\cdot u_{i}=s_{i}$ for each $i$? Observe that orthogonality would correspond to $s_{i}=0$ for all $i$.

BEST ANSWER

It is sufficient that $u_1,\ldots,u_k$ be linearly independent. You can extend this set of vectors to a basis $u_1,\ldots,u_r$ of $K^r$ and write $w=\sum_{i=1}^r w_iu_i$ with $w_i\in K$. Then you can define a diagonal inner product $\langle \cdot ,\cdot \rangle$ via $\langle u_i ,u_j \rangle=a_i\delta^i_j$, where $\delta^i_j=0$ if $i\neq j$ and $\delta^i_j=1$ if $i=j$. The initial equations $\langle w ,u_i \rangle=s_i$ then become the linear system $a_iw_i=s_i$; choosing arbitrary values $s_{k+1},\ldots,s_r\in K$ for the extra basis vectors, you can solve it for $w_1,\ldots,w_r$ (for instance $w_i=s_i/a_i$ whenever each $a_i\neq 0$).
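Over $K=\mathbb{R}$ with the standard dot product, the argument boils down to solving the underdetermined linear system $Uw=s$, where the rows of $U$ are the linearly independent vectors $u_1,\ldots,u_k$. A minimal numerical sketch using NumPy (the particular vectors and scalars here are made up for illustration):

```python
import numpy as np

# Rows of U are the given vectors u_1, ..., u_k in K^r (here K = R, k = 2, r = 4).
U = np.array([[1.0, 0.0, 2.0, 0.0],
              [0.0, 1.0, 0.0, 3.0]])
s = np.array([5.0, -1.0])

# U has full row rank, so U w = s is consistent; lstsq returns the
# minimum-norm solution, which satisfies the system exactly.
w, *_ = np.linalg.lstsq(U, s, rcond=None)

assert np.allclose(U @ w, s)  # w . u_i = s_i for each i
```

Since $k<r$, the solution is not unique: any vector in the $(r-k)$-dimensional null space of $U$ can be added to $w$.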

If $u_1,\ldots,u_k$ are not linearly independent, the system of equations may be inconsistent. For instance, if $u_1=u_2$ but $s_1\neq s_2$, then no $w$ can satisfy both equations, since $w\cdot u_1=w\cdot u_2$ for every $w$.
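The inconsistent case can be checked numerically as well; a sketch over $\mathbb{R}$ with made-up entries, where $u_1=u_2$ but $s_1\neq s_2$:

```python
import numpy as np

# Duplicate rows with conflicting targets: u_1 = u_2 but s_1 != s_2.
U = np.array([[1.0, 2.0, 0.0],
              [1.0, 2.0, 0.0]])
s = np.array([1.0, 4.0])

# lstsq still returns a best-fit w, but it cannot satisfy both equations:
# w . u_1 == w . u_2 for any w, while s_1 != s_2.
w, *_ = np.linalg.lstsq(U, s, rcond=None)

assert not np.allclose(U @ w, s)  # the system has no exact solution
```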