Prove that there is a unique $v \in V$ for which $\langle v,v_k\rangle=c_k$ for all $k$


Let $V$ be a finite dimensional vector space over $\mathbb{R}$, with fixed basis $B=\{v_1,\dots,v_n\}$. Suppose $\langle u,v\rangle$ is an inner product on $V$. If $c_1,\dots,c_n$ are arbitrary scalars, prove that there is a unique $v\in V$ for which $\langle v,v_k \rangle = c_k$ for all $k$.

I'm not even quite sure if I understand the question properly. How can I get started on this question?


There are 4 answers below.

Answer 1:

Hint: for a fixed $v$, the map $u \mapsto \langle v, u \rangle$ is a linear functional on $V$, and a linear functional is uniquely determined by its values on a basis...

Answer 2:

First, there exists a unique linear functional $\phi \in V^*$ such that $\phi(v_k) = c_k$ for all $1 \le k \le n$, since a linear map is determined by its values on a basis.

Next, by the Riesz representation theorem, there exists a unique $v \in V$ such that

$$\phi(u) = \langle v, u \rangle \quad \text{for all } u \in V.$$

This $v$ satisfies $\langle v, v_k \rangle = \phi(v_k) = c_k$ for all $k$.

Answer 3:

Consider the following linear system:

$$ \begin{cases} \langle v,v_1\rangle=c_1\\ \quad\vdots\\ \langle v, v_n\rangle=c_n \end{cases} $$

Writing $v = \sum_{l=1}^n x_l v_l$ in the basis $B$, the matrix of this system in the unknowns $x_l$ is the Gram matrix $(\langle v_l, v_k \rangle)$, which is invertible because the $v_k$ form a basis and the inner product is positive definite. Hence there exists a unique solution $v$ for every $c_1, \dots, c_n$.
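The argument above can be checked numerically. The following is a minimal sketch, assuming for concreteness that $V = \mathbb{R}^3$ with the standard dot product and a hypothetical non-orthogonal basis; the matrix `B` and scalars `c` are illustrative choices, not from the original problem.

```python
import numpy as np

# Hypothetical example: V = R^3 with the standard dot product,
# and a non-orthogonal basis v_1, v_2, v_3 stored as the rows of B.
B = np.array([[1.0, 0.0, 0.0],
              [1.0, 1.0, 0.0],
              [1.0, 1.0, 1.0]])
c = np.array([2.0, 3.0, 5.0])  # arbitrary target scalars c_k

# Gram matrix G[k, l] = <v_k, v_l>; invertible since the v_k form a basis.
G = B @ B.T

# The condition <v, v_k> = sum_l x_l <v_l, v_k> = c_k is the linear
# system G x = c (G is symmetric) in the coordinates x of v.
x = np.linalg.solve(G, c)
v = x @ B  # v = sum_l x_l v_l

# Check: <v, v_k> = c_k for every k.
print(np.allclose(B @ v, c))  # True
```

Because `G` is invertible, `np.linalg.solve` returns the one and only coordinate vector `x`, matching the uniqueness claim.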

Answer 4:

Let $v = \sum \limits _{k=1} ^n x_k v_k$, with $x_k \in \Bbb R$ being the unknown components of $v$ in the given basis.

Introducing the notation $a_{kl} = \langle v_k, v_l \rangle$, you are told that

$$c_k = \langle v, v_k \rangle = \langle \sum \limits _{l=1} ^n x_l v_l, v_k \rangle = \sum \limits _{l=1} ^n x_l \langle v_l, v_k \rangle = \sum \limits _{l=1} ^n x_l a_{lk} .$$

Writing the above in a more visual form, what you have obtained is

$$\left\{ \begin{eqnarray} x_1 a_{11} + x_2 a_{21} + \dots + x_n a_{n1} = c_1 \\ x_1 a_{12} + x_2 a_{22} + \dots + x_n a_{n2} = c_2 \\ \vdots \\ x_1 a_{1n} + x_2 a_{2n} + \dots + x_n a_{nn} = c_n . \end{eqnarray} \right.$$

According to the general theory of systems of linear equations, this system of equations has a unique solution if and only if $\det (a_{ij}) \ne 0$.

Assume, for contradiction, that $\det (a_{ij}) = 0$. Then the columns of the matrix are linearly dependent: there exist numbers $\alpha_1, \dots, \alpha_n \in \Bbb R$, not all of them $0$, such that $\alpha_1 a_{i1} + \dots + \alpha_n a_{in} = 0$ for all $i$.

Recalling the definition of $a_{ij}$, this means that $\alpha_1 \langle v_i, v_1 \rangle + \dots + \alpha_n \langle v_i, v_n \rangle = 0$ for all $i$ or, by linearity in the second argument, that $\langle v_i, \sum \limits _{j=1} ^n \alpha_j v_j \rangle = 0$ for all $i$. Since $\{v_i\}_{i=1, \dots, n}$ is a basis, the previous formula implies that $\langle w, \sum \limits _{j=1} ^n \alpha_j v_j \rangle = 0$ for all $w \in V$. But an inner product is nondegenerate (by definition, $\langle u, u \rangle = 0$ forces $u = 0$), so necessarily $\sum \limits _{j=1} ^n \alpha_j v_j = 0$, which in turn implies that $\alpha_j = 0$ for all $j$ (the $v_j$ are linearly independent), contradicting the assumption that not all of the $\alpha$-s are $0$.

Therefore $\det (a_{ij}) \ne 0$, so the solution $(x_1, \dots, x_n)$ exists and is unique, and so is the vector $v$ constructed from it.
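The two cases in this argument can be illustrated numerically: for a genuine basis the Gram determinant is strictly positive (indeed $\det G = \det(B)^2$ when the inner product is the standard one and $B$ holds the basis vectors as rows), while for linearly dependent vectors it vanishes. This is a sketch with a randomly chosen basis, not part of the original proof.

```python
import numpy as np

rng = np.random.default_rng(0)

# Random 4x4 matrix: its rows are a basis of R^4 whenever det(B) != 0,
# which we check explicitly rather than assume.
B = rng.standard_normal((4, 4))
assert abs(np.linalg.det(B)) > 1e-12

# Gram matrix a_{ij} = <v_i, v_j> for the standard dot product.
G = B @ B.T
print(np.linalg.det(G) > 0)  # True: det G = det(B)^2 > 0

# If the vectors are linearly dependent, the Gram determinant vanishes.
D = B.copy()
D[3] = D[0] + D[1]           # make row 3 a combination of rows 0 and 1
Gd = D @ D.T
print(abs(np.linalg.det(Gd)) < 1e-8)  # True (zero up to roundoff)
```

The dependent case mirrors the contradiction step above: the coefficients $(\alpha_1, \alpha_2, 0, -1)$ annihilate the columns of `Gd`, so its determinant must be zero.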