If $c_{j}$ are any $n$ scalars, show that there is exactly one vector $\alpha$ in $V$ such that $\langle\alpha,\alpha_{j}\rangle = c_{j}$


Let $V$ be a finite-dimensional vector space and let $\mathcal{B} = \{\alpha_{1},\alpha_{2},\ldots,\alpha_{n}\}$ be a basis for $V$. Let $\langle\cdot,\cdot\rangle$ be an inner product on $V$. If $c_{1},c_{2},\ldots,c_{n}$ are any $n$ scalars, show that there is exactly one vector $\alpha$ in $V$ such that $\langle\alpha,\alpha_{j}\rangle = c_{j}$ for $j\in\{1,2,\ldots,n\}$.

MY ATTEMPT

Let $\alpha = a_{1}\alpha_{1} + a_{2}\alpha_{2} + \ldots + a_{n}\alpha_{n}$. Thus we have that \begin{align*} \langle\alpha,\alpha_{1}\rangle &= a_{1}\langle\alpha_{1},\alpha_{1}\rangle + a_{2}\langle\alpha_{2},\alpha_{1}\rangle + \ldots + a_{n}\langle\alpha_{n},\alpha_{1}\rangle = c_{1}\\ \langle\alpha,\alpha_{2}\rangle &= a_{1}\langle\alpha_{1},\alpha_{2}\rangle + a_{2}\langle\alpha_{2},\alpha_{2}\rangle + \ldots + a_{n}\langle\alpha_{n},\alpha_{2}\rangle = c_{2}\\ &\,\,\,\vdots\\ \langle\alpha,\alpha_{n}\rangle &= a_{1}\langle\alpha_{1},\alpha_{n}\rangle + a_{2}\langle\alpha_{2},\alpha_{n}\rangle + \ldots + a_{n}\langle\alpha_{n},\alpha_{n}\rangle = c_{n} \end{align*}

The question is: how do we prove that the coefficient matrix of this linear system is invertible?

Any other approach would be appreciated, especially a constructive one.

EDIT

Let us denote by $A$ the coefficient matrix of this linear system, whose $j,i$ entry is $\langle\alpha_{i},\alpha_{j}\rangle$. We shall prove that $A$ is invertible by showing that $N(A) = \{0\}$. Let $\textbf{x}\in N(A)$. Then, for each $1\leq j\leq n$, the $j$-th row of $A\textbf{x} = 0$ reads \begin{align*} x_{1}\langle\alpha_{1},\alpha_{j}\rangle + x_{2}\langle\alpha_{2},\alpha_{j}\rangle + \ldots + x_{n}\langle\alpha_{n},\alpha_{j}\rangle = 0 \Longrightarrow \langle x_{1}\alpha_{1} + x_{2}\alpha_{2} + \ldots + x_{n}\alpha_{n},\,\alpha_{j}\rangle = 0. \end{align*} Multiplying the $j$-th equation by $\overline{x_{j}}$ (over $\mathbb{R}$, simply by $x_{j}$) and summing over $j$ yields \begin{align*} \langle x_{1}\alpha_{1} + \ldots + x_{n}\alpha_{n},\, x_{1}\alpha_{1} + \ldots + x_{n}\alpha_{n}\rangle = 0, \end{align*} and positive-definiteness of the inner product forces $x_{1}\alpha_{1} + x_{2}\alpha_{2} + \ldots + x_{n}\alpha_{n} = 0$. Since $\{\alpha_{1},\alpha_{2},\ldots,\alpha_{n}\}$ is linearly independent, $x_{1} = x_{2} = \ldots = x_{n} = 0$, that is, $\textbf{x} = 0$. Hence $N(A) = \{0\}$, $A$ is invertible, and the system has exactly one solution.
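The argument can also be checked numerically. The sketch below uses made-up data (the matrix $M$ defining the inner product $\langle u,v\rangle = u^{T}Mv$ on $\mathbb{R}^{3}$, the basis $B$, and the scalars $c_{j}$ are all assumptions chosen for illustration): it builds the Gram matrix, solves the linear system from MY ATTEMPT, and verifies $\langle\alpha,\alpha_{j}\rangle = c_{j}$.

```python
import numpy as np

# Hypothetical data: V = R^3 with inner product <u, v> = u^T M v for a
# symmetric positive-definite M, and a non-orthogonal basis given by the
# columns of B.
M = np.array([[2.0, 0.5, 0.0],
              [0.5, 1.0, 0.2],
              [0.0, 0.2, 3.0]])
B = np.array([[1.0, 1.0, 0.0],
              [0.0, 1.0, 1.0],
              [0.0, 0.0, 1.0]])   # columns are alpha_1, alpha_2, alpha_3
c = np.array([1.0, -2.0, 0.5])    # arbitrary target scalars c_j

# Gram matrix G[i, j] = <alpha_i, alpha_j>; row j of the system reads
# sum_i a_i <alpha_i, alpha_j> = c_j, i.e. G^T a = c.
G = B.T @ M @ B

# G is invertible (the basis is linearly independent), so a is unique.
a = np.linalg.solve(G.T, c)
alpha = B @ a                     # alpha = sum_i a_i alpha_i

# Check that <alpha, alpha_j> = c_j for every j.
print(np.allclose(B.T @ M @ alpha, c))  # True
```

Swapping in any other positive-definite $M$ or invertible $B$ gives the same outcome, since the Gram matrix of a basis is always invertible.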

BEST ANSWER

The $n\times n$ matrix whose $i,j$ entry is $\langle \alpha_i, \alpha_j \rangle$ is invertible if and only if $\{\alpha_1, \ldots, \alpha_n\}$ is linearly independent.

Let $v_1, \ldots, v_n$ be the columns of the matrix.

If $\{\alpha_1, \ldots, \alpha_n\}$ is linearly dependent, then there exist scalars $b_1,\ldots, b_n$, not all zero, such that $b_1 \alpha_1 + \cdots + b_n \alpha_n = 0$; then $b_1 v_1 + \cdots + b_n v_n = 0$ as well, making the matrix not invertible.

Conversely, if the matrix is not invertible, there exist scalars $b_1, \ldots, b_n$, not all zero, such that $b_1 v_1 + \cdots + b_n v_n = 0$. Reading this componentwise and using the symmetry $\langle \alpha_j, \alpha_i \rangle = \langle \alpha_i, \alpha_j \rangle$ (over $\mathbb{R}$; over $\mathbb{C}$ the same argument runs with conjugates), we get $\langle b_1 \alpha_1 + \cdots + b_n \alpha_n, \alpha_i\rangle = 0$ for each $i$. Write $\beta = b_1 \alpha_1 + \cdots + b_n \alpha_n$; multiplying the $i$-th equation by $b_i$ and summing gives $\langle \beta, \beta \rangle = 0$, hence $\beta = 0$. Since the $b_i$ are not all zero, $\{\alpha_1, \ldots, \alpha_n\}$ is linearly dependent.
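The forward direction of this equivalence is easy to see on a small example. In the sketch below (all data is made up for illustration), the third vector is the sum of the first two, and the resulting Gram matrix is singular, with the dependence coefficients lying in its null space.

```python
import numpy as np

# Standard inner product on R^3; alpha_3 = alpha_1 + alpha_2, so the
# family {alpha_1, alpha_2, alpha_3} is linearly dependent.
M = np.eye(3)
A = np.array([[1.0, 0.0, 1.0],
              [0.0, 1.0, 1.0],
              [1.0, 2.0, 3.0]])   # columns are alpha_1, alpha_2, alpha_3
G = A.T @ M @ A                    # Gram matrix G[i, j] = <alpha_i, alpha_j>

# b = (1, 1, -1) encodes alpha_1 + alpha_2 - alpha_3 = 0, and G b = 0,
# so G is singular.
b = np.array([1.0, 1.0, -1.0])
print(np.allclose(G @ b, 0))             # True
print(abs(np.linalg.det(G)) < 1e-9)      # True
```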