Let $x_1, \ldots, x_n\in \mathbb{R}^D$.
Prove that for every $w\in \mathbb{R}^D$ there exist $\alpha_1,\ldots,\alpha_n\in \mathbb{R}$ and $v\in \mathbb{R}^D$ such that $$w = \sum_{i=1}^n\alpha_ix_i+v,\\ \forall i,\left<v,x_i\right>=0$$
Let $X_{D\times k}=\left(x_{t_1},\ldots,x_{t_k} \right)$ collect a maximal linearly independent subset of $\{x_1,\ldots,x_n\}$. Then $$\mathrm{rank}(X^TX) = \mathrm{rank}(X) = k,$$ so the $k\times k$ matrix $X^TX$ is invertible.
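The rank identity $\mathrm{rank}(X^TX)=\mathrm{rank}(X)=k$ can be checked numerically; a minimal sketch with NumPy (the dimensions $D=5$, $k=3$ are illustrative choices, not from the proof):

```python
import numpy as np

rng = np.random.default_rng(0)
D, k = 5, 3
# A random D x k Gaussian matrix has full column rank with probability 1,
# standing in for the k linearly independent columns x_{t_1}, ..., x_{t_k}.
X = rng.standard_normal((D, k))

# rank(X^T X) = rank(X) = k, so the k x k Gram matrix X^T X is invertible.
assert np.linalg.matrix_rank(X.T @ X) == np.linalg.matrix_rank(X) == k
```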
Given any vector $w_{D\times 1}$, let $A_{k\times 1} = (X^TX)^{-1}X^Tw$ and $v_{D\times 1} = w-XA$. Then $$\begin{align*} X^Tv &= X^T(w-XA)\\ &= X^Tw-X^TX(X^TX)^{-1}X^Tw\\ &= X^Tw - X^Tw\\ &= \mathbf{0}_{k\times 1} \end{align*}$$ That is, $\left<v,x_{t_i}\right>=0$ for all $i\leq k$. Since every $x_i$ can be represented as a linear combination of the $x_{t_i}$, it follows that $\left<v,x_i\right>=0$ for all $i\leq n$.
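The construction $A=(X^TX)^{-1}X^Tw$, $v=w-XA$ can likewise be verified numerically; a sketch assuming NumPy, with arbitrary illustrative dimensions (solving the normal equations with `np.linalg.solve` rather than forming the inverse explicitly, which is equivalent here):

```python
import numpy as np

rng = np.random.default_rng(1)
D, k = 6, 3
X = rng.standard_normal((D, k))   # columns: the maximal independent subset
w = rng.standard_normal(D)        # an arbitrary vector to decompose

# A = (X^T X)^{-1} X^T w  and  v = w - X A, as in the proof.
A = np.linalg.solve(X.T @ X, X.T @ w)
v = w - X @ A

assert np.allclose(X.T @ v, 0)    # <v, x_{t_i}> = 0 for every column
assert np.allclose(X @ A + v, w)  # w = XA + v, the claimed decomposition
```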
Finally, set the weight of $x_{t_i}$ to $A_i$ and the weight of every $x_j$ not among the columns of $X$ to zero. Then $w$ is decomposed into a linear combination of the $x_i$ plus the orthogonal vector $v$, as required.