I tried to solve task 3.6(a) from the book Mathematics for Machine Learning:
Consider $\mathbb{R}^{3}$ with the inner product
$$\left \langle x,y \right \rangle:=x^{T}\begin{bmatrix} 2 & 1 & 0\\ 1 & 2 & -1\\ 0 & -1 & 2 \end{bmatrix}y.$$
Furthermore, we define $e_{1},e_{2},e_{3}$ as the standard/canonical basis in $\mathbb{R}^{3}$.
Determine the orthogonal projection $\pi_{U}(e_{2})$ of $e_{2}$ onto $U=\text{span}[e_{1},e_{3}]$.
As $\pi_{U}(e_{2})\in U$, there exist coefficients $\lambda_{1},\lambda_{3}\in \mathbb{R}$ such that $\pi_{U}(e_{2})=\lambda_{1}e_{1}+\lambda_{3}e_{3}$, which is $[\lambda_{1},0,\lambda_{3}]^{\top}$ expressed in the canonical basis.
By orthogonal projection,
\begin{align*} &(\pi_{U}(e_{2})-e_{2}) \perp U\\ \implies &\begin{bmatrix} \left \langle \pi_{U}(e_{2})-e_{2},e_{1} \right \rangle\\ \left \langle \pi_{U}(e_{2})-e_{2},e_{3} \right \rangle \end{bmatrix}=\begin{bmatrix} 0\\ 0\\ \end{bmatrix}\\ \implies &\begin{bmatrix} \left \langle \pi_{U}(e_{2}),e_{1} \right \rangle-\left \langle e_{2},e_{1} \right \rangle\\ \left \langle \pi_{U}(e_{2}),e_{3} \right \rangle-\left \langle e_{2},e_{3} \right \rangle \end{bmatrix}=\begin{bmatrix} 0\\ 0\\ \end{bmatrix}. \end{align*}
We compute the individual components as, e.g.,
$$\left \langle \pi_{U}(e_{2}),e_{1} \right \rangle=\begin{bmatrix} \lambda_{1} & 0 & \lambda_{3} \end{bmatrix} \begin{bmatrix} 2 & 1 & 0\\ 1 & 2 & -1\\ 0 & -1 & 2 \end{bmatrix}\begin{bmatrix} 1\\ 0\\ 0 \end{bmatrix}=2\lambda_{1}.$$
Source: M. P. Deisenroth, A. A. Faisal, and C. S. Ong, *Mathematics for Machine Learning*, Cambridge University Press, 2020.
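As a quick numerical sanity check of the setup above (my own sketch, not from the book), the two orthogonality conditions can be collected into the normal equations $B^{\top}AB\,\lambda = B^{\top}Ae_{2}$, where $B=[e_{1}\;e_{3}]$ holds the basis of $U$ as columns and $A$ is the matrix of the inner product:

```python
import numpy as np

# Matrix A defining the inner product <x, y> = x^T A y
A = np.array([[2, 1, 0],
              [1, 2, -1],
              [0, -1, 2]])

e1, e2, e3 = np.eye(3)

# Basis of U = span[e1, e3], collected as columns of B
B = np.column_stack([e1, e3])

# The conditions <pi_U(e2) - e2, e1> = 0 and <pi_U(e2) - e2, e3> = 0
# are the normal equations (B^T A B) lam = B^T A e2
lam = np.linalg.solve(B.T @ A @ B, B.T @ A @ e2)
proj = B @ lam  # projection expressed in the canonical basis

print(lam)   # coefficients (lambda_1, lambda_3)
print(proj)  # pi_U(e2) as a vector in R^3
```

This just solves the same $2\times 2$ system that the hand computation produces, so it should confirm (or refute) the values of $\lambda_{1}$ and $\lambda_{3}$.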