I have trouble understanding a proof in my textbook and I would appreciate your help!
Corollary. Let $V$ be a finite-dimensional inner product space with an orthonormal basis $\beta = \{v_1, v_2, \ldots, v_n \}$. Let $T$ be a linear operator on $V$, and let $A = [T]_{\beta}$. Then for any $i$ and $j$, $A_{ij} = \langle T(v_j),v_i \rangle$.
Proof. From the previous theorem, we know that $T(v_j)=\sum_{i=1}^{n}\langle T(v_j), v_i \rangle v_i$. Hence $A_{ij} = \langle T(v_j), v_i \rangle$.
I don't understand the part after "Hence". I was thinking about maybe multiplying some vectors but it didn't work out.
By definition, $A_{i,j}$ is the coefficient of $v_i$ in the decomposition of $T(v_j)$ in the basis $\beta$, i.e. the $j$-th column of $A$ holds the coordinates of $T(v_j)$.
Since $$T(v_j) = \sum_{i=1}^n \langle T(v_j) ,v_i\rangle v_i = \sum_{i=1}^n A_{i,j} v_i$$ and the family $(v_k)$ is linearly independent, the coefficients in the two expansions must agree, so $A_{i,j} = \langle T(v_j) ,v_i\rangle$.
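If it helps to see the corollary in action, here is a small numerical sketch (my own illustration, not from the textbook): take a random orthonormal basis of $\mathbb{R}^n$ given by the columns of an orthogonal matrix $Q$, a random operator $T$, and check that the matrix built from the inner products $\langle T(v_j), v_i\rangle$ really does reproduce each $T(v_j)$ as $\sum_i A_{ij} v_i$.

```python
import numpy as np

rng = np.random.default_rng(0)
n = 3

# Random orthonormal basis: the columns of Q from a QR decomposition
Q, _ = np.linalg.qr(rng.standard_normal((n, n)))
basis = [Q[:, k] for k in range(n)]

# An arbitrary linear operator T (its matrix in the standard basis)
T = rng.standard_normal((n, n))

# Build A entrywise via the corollary: A_{ij} = <T(v_j), v_i>
A = np.array([[np.dot(T @ basis[j], basis[i]) for j in range(n)]
              for i in range(n)])

# Check the expansion T(v_j) = sum_i A_{ij} v_i for every j
for j in range(n):
    expansion = sum(A[i, j] * basis[i] for i in range(n))
    assert np.allclose(T @ basis[j], expansion)
print("corollary verified numerically")
```

Note that with the basis vectors as columns of $Q$, this $A$ is exactly $Q^{\mathsf T} T Q$, the matrix of the operator in the basis $\beta$.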