Let $A$ be a linear transformation from $V \to V$ where $(V, (\cdot,\cdot))$ is a finite-dimensional inner product space. Consider the inner product $(x, Ay)$ for fixed $x$ and varying $y$. Then there exists a unique vector $z(x)$ depending on $x$ such that $(x, Ay) = (z(x), y)$ for all $y \in V$.
What is the simplest proof of this? My book says it's a consequence of Gram-Schmidt orthonormalization, but I'm not seeing that.
However, I guess we can say we have an orthonormal basis $b_i$, so that writing $x = \sum_i x_i b_i$ and $y = \sum_i y_i b_i$ we get $(x, Ay) = \sum_i x_i (Ay)_i$ by linearity of $(\cdot, \cdot)$ and orthonormality, where $(Ay)_i$ denotes the $i$-th coordinate of $Ay$ in this basis.
So let $A b_i = \sum_j a_{ij} b_j$. Then $Ay = \sum_i y_i A b_i = \sum_{i,j} a_{ij} y_i b_j$, and further rewriting gives $(x, Ay) = \sum_{i,j} a_{ij} x_j y_i$. We want a vector $z = \sum_i z_i b_i$ such that $(z, y) = (x, Ay)$, i.e. $\sum_{i} z_i y_i = \sum_{i,j} a_{ij} x_j y_i$ for all $y$.
I see now: comparing coefficients of $y_i$, it must be $z_i = \sum_j a_{ij} x_j$.
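Indeed, a quick numerical check of this formula with random data (a NumPy sketch; the array `a` holds hypothetical entries $a_{ij}$ in the convention above, so $(Ay)_j = \sum_i a_{ij} y_i$):

```python
import numpy as np

rng = np.random.default_rng(0)
n = 4
a = rng.standard_normal((n, n))  # entries a_ij: A b_i = sum_j a_ij b_j
x = rng.standard_normal(n)       # coordinates x_i of x in the orthonormal basis
y = rng.standard_normal(n)       # coordinates y_i of y

Ay = a.T @ y                     # j-th coordinate of Ay is sum_i a_ij y_i
z = a @ x                        # the claimed z: z_i = sum_j a_ij x_j

assert np.isclose(x @ Ay, z @ y)  # (x, Ay) = (z, y)
```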
And how do I prove uniqueness?
To me (as in your suggestion) the most natural proof is to express $z(x)$ in terms of an orthonormal basis $v_1,\ldots,v_n$ of $V$. We need to find coefficients $\lambda_i$ such that $$z(x) =\lambda_1 v_1+\ldots+\lambda_n v_n.$$ Then (working over $\mathbb{R}$, so the inner product is symmetric) $$(y,z(x)) = \sum_i y_i \lambda_i$$ and $$ (x,Ay) = \sum_j x_j (Ay)_j = \sum_{i,j} x_j a_{ji}y_i. $$ Comparing coefficients (take $y = v_i$ and let $i$ range over $1,\ldots,n$), we need $$ \lambda_i = \sum_j x_j a_{ji}.$$ This means that $$z(x) = (v_1,\cdots,v_n)A^T \begin{pmatrix} x_1 \\ \vdots \\ x_n \end{pmatrix}.$$ As for uniqueness: if $(z,y) = (z',y)$ for all $y \in V$, then $(z - z', y) = 0$ for all $y$; taking $y = z - z'$ gives $\|z - z'\|^2 = 0$, hence $z = z'$.
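As a sanity check, this formula can be verified numerically (a NumPy sketch with assumed names: $T$ is a random linear map on $\mathbb{R}^n$ in standard coordinates, the columns of $Q$ form the orthonormal basis $v_1,\ldots,v_n$, and $A = Q^T T Q$ is the matrix $(a_{ij})$ of the map in that basis):

```python
import numpy as np

rng = np.random.default_rng(1)
n = 3
T = rng.standard_normal((n, n))                    # the linear map, standard coordinates
Q, _ = np.linalg.qr(rng.standard_normal((n, n)))   # columns = orthonormal basis v_1..v_n

A = Q.T @ T @ Q                  # matrix (a_ij) of the map in the basis v
x = rng.standard_normal(n)
y = rng.standard_normal(n)

x_coords = Q.T @ x               # coordinates x_i of x in the basis v
z = Q @ (A.T @ x_coords)         # z(x) = (v_1 ... v_n) A^T (x_1 ... x_n)^T

assert np.isclose(x @ (T @ y), z @ y)  # (x, Ay) = (z(x), y)
```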