Let $q$ be a quadratic form with associated symmetric matrix $A = [\alpha_{ij}]$. Let $B=\{u_1, \dots, u_n\}$ be a basis of eigenvectors of $A$, with associated eigenvalues $\{\lambda_1, \dots, \lambda_n\}$. Prove that $q(u_i)=\lambda_i$ for all $i$.
My attempted proof is as follows:
I know $q(x) = \sum_{i,j = 1}^n \alpha_{ij}x_ix_j$, where $x = (x_1, ..., x_n)$. So I can choose the basis in which we will work as the basis $B$. In this case, the sum reduces to $\sum_{i=1}^n \lambda_i y_i^2$, where $x = (y_1, ..., y_n)$ in the basis $B$, so that the result is trivial.
Is the above proof correct? I find it a bit fishy that I can just choose the basis and have the quadratic form remain the same; I don't see why, after the change of basis, the resulting value is the same as if I had plugged the vector into the original expression for $q(x)$. Can someone clarify this part?
Thanks in advance!
I would use the matrix expression of $q$ instead. You have $q(x)=x^{\top}Ax$, so if $u_i$ is an eigenvector of $A$ with eigenvalue $\lambda_i$, then $q(u_i)=u_i^{\top}Au_i=\lambda_i u_i^{\top}u_i$. If we rescale $u_i$ so that $u_i^{\top}u_i=1$, this gives $q(u_i)=\lambda_i$. Note that the normalization is genuinely needed: if $u_i^{\top}u_i=c$, then $q(u_i)=\lambda_i c$, which differs from $\lambda_i$ whenever $\lambda_i \neq 0$ and $c \neq 1$. So the statement implicitly assumes the eigenvectors are unit vectors.
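Here is a quick numerical sanity check of this argument (a sketch using NumPy; the matrix $A$ below is an arbitrary symmetric example, not taken from the problem):

```python
import numpy as np

# Arbitrary symmetric matrix A, standing in for [alpha_ij].
A = np.array([[2.0, 1.0, 0.0],
              [1.0, 3.0, 1.0],
              [0.0, 1.0, 2.0]])

# np.linalg.eigh returns eigenvalues and ORTHONORMAL eigenvectors
# (columns of eigvecs), so each u_i already satisfies u_i^T u_i = 1.
eigvals, eigvecs = np.linalg.eigh(A)

for i in range(3):
    u = eigvecs[:, i]          # unit eigenvector
    q_u = u @ A @ u            # q(u) = u^T A u
    assert np.isclose(q_u, eigvals[i])

# Without normalization the identity fails: scaling u by c scales q(u) by c^2.
u = 2 * eigvecs[:, 0]
assert np.isclose(u @ A @ u, 4 * eigvals[0])
print("checks passed")
```

The second assertion illustrates why the normalization $u_i^{\top}u_i = 1$ cannot be dropped.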