I am having difficulty in understanding the above proof from "Linear Algebra Done Right" by Sheldon Axler.
- The way I have understood induction is: "if a statement is true for $n=1$, and its truth for $n$ implies its truth for $n+1$, then it is true for all $n$". But I could not explicitly see this pattern in the above proof. Could someone shed some light on this?
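To be concrete, the induction principle as I have learned it is the following (whether Axler is instead doing induction on $\dim V$ is part of what I am asking):

```latex
% Ordinary induction, as I understand it:
% base case together with the inductive step gives the statement for all n.
\[
\Bigl( P(1) \;\wedge\; \forall n\,\bigl( P(n) \Rightarrow P(n+1) \bigr) \Bigr)
\;\Longrightarrow\; \forall n\, P(n)
\]
% My guess: in this proof, P(n) would be the statement of the theorem
% for every inner-product space V with dim V = n, but I am not sure.
```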
- In the text above, I understood the part that if $u$ is an eigenvector, then there exist other eigenvectors coming from the orthogonal complement. But I did not get what Axler means by "Adjoining $u$ to an orthonormal basis of $U^\bot$ gives an orthonormal basis of $V$ consisting of eigenvectors of $T$". All I could get was that $u$ is an eigenvector and there are eigenvectors orthogonal to it; how does that prove these vectors form a complete basis of $V$? The earlier result used here (7.27) says that if $T \in \mathcal{L}(V)$ is self-adjoint, then $T$ has an eigenvalue. But does $T$ have enough eigenvectors to form a basis of $V$? What have I misunderstood, and what does "adjoining $u$ to ..." mean here?
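If I try to write out the sentence I am stuck on in my own notation (which may not match Axler's exactly), I read the claim as:

```latex
% My reading of "adjoining u to an orthonormal basis of U^\perp":
% if (e_1, ..., e_{n-1}) is an orthonormal basis of U^\perp
% consisting of eigenvectors, then the extended list
\[
(u,\; e_1,\; \dots,\; e_{n-1})
\]
% is claimed to be an orthonormal basis of V consisting of
% eigenvectors of T.  What I do not see is why the e_j are
% eigenvectors of T itself and why this list spans all of V.
```

Is this reading correct, and if so, where in the proof are the $e_j$ shown to exist?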
