I am trying to solve problems related to Cauchy-Schwarz Inequality, but I can't seem to understand why, after performing the inner product on the left side, we don't take the square root of it.
The Cauchy–Schwarz inequality states that $$\lvert \langle u,v\rangle\rvert \leq \lVert u\rVert \lVert v\rVert.$$
To find the norm of $u$, we first square the components of $u$, then add them, and finally take the square root of the resultant. Same for the norm of $v$. But for the left side, we just do the inner product. Doesn't the norm imply that the square root of the resultant value must be taken?
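For concreteness, the computation being asked about can be sketched in Python (the vectors are arbitrary, chosen just for illustration):

```python
import math

# Arbitrary example vectors in R^3, chosen only for illustration
u = [1.0, 2.0, 3.0]
v = [4.0, -1.0, 2.0]

# Left side: absolute value of the inner product -- no square root here
lhs = abs(sum(ui * vi for ui, vi in zip(u, v)))

# Right side: each norm is the square root of the sum of squared components
norm_u = math.sqrt(sum(ui ** 2 for ui in u))
norm_v = math.sqrt(sum(vi ** 2 for vi in v))
rhs = norm_u * norm_v

print(lhs, rhs)
assert lhs <= rhs  # the inequality holds with no square root on the left
```
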
Informal explanation: one way to remember and see it is that things need to stay homogeneous. (Yes, as in physics.)
More formally and more to the point, the inequality must remain true if you replace $u$ by $\alpha u$, for any number $\alpha$. After all, $\alpha u$ is just another vector; the same goes for replacing $v$ by $\beta v$. So we need $$ \lvert \langle \alpha u,\beta v\rangle\rvert \leq \lVert \alpha u \rVert\cdot \lVert \beta v \rVert \qquad \forall \alpha,\beta \in\mathbb{R}. \tag{1} $$ But by (bi)linearity, the LHS is equal to $\lvert \langle \alpha u,\beta v\rangle\rvert = \lvert \alpha\beta \langle u, v\rangle\rvert = \lvert \alpha\beta \rvert \cdot \lvert \langle u, v\rangle\rvert$, while the RHS is equal (by properties of norms) to $\lVert \alpha u \rVert\cdot \lVert \beta v \rVert = \lvert \alpha\beta \rvert \cdot \lVert u \rVert\cdot \lVert v \rVert$. That's good: the factors $\lvert \alpha\beta \rvert$ cancel on both sides of (1).
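This scaling behaviour is easy to check numerically. The following sketch (again with arbitrarily chosen vectors and scalars) verifies that both sides of the inequality pick up the same factor $\lvert\alpha\beta\rvert$:

```python
import math

def inner(u, v):
    """Standard dot product on R^n."""
    return sum(a * b for a, b in zip(u, v))

def norm(u):
    """Euclidean norm: sqrt of the inner product of u with itself."""
    return math.sqrt(inner(u, u))

# Arbitrary vectors and scalars, for illustration only
u = [1.0, 2.0, 3.0]
v = [4.0, -1.0, 2.0]
alpha, beta = -3.0, 0.5

au = [alpha * x for x in u]
bv = [beta * x for x in v]

lhs_scaled = abs(inner(au, bv))
rhs_scaled = norm(au) * norm(bv)
factor = abs(alpha * beta)

# Both sides scale by exactly |alpha * beta|, so the factor cancels
assert math.isclose(lhs_scaled, factor * abs(inner(u, v)))
assert math.isclose(rhs_scaled, factor * norm(u) * norm(v))
```
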
If you had a square root on the LHS, its factor would be $\sqrt{\lvert \alpha\beta\rvert}$ instead of $\lvert \alpha\beta\rvert$; the factors would not cancel, and (1) couldn't be true for all $\alpha,\beta$.
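To see the failure concretely, here is a sketch of the hypothetical "square-rooted" inequality $\sqrt{\lvert\langle u,v\rangle\rvert} \leq \lVert u\rVert\,\lVert v\rVert$ breaking down once the vectors are scaled (vectors and the scalar are arbitrary choices):

```python
import math

def inner(u, v):
    return sum(a * b for a, b in zip(u, v))

def norm(u):
    return math.sqrt(inner(u, u))

# Arbitrary vectors, for illustration only
u = [1.0, 2.0, 3.0]
v = [4.0, -1.0, 2.0]

# Shrink both vectors: the hypothetical LHS scales like sqrt(|alpha|^2),
# i.e. more slowly than the RHS's factor |alpha|^2, so it ends up larger.
alpha = 1e-6
au = [alpha * x for x in u]
av = [alpha * x for x in v]

wrong_lhs = math.sqrt(abs(inner(au, av)))  # square root wrongly applied
rhs = norm(au) * norm(av)

print(wrong_lhs, rhs)
assert wrong_lhs > rhs  # the "square-rooted" inequality fails under scaling
```
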