In a proof I am reading, Cauchy–Schwarz is invoked to justify the following inequality:
$$\mathbb E\Big[\sum_i a_ib_i\Big]\le \Big(\sum_i\mathbb E[a_i^2]\Big)^{1/2}\Big(\sum_i\mathbb E[b_i^2]\Big)^{1/2}=:\text{RHS}$$
where $\{a_i,b_i\}$ is a finite collection of random variables.
I am having trouble seeing exactly how C–S is applied here. I take C–S to be $$\sum_i a_ib_i\le \Big(\sum_i a_i^2\Big)^{1/2}\Big(\sum_i b_i^2\Big)^{1/2}.$$
So taking the expectation gives (by monotonicity)
$$\mathbb E\Big[\sum_i a_ib_i\Big]\le \mathbb E\Big[\Big(\sum_i a_i^2\Big)^{1/2}\Big(\sum_i b_i^2\Big)^{1/2}\Big].$$
But to get RHS from here, I seem to need Jensen's inequality with the concave function $x\mapsto x^{1/2}$, followed by an argument that the $a_i$'s and $b_i$'s are uncorrelated in order to separate the two expectations, which is not obvious. In fact they are probably correlated in my setting: $a_i=(W_i-W_{i-1})^2$ is the square of a Brownian increment, and $b_i=h(W_{i-1},W_i)$ is a function of the same two random variables that make up that increment.
Consider a slightly larger space. Take $\{1,\dots,n\}\times\Omega$ as the base space, let $a$ and $b$ be square-integrable functions on this space, and define the scalar product $$\langle a,b\rangle=\mathbb E\Big[\sum_i a_ib_i\Big].$$ Then the cited inequality is just the plain Cauchy–Schwarz inequality for this scalar product; no Jensen step and no uncorrelatedness of the $a_i$'s and $b_i$'s is needed.
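For what it's worth, here is a quick numerical sanity check of the inequality in the Brownian setting described in the question. This is only a sketch: the grid, the step size `dt`, and the particular choice of $h$ below are my own assumptions, not taken from the proof being read.

```python
import numpy as np

rng = np.random.default_rng(0)
n, n_samples = 5, 100_000

# Brownian increments dW_i = W_i - W_{i-1} on a grid with step dt
dt = 0.01
dW = rng.normal(0.0, np.sqrt(dt), size=(n_samples, n))
W = np.cumsum(dW, axis=1)
W_prev = np.hstack([np.zeros((n_samples, 1)), W[:, :-1]])

# a_i = (W_i - W_{i-1})^2, and b_i = h(W_{i-1}, W_i) for an arbitrary
# (hypothetical) h -- note a_i and b_i are clearly dependent here
a = dW**2
b = np.sin(W_prev) + dW

# LHS: E[sum_i a_i b_i], estimated by Monte Carlo
lhs = np.mean(np.sum(a * b, axis=1))

# RHS: (sum_i E[a_i^2])^{1/2} * (sum_i E[b_i^2])^{1/2}
rhs = np.sqrt(np.sum(np.mean(a**2, axis=0))) * np.sqrt(np.sum(np.mean(b**2, axis=0)))

print(lhs <= rhs)  # the Cauchy-Schwarz bound holds despite the dependence
```

The point of the check is exactly the one made above: the bound holds with no assumption of uncorrelatedness, because it is Cauchy–Schwarz for the scalar product $\langle a,b\rangle=\mathbb E[\sum_i a_ib_i]$ on $\{1,\dots,n\}\times\Omega$.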