I'm having problems with the following question from Evans' PDE book (From Section 8.7, Question 1b);
Fix $a,b \in \mathbb{R}$ and $0<\lambda <1$. Define $$u_k(x)= \begin{cases} a &\text{if } & j/k\leq x < (j+\lambda)/k \\ b &\text{if } & (j+\lambda)/k\leq x<(j+1)/k\end{cases} \qquad (j=0,\ldots,k-1).$$ Prove that $u_k$ converges to $\lambda a+ (1-\lambda)b$ weakly in $L^2(0,1)$.
First I observed that, by linearity, it suffices to treat the case $a=1$, $b=0$. But now I can't find any way to compare $\lambda\int_0^1 g(x)\,dx$ with $\sum_{j=0}^{k-1} \int_{j/k}^{(j+\lambda)/k}g(x)\, dx$ (showing that their difference converges to zero is exactly what I need). I've tried several things, including integration by parts and changes of variables (it is clearly enough to prove this for smooth $g$ and then use a density argument), but nothing seems to help.
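As a quick numerical sanity check (not a proof, of course), the claimed convergence is easy to test for the case $a=1$, $b=0$ against one smooth test function; here $\lambda=0.3$ and $g=\cos$ are arbitrary choices of mine:

```python
import math

# Sanity check for a = 1, b = 0: the difference between
# lambda * int_0^1 g and sum_j int_{j/k}^{(j+lambda)/k} g
# should shrink as k grows.
lam = 0.3            # arbitrary lambda in (0, 1)
g = math.cos         # arbitrary smooth test function on [0, 1]

def integral(f, lo, hi, n=200):
    """Midpoint rule on [lo, hi] with n subintervals."""
    h = (hi - lo) / n
    return h * sum(f(lo + (i + 0.5) * h) for i in range(n))

target = lam * integral(g, 0.0, 1.0)
for k in (10, 100, 1000):
    s = sum(integral(g, j / k, (j + lam) / k) for j in range(k))
    print(k, abs(s - target))
```

The printed differences shrink roughly like $1/k$, which is consistent with the statement but gives no hint about the proof.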
Anyone got any good hints?
Hints.
We first observe that it suffices to test against step functions, since they are dense in $L^2[0,1]$. In fact, it suffices to consider step functions $f$ for which there is an $N\in\mathbb N$ such that $f$ restricted to $[(j-1)/N,j/N]$ is constant, for every $j=1,\ldots,N$.
Next, observe that, for every $j,N\in\mathbb N$, with $0<j\le N$, $$ \lim_{k\to\infty}\int_{(j-1)/N}^{j/N} u_k=\frac{\lambda a+(1-\lambda)b}{N}. $$
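To see why this limit holds (a sketch, filling in the counting argument): each full period $[i/k,(i+1)/k]$ contributes exactly $$\int_{i/k}^{(i+1)/k}u_k=\frac{\lambda a+(1-\lambda)b}{k},$$ the interval $[(j-1)/N,j/N]$ contains $k/N+O(1)$ full periods, and the at most two periods meeting the endpoints contribute at most $\max(|a|,|b|)/k$ each. Hence $$\int_{(j-1)/N}^{j/N} u_k=\left(\frac{k}{N}+O(1)\right)\frac{\lambda a+(1-\lambda)b}{k}+O\!\left(\frac1k\right)\xrightarrow[k\to\infty]{}\frac{\lambda a+(1-\lambda)b}{N}.$$ Combining this with the first observation finishes the job.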