Convergence of sequence of functions to x^2


Let $g_n$ be the step function on $[0,2]$ defined on $2^n$ subintervals: for $x$ in the subinterval $[\frac{k-1}{2^{n-1}},\frac{k}{2^{n-1}}]$ (with $1 \le k \le 2^n$), set $g_n(x) = (\frac{k-1}{2^{n-1}})^2$. I am trying to show that $g_n$ converges to $g(x) = x^2$. I have shown that on each subinterval the distance between $g_n(x)$ and $g(x)$ is at most $\frac{2k-1}{2^{2n-2}}$, which does converge to zero since the maximum value of $k$ is $2^n$. For choosing my $N$ to show uniform convergence, however, solving $\frac{2k-1}{2^{2n-2}} < \varepsilon$ for $n$ gives $N = \frac{\log_2(\frac{2k-1}{\varepsilon})}{2}+1$, which still depends on $k$. I am stuck on how to handle $k$, if I even have to. I am learning this on my own: do I have to change anything, or can I leave $k$ as is?
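One standard way to remove the dependence on $k$ is to replace $k$ by its largest possible value before choosing $N$; a sketch of that bound (using the error estimate above):

```latex
Since $1 \le k \le 2^n$, on every subinterval
\[
  |g_n(x) - x^2| \;\le\; \frac{2k-1}{2^{2n-2}}
  \;\le\; \frac{2\cdot 2^n - 1}{2^{2n-2}}
  \;<\; \frac{2^{n+1}}{2^{2n-2}}
  \;=\; 2^{\,3-n}.
\]
This bound no longer involves $k$ (or $x$), so given $\varepsilon > 0$ it
suffices to pick $N > 3 - \log_2 \varepsilon$; then for all $n \ge N$,
\[
  \sup_{x \in [0,2]} |g_n(x) - x^2| \;<\; 2^{\,3-n} \;<\; \varepsilon,
\]
which is exactly uniform convergence of $g_n$ to $g(x) = x^2$ on $[0,2]$.
```

In other words: you do have to acknowledge $k$, because for uniform convergence $N$ may depend only on $\varepsilon$, not on $k$ or $x$; bounding $k$ by $2^n$ eliminates it.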