Extending a Function Periodically and Determining Whether It Converges


I have a question that I'm not sure I'm understanding correctly, and I could use some assistance. The question reads:

Consider the function defined on $[0,1)$: $$f(x)=\begin{cases}-1 & \text{ if } 0\leq x\leq \frac{1}{2}\\ 1 & \text{ if } \frac{1}{2}<x<1\end{cases}$$ Extend $f$ periodically with period $1$ to the entire real line. For each $n=1, 2, \dots$, define $f_n(x)=f(nx)$. Now look at $f_n$ as a sequence of distributions. Does $f_n(x)$ converge as $n\rightarrow\infty$ in the sense of distributions? Now assume that $f$ is given by: $$f(x) = \begin{cases}-1 & \text{ if } 0\leq x\leq \frac{1}{2}\\ 5 & \text{ if } \frac{1}{2}<x<1\end{cases}$$ Repeat the above for this $f$.

Now I know that each $f_n$ has period $\frac{1}{n}$, which gets smaller and smaller as $n\rightarrow\infty$. So wouldn't the sequence $f_n$ in both cases just converge to $0$? If you integrated $f_n$ and took $n\rightarrow\infty$, the oscillations would become so rapid that the graph would essentially look like a vertical line, so there would be no net mass under it?
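One way to sanity-check this intuition is to pair $f_n$ with a fixed smooth test function $\varphi$ and watch the numbers $\int_0^1 f(nx)\,\varphi(x)\,dx$ as $n$ grows, since that pairing is exactly what convergence in the sense of distributions is about. Here is a quick numerical sketch in Python (the test function $\varphi(x)=e^{-x}$ and the midpoint-rule resolution are my own arbitrary choices, not part of the problem):

```python
import numpy as np

# The two square waves from the question, extended with period 1.
def f1(x):
    return np.where((x % 1.0) <= 0.5, -1.0, 1.0)   # values -1 and 1

def f2(x):
    return np.where((x % 1.0) <= 0.5, -1.0, 5.0)   # values -1 and 5

def pairing(f, n, phi, num=400_000):
    """Approximate the pairing <f_n, phi> = integral of f(n x) phi(x) over [0,1]
    by the midpoint rule with `num` sample points."""
    x = (np.arange(num) + 0.5) / num
    return np.mean(f(n * x) * phi(x))

phi = lambda x: np.exp(-x)     # an arbitrary smooth test function
I_phi = 1.0 - np.exp(-1.0)     # integral of phi over [0,1]

for n in (1, 10, 100):
    print(n, pairing(f1, n, phi), pairing(f2, n, phi))
```

Running this, the pairings for the first $f$ do seem to shrink toward $0$, but for the second $f$ they appear to settle near $2\int_0^1\varphi \approx 1.264$ rather than $0$ — that is, near the mean value of $f$ over one period times $\int\varphi$. So the "no mass" intuition looks plausible for the first case but suspect for the second.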

Am I approaching this question wrong? Is there something I'm not grasping?