If $f_1, f_2 : \mathbb{R} \rightarrow \mathbb{R}$ are nonconstant and continuous with periods $1$ and $\sqrt{2}$, respectively, then $f_1 + f_2$ is not periodic


I've been working on this problem for several hours, but I keep getting stuck. Suppose $f_1, f_2 : \mathbb{R} \rightarrow \mathbb{R}$ are periodic with periods $1$ and $\sqrt{2}$, respectively, and that each of $f_1, f_2$ is nonconstant and continuous. Show that $f_1 + f_2$ is not periodic.

My thoughts so far:

Suppose $p$ is a period of $f_1 + f_2$. I want to use the fact that $\{n\sqrt{2}\}$ is dense in $\mathbb{R}/(x \sim x+p)$ to derive a contradiction. I've tried numerous approaches, which would take forever to write up, so now I'm just hoping to find the solution. I can provide more of my attempt if necessary.
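As a numerical sanity check of the density idea, here is a minimal sketch (with the assumed normalization $p = 1$; `max_gap` is an ad hoc helper, not part of any library) showing that the largest gap between the points $k\sqrt{2} \bmod p$ on the circle shrinks as more points are taken:

```python
import math

# Numerically check that the points k*sqrt(2), reduced mod p, fill the
# circle R/(x ~ x + p): the largest gap between consecutive points
# shrinks as the number of points grows.  (Sketch with p = 1.)
def max_gap(n_points, p=1.0):
    pts = sorted((k * math.sqrt(2)) % p for k in range(1, n_points + 1))
    gaps = [b - a for a, b in zip(pts, pts[1:])]
    gaps.append(pts[0] + p - pts[-1])  # wrap-around gap on the circle
    return max(gaps)

print(max_gap(100))    # small
print(max_gap(10000))  # much smaller still
```

This only illustrates density for one orbit; the proof below avoids working with $p$ directly and uses equidistribution averages instead.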


Let us show the following property: if $f_1$ has period $T_1 > 0$, $f_2$ has period $T_2 > 0$, $f_1$ and $f_2$ are both continuous and nonconstant, and $(T_1, T_2)$ is linearly independent over $\mathbb{Q}$, then $f = f_1 + f_2$ cannot be periodic.

Your question is the case $T_1=1,T_2=\sqrt{2}$.

Suppose by contradiction that $f$ has period $T>0$. If $T$ is a rational multiple of $T_1$, then there are positive integers $a$ and $b$ such that $aT=bT_1$. Let us call $S$ the number appearing on both sides of this equality. Then $f$ and $f_1$ are both $S$-periodic; hence so is $f_2=f-f_1$. Thus $f_2$ has two ${\mathbb Q}$-linearly independent periods, $S$ and $T_2$ (linearly independent because $S = bT_1$ and $(T_1, T_2)$ is linearly independent over $\mathbb{Q}$). The set of periods of $f_2$ therefore contains the subgroup $S\mathbb{Z} + T_2\mathbb{Z}$, which is dense in $\mathbb{R}$; a continuous function with arbitrarily small periods is constant, contradicting the hypothesis.
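The step "two $\mathbb{Q}$-linearly independent periods force a continuous function to be constant" rests on the period group $S\mathbb{Z} + T_2\mathbb{Z}$ being dense in $\mathbb{R}$. A small numerical sketch (with the assumed concrete values $S = 1$, $T_2 = \sqrt{2}$; `smallest_positive_period` is a hypothetical helper) shows integer combinations approaching $0$:

```python
import math

# If S and T_2 are both periods of f_2, then so is every integer
# combination a*S + b*T_2.  When S/T_2 is irrational these combinations
# come arbitrarily close to 0, so a continuous f_2 would have
# arbitrarily small periods and hence be constant.
def smallest_positive_period(bound, S=1.0, T2=math.sqrt(2)):
    best = S
    for a in range(-bound, bound + 1):
        for b in range(-bound, bound + 1):
            v = abs(a * S + b * T2)
            if 1e-12 < v < best:
                best = v
    return best

for bound in (5, 50, 500):
    print(bound, smallest_positive_period(bound))
```

As the search bound grows, the smallest positive combination found keeps shrinking, in line with density of the period group near $0$.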

So $T$ is not a rational multiple of $T_1$. Similarly, $T$ is not a rational multiple of $T_2$.

By the equidistribution property shown in this MSE question (applicable here because $T_1/T$ is irrational), we deduce that for every $x$,

$$ \lim_{n\to+\infty}\frac{1}{n}\displaystyle \sum_{k=1}^{n} f(x+kT_1) = \frac{1}{T}\int_{0}^{T} f(t)\,dt \tag{1} $$

But $f=f_1+f_2$, and

$$ \frac{1}{n}\displaystyle \sum_{k=1}^{n} f_1(x+kT_1) = f_1(x), \qquad \lim_{n\to+\infty}\frac{1}{n}\displaystyle \sum_{k=1}^{n} f_2(x+kT_1) = \frac{1}{T_2}\int_{0}^{T_2} f_2(t)\,dt \tag{2} $$

where the first identity holds because $f_1$ is $T_1$-periodic, and the second follows again from equidistribution, since $T_1/T_2$ is irrational.

This implies that

$$ \frac{1}{T}\int_{0}^{T} f(t)\,dt = f_1(x) + \frac{1}{T_2}\int_{0}^{T_2} f_2(t)\,dt \tag{3} $$

Since the left-hand side of (3) does not depend on $x$, this forces $f_1$ to be constant, contradicting the hypothesis. This finishes the proof.
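To see equations (2) and (3) at work, here is a numerical sketch with assumed concrete choices $f_1(x) = \sin(2\pi x)$ (period $1$) and $f_2(x) = \sin(2\pi x/\sqrt{2})$ (period $\sqrt{2}$, mean zero over a period); `cesaro_average` is an ad hoc helper:

```python
import math

# Average f = f1 + f2 along the arithmetic progression x + k*T1 with
# T1 = 1, where f1(x) = sin(2*pi*x) and f2(x) = sin(2*pi*x/sqrt(2)).
# As in (2), the f1-part contributes f1(x) exactly, while the f2-part
# averages out to its mean over a period (here 0).  The limit f1(x)
# still depends on x, so it cannot equal the constant left-hand
# side of (3).
def cesaro_average(x, n, T1=1.0):
    total = sum(math.sin(2 * math.pi * (x + k * T1))
                + math.sin(2 * math.pi * (x + k * T1) / math.sqrt(2))
                for k in range(1, n + 1))
    return total / n

n = 100_000
print(cesaro_average(0.0, n))   # close to f1(0.0)  = 0
print(cesaro_average(0.25, n))  # close to f1(0.25) = 1
```

The two printed averages differ, matching the contradiction: no single constant can equal $f_1(x)$ plus a fixed number for every $x$.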