In deriving $\int_{0}^{T}f(x)dx=\int_{a}^{a+T}f(x)dx$, why doesn't this imply $f$ is constant?


I was wondering if anyone can help me with this. If $f(x)$ is a periodic function with period $T$, then it satisfies $$\int_{0}^{T}f(x)\,dx=\int_{a}^{a+T}f(x)\,dx$$ for all $a \in \Bbb R$. It is clear that this must be true, but if you differentiate both sides with respect to $T$, do you not get $$f(T)=f(T+a)?$$ Since $$f(T+a)=f(a),$$ this implies $$f(T)=f(a)$$ for all $a \in \Bbb R$. But does this not imply that $f$ is constant? I am struggling to understand what is going wrong.


Answer:


The equality $\int_{0}^{T}f(x)\,dx=\int_{a}^{a+T}f(x)\,dx$ holds for all $a \in \Bbb R$, but only for those values of $T$ that are periods of $f$.

Differentiating with respect to $T$ only makes sense when the equality holds for all $T$ in an open interval, not just at isolated values. So what you have actually proved is that if the set of periods of $f$ contains an open interval, then $f$ is constant.
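To see this concretely, here is an illustration (my own addition, not part of the original answer): take $f(x)=\sin x$ and $a=\pi$. Then
$$\int_{\pi}^{\pi+T}\sin x\,dx-\int_{0}^{T}\sin x\,dx=\bigl(\cos T-1\bigr)-\bigl(1-\cos T\bigr)=2(\cos T-1),$$
which vanishes only when $T\in 2\pi\Bbb Z$. The two integrals agree only at these isolated values of $T$, not on any open interval, so the equation cannot be differentiated with respect to $T$.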