I know that in the Fourier series expansion of a function $f$ with period $T$, we have
$$f(t) = \sum_{n=-\infty}^{\infty} c_n \exp\left(\frac{2\pi int}{T}\right)$$
where the lowest-frequency term (ignoring the constant $n = 0$ term) has period $T$, and this can be proven.
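Here the coefficients are given by the standard orthogonality relation
$$c_n = \frac{1}{T}\int_0^T f(t) \exp\left(-\frac{2\pi int}{T}\right) dt,$$
so every nonconstant term has frequency at least $\frac{2\pi}{T}$, i.e. period at most $T$.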
I'm wondering if the following statement is true, and how one may prove it:
Consider a function $f$ that is periodic with period $T$. Say we represent $f$ in the form
$$f(t) = \sum_{m \in \mathbb{R}}\sum_{n=-\infty}^{\infty} c_{n,m} \exp\left(inmt\right) \tag{1}$$
(with only countably many of the $c_{n,m}$ nonzero, so that the sum makes sense),
then for every choice of the constants $c_{n,m}$ ($n \in \mathbb{Z}$, $m \in \mathbb{R}$) which satisfies (1), we must have $c_{n,m} = 0$ whenever $0 < |nm| < \frac{2\pi}{T}$, i.e. whenever the term $\exp(inmt)$ has period $\frac{2\pi}{|nm|} > T$.
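For a concrete instance of (1) (my notation, not a standard one): with $f(t) = \cos\left(\frac{2\pi t}{T}\right)$, one valid assignment is
$$f(t) = \tfrac{1}{2}\exp\left(\frac{2\pi it}{T}\right) + \tfrac{1}{2}\exp\left(-\frac{2\pi it}{T}\right),$$
i.e. $m = \frac{2\pi}{T}$, $c_{1,m} = c_{-1,m} = \tfrac{1}{2}$, and all other coefficients zero; the claim is that no alternative assignment for this $f$ can use a nonzero coefficient with $0 < |nm| < \frac{2\pi}{T}$.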
What I'm trying to say is that if we represent a periodic function $f$ as a sum of arbitrary sinusoids, we can never have a sinusoidal term whose period is greater than the period of $f$. No matter how we choose the other sinusoids, we can never "cancel out" such a term: if $g$ is a sinusoid with period greater than $T$, we can always find a $t$ such that $g(t) \neq g(t+T)$.
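To spell out that last claim for a single complex sinusoid: if $g(t) = c\exp(i\omega t)$ with $c \neq 0$ and $0 < |\omega| < \frac{2\pi}{T}$, then
$$g(t+T) - g(t) = c\exp(i\omega t)\left(\exp(i\omega T) - 1\right) \neq 0$$
for every $t$, since $0 < |\omega T| < 2\pi$ forces $\exp(i\omega T) \neq 1$, and $\exp(i\omega t)$ never vanishes. What I can't see is how to show that this failure of $T$-periodicity must survive when $g$ is combined with all the other terms in (1).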