Suppose $f(t)=h(t)g(t)$ is an odd analytic function such that $h(t)$ is increasing, $g(t)$ is decreasing, $\lim_{t\to\infty} f(t)=0$, $f(t)>0$ for $t>0$, and $f'(0)>0$. For example, $f(t)=te^{-t^2}$. Now define
$$f(t,x):=\sum_{m=-\infty}^\infty (-1)^m f\left(t+\frac{m}{x}\right)$$
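For concreteness, here is a minimal numerical sketch (Python) of this sum for the example $f(t)=te^{-t^2}$, truncated to $|m|\le M$; the helper name `f_series` and the cutoff `M` are ad hoc choices, justified only by how quickly the terms decay:

```python
import numpy as np

def f(t):
    # The example function from above: f(t) = t * exp(-t^2).
    return t * np.exp(-t**2)

def f_series(t, x, M=50):
    # Truncation of f(t, x) = sum_{m=-inf}^{inf} (-1)^m f(t + m/x) to |m| <= M.
    # M = 50 is an arbitrary cutoff; the Gaussian decay makes the tail negligible.
    m = np.arange(-M, M + 1)
    return float(np.sum((-1.0) ** m * f(t + m / x)))
```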
It appears to be the case that
$$ \frac{1}{2x}|f(t,x)| \ge \left|f\left(\frac{1}{2x},x\right)\right|\cdot t $$
for $t\in [0,1/(2x)]$. I am having trouble proving this inequality (if it is even true). Several properties of $f(t,x)$ are worth noting. Shifting $t$ by $1/x$ flips the sign, $f(t+1/x,x)=-f(t,x)$, so $|f(t,x)|$ is periodic in $t$ with period $1/x$. Moreover, $f(t,x)$ is symmetric about the midpoint of each such interval, i.e., $f(t,x) = f(1/x-t,x)$. The absolute values do appear to be necessary; for example, take $f(t)=t\cdot e^{-\cosh(t)}$ and $x=1/5$. It also seems that if $f(1/(2x),x)=0$, then $f(t,x)=0$ for all $t\in[0,1/(2x)]$.
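Here is a rough sketch of the kind of numerical check behind these observations (it reuses `f_series` from the snippet above; the grid size `n_grid` and tolerance `tol` are arbitrary choices of mine):

```python
import numpy as np

def check_claims(x, n_grid=200, tol=1e-12):
    # Probe the conjectured inequality
    #   (1/(2x)) * |f(t, x)| >= |f(1/(2x), x)| * t   for t in [0, 1/(2x)],
    # and the reflection symmetry f(t, x) = f(1/x - t, x), on a finite grid.
    half = 1.0 / (2.0 * x)
    mid = abs(f_series(half, x))
    ts = np.linspace(0.0, half, n_grid)
    lhs = np.array([abs(f_series(t, x)) / (2.0 * x) for t in ts])
    rhs = mid * ts
    sym_err = max(abs(f_series(t, x) - f_series(1.0 / x - t, x)) for t in ts)
    print("inequality holds on grid:", bool(np.all(lhs >= rhs - tol)))
    print("max reflection-symmetry error:", sym_err)

check_claims(x=1.0)
```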
Is there a name for the type of series formulated above? Are the properties I am seeing an illusion?