If $f(x)=\int_{x-1}^x f(s)ds$, is $f$ constant? Periodic?


I was thinking of periodic functions, and in particular the following type of condition:

If a function $f:\mathbb{R}\to\mathbb{R}$ always "tends to its average", then it should be periodic.

To make things more formal, by "tending to the average" we could say something like $f(x)=\int_{x-1}^x f(s)ds$. This is only the average depending on a previous time interval of length $1$, but it seems an interesting enough property. However, the only type of functions which I could find that satisfies this property are the constant ones!

Question: If $f:\mathbb{R}\to\mathbb{R}$ is continuous (or more generally measurable) and $f(x)=\int_{x-1}^x f(s)ds$ for (almost) every $x\in\mathbb{R}$, then is $f$ constant (a.e.)? Periodic (a.e.)?

Here is a first try for $C^1$ functions (see edit below!): If $f$ is $C^1$ and $x$ is fixed, we can use the Taylor expansion $f(s)=f(x)+O(s-x)$ (and similarly near $x-1$) to obtain \begin{align*} f(x+t)-f(x)&=\int_x^{x+t}f(s)\,ds-\int_{x-1}^{x-1+t}f(s)\,ds\\ &=\int_x^{x+t}\bigl(f(x)+O(s-x)\bigr)\,ds-\int_{x-1}^{x-1+t}\bigl(f(x-1)+O(s-x+1)\bigr)\,ds\\ &=t(f(x)-f(x-1))+O(t^2), \end{align*} so $f'(x)=f(x)-f(x-1)$. This is obviously true if $f$ is constant, but the converse is not clear to me at the moment.


Edit: From a comment and answer below, the equation $f'(x)=f(x)-f(x-1)$ has non-periodic solutions on $\mathbb{R}\setminus\mathbb{Z}$, so this should not be the way to go for $C^1$ functions. Moreover, even in this case it is not clear that every solution of this differential equation satisfies $f(x)=\int_{x-1}^x f(s)\,ds$, which is the actual question: all I can obtain, in principle, is $f(x)-f(x-1)=\int_{x-1}^{x}f(s)\,ds-\int_{x-2}^{x-1}f(s)\,ds$.

4 Answers

Accepted Answer

Here is a typical argument using the characteristic equation:

Let $\alpha \in \mathbb{C}\setminus\{0\}$ solve the equation $\alpha = 1-e^{-\alpha}$; such a solution exists, as proved in the addendum below. One solution is numerically $\alpha \approx -2.08884 + 7.46149 i$. Then

$$ \int_{x-1}^{x} e^{\alpha t} \, \mathrm{d}t = \frac{1 - e^{-\alpha}}{\alpha} e^{\alpha x} = e^{\alpha x}, $$

hence $f(x) = e^{\alpha x}$ is one (complex-valued) solution of the equation

$$ f(x) = \int_{x-1}^{x} f(t) \, \mathrm{d}t \tag{*}$$

If one is interested in real-valued solutions only, then one can consider the real part or the imaginary part of $e^{\alpha x}$. In particular, this shows that there exists an analytic solution of $\text{(*)}$ which is neither constant nor periodic: for instance, $\operatorname{Re}e^{\alpha x}=e^{x\operatorname{Re}\alpha}\cos(x\operatorname{Im}\alpha)$ is an oscillation with exponentially varying amplitude, hence not periodic.
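As a quick numerical sanity check (my own sketch, not part of the answer; it uses only the standard library, with Newton's method and a starting guess near the quoted root as assumptions):

```python
import cmath

# Newton's method for g(alpha) = alpha - 1 + e^{-alpha} = 0,
# i.e. alpha = 1 - e^{-alpha}, with g'(alpha) = 1 - e^{-alpha}.
alpha = complex(-2.0, 7.5)          # starting guess near the quoted root
for _ in range(50):
    g = alpha - 1 + cmath.exp(-alpha)
    dg = 1 - cmath.exp(-alpha)
    alpha -= g / dg

print(alpha)                         # ~ -2.08884 + 7.46149i

# Verify \int_{x-1}^x e^{alpha t} dt = e^{alpha x} at an arbitrary x:
x = 0.7
integral = (cmath.exp(alpha * x) - cmath.exp(alpha * (x - 1))) / alpha
assert abs(integral - cmath.exp(alpha * x)) < 1e-9
```

Since $\alpha=1-e^{-\alpha}$, the factor $(1-e^{-\alpha})/\alpha$ in the antiderivative equals $1$, which is exactly what the assertion at the end confirms.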


Addendum. We prove the following claim:

Claim. There exists a non-zero solution of $\alpha = 1 - e^{-\alpha}$ in $\mathbb{C}$.

Proof. We first note that $\varphi(x) = x(1-\log x)$ satisfies $\varphi(0^+) = 0$ and $\varphi(1) = 1$. Next, let $k$ be a positive integer. Then

  1. There exists $y \in (2k\pi, (2k+\frac{1}{2})\pi)$ such that $ \varphi(\sin(y)/y) = \cos (y) $, by the intermediate-value theorem.

  2. Set $x = \log(\sin(y)/y)$.

We claim that $ \alpha = x + iy $ solves the equation. Indeed, it is clear that $ e^{-x}\sin y = y $ holds. Moreover,

$$ (1-x)e^x = \varphi(\sin(y)/y) = \cos(y), $$

and so, $1 - x = e^{-x}\cos(y)$. Combining altogether,

$$ 1 - \alpha = 1 - x - iy = e^{-x}\cos(y) - ie^{-x}\sin(y) = e^{-x-iy} = e^{-\alpha}. $$

Therefore the claim follows. ////

(A careful inspection shows that this construction produces all the solutions of $\alpha = 1 - e^{-\alpha}$ in the upper half-plane.)
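The construction in the proof can be instantiated numerically. The following sketch (my own illustration, standard library only) carries out step 1 by bisection on the bracket $(2k\pi,(2k+\tfrac12)\pi)$ with $k=1$, then forms $\alpha=x+iy$ as in step 2:

```python
import math, cmath

def phi(t):
    """phi(t) = t(1 - log t), extended by phi(0+) = 0."""
    return t * (1 - math.log(t)) if t > 0 else 0.0

def F(y):
    # F changes sign on (2k*pi, (2k + 1/2)*pi), as in the proof
    return phi(math.sin(y) / y) - math.cos(y)

# Bisection for k = 1: F(2*pi+) < 0 and F(2.5*pi) > 0
lo, hi = 2 * math.pi + 1e-9, 2.5 * math.pi
for _ in range(200):
    mid = (lo + hi) / 2
    if F(mid) < 0:
        lo = mid
    else:
        hi = mid
y = (lo + hi) / 2

x = math.log(math.sin(y) / y)        # step 2 of the construction
alpha = complex(x, y)
print(alpha)                          # ~ -2.08884 + 7.46149i
assert abs(alpha - (1 - cmath.exp(-alpha))) < 1e-10
```

For $k=1$ this recovers the root quoted at the top of the answer; larger $k$ give the remaining roots in the upper half-plane.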

Answer

Define $f(x)=3x^2-4x+1$ for $x\in(0,1)$; then $\int_0^1 f(s)\,ds=1-2+1=0=f(1)$, so the integral equation holds at $x=1$.
For $x\in(1,2)$, solve the differential equation $$\frac{df}{dx}=f(x)-(3(x-1)^2-4(x-1)+1)$$ with $f(1)=0$. Iterate the procedure for $x\in (2,3)$, and so on. Since $$\frac{d}{dx}\left[f(x)-\int_{x-1}^x f(s)\,ds\right]=f'(x)-f(x)+f(x-1)=0,$$ the integral equation then holds for every $x\ge 1$.
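As an illustration (my own sketch, not part of the answer), this method of steps can be carried out numerically with a simple Euler scheme; the residual $f(x)-\int_{x-1}^x f(s)\,ds$ then stays near zero for all $x\ge 1$:

```python
# Euler stepping of the delay ODE f'(x) = f(x) - f(x-1), started from
# f(x) = 3x^2 - 4x + 1 on [0, 1].  With a left Riemann sum for the moving
# average, the Euler update conserves f[i] - h*sum(f[i-N:i]) exactly,
# mirroring the continuous conservation law.

h = 1e-4                        # grid spacing
N = round(1 / h)                # grid points per unit interval
M = 4 * N                       # integrate up to x = 4

f = [3 * (i * h) ** 2 - 4 * (i * h) + 1 for i in range(N + 1)]
for i in range(N, M):           # Euler step of the delay ODE
    f.append(f[i] + h * (f[i] - f[i - N]))

def moving_average(i):
    """Left Riemann sum approximating \\int_{x-1}^{x} f(s) ds at x = i*h."""
    return h * sum(f[i - N:i])

# The residual f(x) - \int_{x-1}^x f stays near 0 for x >= 1
for x in (1.5, 2.5, 3.5):
    i = round(x * N)
    print(x, f[i] - moving_average(i))
```

The residual printed here is not exactly zero only because the left Riemann sum of $\int_0^1 f$ at $x=1$ carries an $O(h)$ discretization error; it then stays constant, as the conservation law predicts.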

Answer

Suppose $f$ is $1$-periodic, with Fourier series $$f(x) = \dfrac{a_0}2+\sum\limits_{n=1}^\infty\left(a_n\cos 2\pi nx + b_n\sin 2\pi nx\right).$$ Then $$\int\limits_{x-1}^x f(s)\,\mathrm ds = \dfrac {a_0}2,$$ since every trigonometric term integrates to zero over a full period, so $f(x)$ is constant.

Answer

This is a partial answer in which I establish some properties of any $f$ that satisfies $$ f(x)=\int_{x-1}^xf(s)\,{\rm d}s $$ for all $x\in\mathbb{R}$. I will update this answer as I make further progress.

1. $f\in C^{\infty}(\mathbb{R})$ if $f\in L_{\rm loc}^1(\mathbb{R})$ ($f$ must be smooth if it is locally integrable).

Proof. $\forall\,x_0\in\mathbb{R}$, $\forall\,x\ge x_0+1$, we have $$ f(x)=\int_{x_0}^xf(s)\,{\rm d}s-\int_{x_0}^{x-1}f(s)\,{\rm d}s. $$ Since $f\in L_{\rm loc}^1(\mathbb{R})$, it follows that $$ \int_{x_0}^xf(s)\,{\rm d}s\quad\text{and}\quad\int_{x_0}^{x-1}f(s)\,{\rm d}s $$ are absolutely continuous. Thus $f$ is absolutely continuous on $\left(x_0+1,\infty\right)$. The arbitrariness of $x_0$ implies that $f$ is absolutely continuous on $\mathbb{R}$. Thus necessarily, $f\in C(\mathbb{R})$.

Likewise, since $f\in C(\mathbb{R})$, it follows that $$ \int_{x_0}^xf(s)\,{\rm d}s\quad\text{and}\quad\int_{x_0}^{x-1}f(s)\,{\rm d}s $$ are continuously differentiable, which leads to $f\in C^1(\mathbb{R})$.

Repeating the above reasoning inductively, we eventually obtain $f\in C^{\infty}(\mathbb{R})$.$\#$

This conclusion suggests that, even in the most general case, we need only consider those $f$ that are smooth on $\mathbb{R}$.

2. $f=0$ if $f\in L^1(\mathbb{R})$ ($f$ must be zero if it is integrable on $\mathbb{R}$).

Proof. Since $f\in L^1(\mathbb{R})$, it is obvious that $$ \int_{x-1}^xf(s)\,{\rm d}s=\int_{\mathbb{R}}1_{\left[0,1\right]}(x-s)f(s)\,{\rm d}s=\left(1_{\left[0,1\right]}*f\right)(x). $$ Hence, the original relation is equivalent to the following convolution equation on $\mathbb{R}$: $$ f=1_{\left[0,1\right]}*f. $$

Note that $f\in L^1(\mathbb{R})$, so its Fourier transform $\hat{f}$ is well-defined. By the convolution theorem, $$ \hat{f}=\widehat{1_{\left[0,1\right]}}\,\hat{f}\Longrightarrow\left(1-\widehat{1_{\left[0,1\right]}}\right)\hat{f}=0. $$ Since $\widehat{1_{\left[0,1\right]}}(\xi)=\frac{1-e^{-2i\pi\xi}}{2i\pi\xi}=e^{-i\pi\xi}\,\frac{\sin\pi\xi}{\pi\xi}$ has modulus strictly less than $1$ for $\xi\ne 0$, this implies that $\hat{f}(\xi)=0$ for all $\xi\ne 0$. Moreover, the continuity of $\hat{f}$ yields $\hat{f}(0)=0$. Consequently, we have $$ \hat{f}=0\iff f=0.\# $$
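The step above relies on $\widehat{1_{[0,1]}}(\xi)$ never equalling $1$ for $\xi\ne 0$. Here is a small numerical sanity check of that modulus bound (my own sketch; the transform convention $\hat f(\xi)=\int f(x)e^{-2i\pi x\xi}\,dx$ is assumed):

```python
import cmath, math

def indicator_hat(xi):
    """Fourier transform of 1_{[0,1]} under the convention
    hat{f}(xi) = \\int f(x) e^{-2 i pi x xi} dx, for xi != 0."""
    z = 2j * math.pi * xi
    return (1 - cmath.exp(-z)) / z

# |hat{1}_{[0,1]}(xi)| = |sin(pi xi) / (pi xi)| < 1 for every real xi != 0,
# so (1 - hat{1}_{[0,1]}) * hat{f} = 0 forces hat{f} = 0 off the origin.
for k in range(1, 2001):
    xi = 0.01 * k                      # sample xi over (0, 20]
    h = indicator_hat(xi)
    closed_form = cmath.exp(-1j * math.pi * xi) * math.sin(math.pi * xi) / (math.pi * xi)
    assert abs(h - closed_form) < 1e-12
    assert abs(h) < 1

print("modulus bound verified on the sample grid")
```

By symmetry ($\widehat{1_{[0,1]}}(-\xi)=\overline{\widehat{1_{[0,1]}}(\xi)}$), checking $\xi>0$ suffices.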

This conclusion suggests that any non-trivial solution of the original equation must be non-integrable on $\mathbb{R}$; non-zero constants are one example. Nevertheless, since it is reasonable to assume $f$ to be locally integrable, the original equation can always be formulated in the convolution form $f=1_{\left[0,1\right]}*f$. Note, however, that the classical Fourier transform $\hat{f}$ is not defined and the convolution theorem no longer applies if $f$ is locally integrable but not integrable on $\mathbb{R}$.

3. $f\equiv\text{const}$ if $f\in L_{\rm loc}^1(\mathbb{R})\cap C_T(\mathbb{R})$ ($f$ must be constant if it is locally integrable and $T$-periodic).

Proof. Since $f\in L_{\rm loc}^1(\mathbb{R})$, Property 1 gives $f\in C^{\infty}(\mathbb{R})$, hence $f\in C^{\infty}(\left[0,T\right])$. Thanks to the periodicity, $f$ has a Fourier series on $\left[0,T\right]$ $$ f(x)\sim\sum_{n\in\mathbb{Z}}a_n\,e^{\frac{2i\pi nx}{T}}. $$

Since $f\in C^{\infty}(\left[0,T\right])$, the Fourier series of $f$ converges absolutely and uniformly to $f$. On the one hand, we have $$ f(x)=\sum_{n\in\mathbb{Z}}a_n\,e^{\frac{2i\pi nx}{T}}. $$ On the other hand, \begin{align} \int_{x-1}^xf(s)\,{\rm d}s&=\int_{x-1}^x\sum_{n\in\mathbb{Z}}a_n\,e^{\frac{2i\pi ns}{T}}{\rm d}s\\ &=\sum_{n\in\mathbb{Z}}a_n\int_{x-1}^xe^{\frac{2i\pi ns}{T}}{\rm d}s\\ &=\sum_{n\in\mathbb{Z}}a_n\frac{1-e^{-\theta_n}}{\theta_n}e^{\frac{2i\pi nx}{T}}, \end{align} where $\theta_n=2i\pi n/T$ and, by convention, $\left(1-e^{-\theta_0}\right)/\theta_0=1$ (this keeps the series in a uniform form; alternatively, one may integrate the $n=0$ term separately and obtain the same result).

Thanks to this result, the original equation requires $$ a_n=a_n\frac{1-e^{-\theta_n}}{\theta_n}\iff\left(1-\frac{1-e^{-\theta_n}}{\theta_n}\right)a_n=0. $$ Note that $$ 1-\frac{1-e^{-z}}{z}=0 $$ has only one solution on $i\mathbb{R}$ (the imaginary axis), namely $z=0$: for $z=it$ with $t\in\mathbb{R}\setminus\{0\}$, $\left|\frac{1-e^{-it}}{it}\right|=\left|\frac{\sin(t/2)}{t/2}\right|<1$. Thus, since $\theta_n\ne 0$ for all $n\ne 0$, necessarily $a_n=0$ for all $n\ne 0$. This leads to $f(x)=a_0$, i.e., $f$ is constant on $\mathbb{R}$.$\#$

This conclusion suggests that any continuous periodic function that satisfies the original equation must be constant.

[TBC...]

[Following @Empy2's answer, I believe that non-periodic solutions to the original equation exist. Yet by the above properties, such a solution must be smooth and, most likely, unbounded. Trying a power series $f(x)=\sum_{n=0}^{\infty}a_nx^n$ could be promising, but it leads to an infinite-dimensional linear system whose convergence remains unknown, which also puts the interchange of summation and integration in question...]