$f(\alpha x) = f(x)^{\beta}$ under different constraints


With $\alpha > 0,\, \beta \in \Bbb R^*,\, \alpha, \beta \neq 1$ and $f : \Bbb R \to \Bbb R_+^*$, let's consider the functional equation $$ f(\alpha x) = f(x)^{\beta} \tag{$\Xi$}$$ or equivalently $g(\alpha x) = \beta g(x)$ for $g = \ln f$.

The case where $\alpha = \sqrt2$, $\beta = 2$ and $f \in \mathcal C^2$ has already been solved here: Solving $(f(x))^2 = f(\sqrt{2}x)$ (the answer: $f(x) = e^{\lambda x^2}$ for some $\lambda \in \Bbb R$).
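As a quick sanity check of that known case, here is a small numerical sketch (the value $\lambda = 0.3$ is an arbitrary choice):

```python
import math

# Check numerically that f(x) = exp(lam * x^2) satisfies f(sqrt(2) x) = f(x)^2.
# lam = 0.3 is an arbitrary illustrative choice.
lam = 0.3
f = lambda x: math.exp(lam * x * x)

for x in [-2.0, -0.5, 0.0, 1.0, 3.7]:
    assert math.isclose(f(math.sqrt(2) * x), f(x) ** 2, rel_tol=1e-12)
```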

What if we relax/change some of the constraints, for instance:

  • Keeping $f$ regular (say $\mathcal C^{\infty}$) but setting $\alpha, \beta$ generic
  • $f \in \mathcal C^0$
  • $f \in L^1$
  • (other ideas?)

There are 2 best solutions below

On BEST ANSWER

Summary at the bottom.


Start from $$g(\alpha x)=\beta g(x)$$ and set $$\gamma = \frac{\ln |\beta|}{\ln\alpha}.$$

If $\beta >0$, then $\beta=\alpha^\gamma$ and $g(\alpha x)=\alpha ^\gamma g(x)$.

We define $h(x)$, for $x\neq 0$, as $h(x)= g(x)\,|x|^{-\gamma}$ (the absolute value keeps $h$ well defined for $x<0$ when $\gamma$ is not an integer). Then: $$h(\alpha x)=g(\alpha x)\,\alpha^{-\gamma} |x|^{-\gamma}=g(x)\,|x|^{-\gamma}=h(x)$$

Now let $k_1(x)=h(\alpha^x)$ and $k_2(x)=h(-\alpha^x)$.

Since $h(\alpha x)=h(x)$, we get $k_1(x+1)=k_1(x)$ and $k_2(x+1)=k_2(x)$: $k_1$ and $k_2$ can be any periodic functions with period $1$.

Therefore:

$$ g(x)=\cases{x^\gamma k_1(\log_\alpha(x)) & if $x>0$ \cr |x|^\gamma k_2(\log_\alpha(-x)) & if $x<0$ } $$
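A numerical sketch of this general form for $\beta>0$ (the choices $\alpha=2$, $\gamma=1.5$ and the particular $1$-periodic functions below are arbitrary; $|x|^\gamma$ is used on the negative axis to keep the power real):

```python
import math

# Numerical check of the claimed general solution for beta > 0:
# g(x) = x^gamma * k1(log_alpha x) for x > 0,
# g(x) = |x|^gamma * k2(log_alpha(-x)) for x < 0, with k1, k2 1-periodic.
alpha = 2.0
gamma = 1.5
beta = alpha ** gamma          # beta = alpha^gamma by definition of gamma

k1 = lambda t: 2.0 + math.sin(2 * math.pi * t)   # any 1-periodic function
k2 = lambda t: 1.0 + math.cos(2 * math.pi * t)   # another 1-periodic function

def g(x):
    if x > 0:
        return x ** gamma * k1(math.log(x, alpha))
    return abs(x) ** gamma * k2(math.log(-x, alpha))

for x in [0.1, 0.7, 3.0, -0.2, -5.0]:
    assert math.isclose(g(alpha * x), beta * g(x), rel_tol=1e-9)
```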


If $\beta<0$, $\beta=-\alpha^\gamma$. We use the same definition for $h$, $k_1$ and $k_2$, but now we have: $k_1(x+1)=-k_1(x)$ and $k_2(x+1)=-k_2(x)$.

So $k_1$ and $k_2$ can be any antiperiodic functions with antiperiod $1$.

And:

$$g(x)=\cases{x^\gamma k_1(\log_\alpha(x)) & if $x>0$ \cr |x|^\gamma k_2(\log_\alpha(-x)) & if $x<0$ }$$
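The same kind of numerical sketch for $\beta<0$, with $k(t)=\cos(\pi t)$ as the antiperiodic function ($\cos(\pi(t+1))=-\cos(\pi t)$); the choices $\alpha=2$, $\gamma=1$, and taking $k_1=k_2=k$ for brevity, are arbitrary:

```python
import math

# Numerical check for beta < 0: beta = -alpha^gamma and k antiperiodic
# with antiperiod 1. Using the same k on both half-lines for brevity.
alpha = 2.0
gamma = 1.0
beta = -alpha ** gamma

k = lambda t: math.cos(math.pi * t)   # k(t + 1) = -k(t)

def g(x):
    return abs(x) ** gamma * k(math.log(abs(x), alpha))

for x in [0.3, 1.0, 4.2, -0.8, -2.5]:
    assert math.isclose(g(alpha * x), beta * g(x), rel_tol=1e-9, abs_tol=1e-12)
```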


If $g \in \mathcal C^n$, then $k_1$, $k_2 \in \mathcal C^n$.

If $n\geq\gamma$, then in a neighborhood of $0^+$: $$g(x)=\sum\limits_{k=0}^{n}\frac{g^{(k)}(0)}{k!}x^k + o(x^n)= x^\gamma k_1(\log_\alpha(x))$$ $$k_1(\log_\alpha(x))=\sum\limits_{k=0}^{n}\frac{g^{(k)}(0)}{k!}x^{k-\gamma} + o(x^{n-\gamma})$$

So $k_1(\log_\alpha(x))$ has a limit (possibly infinite) at $0^+$, hence $k_1$ has a limit at $-\infty$. But $k_1$ is periodic with period at most $2$ (an antiperiodic function of antiperiod $1$ is $2$-periodic), and a periodic function with a limit at $-\infty$ must be constant, with a finite limit; in the antiperiodic case the constant is necessarily $0$. The same reasoning applies to $k_2$. So $$g(x)=\cases{c_1x^\gamma & \text{if } x>0 \cr c_2|x|^\gamma & \text{if } x<0 }$$

Therefore all the derivatives of $g$ at $0$ of order smaller than $\gamma$ vanish. Moreover, if $\gamma$ were not an integer, $g(x)=c_1x^\gamma$ would not be $\lceil \gamma\rceil$ times differentiable at $0$, contradicting $g\in\mathcal C^n$ with $n\geq\lceil\gamma\rceil$ (and if $\gamma<0$, continuity at $0$ already forces $c_1=c_2=0$). So $\gamma$ is a nonnegative integer, and matching the $\gamma$-th derivative of $g$ on both sides of $0$ leaves a single constant $\lambda$ with $g(x)=\lambda x^{\gamma}$ on $\Bbb R$.


To sum it up:

In general:

$$g(x)=\cases{x^\gamma k_1(\log_\alpha(x)) & if $x>0$ \cr |x|^\gamma k_2(\log_\alpha(-x)) & if $x<0$ } $$

Where $k_1$ and $k_2$ are two periodic (if $\beta>0$) or antiperiodic (if $\beta<0$) functions, of period $1$.

If $g\in\mathcal C^{n}$, so are $k_1$ and $k_2$. And if $n\geq\gamma$, then $\gamma$ must be a nonnegative integer and $g(x)=\lambda x^{\gamma}$ for some constant $\lambda$.

If there are any mistakes, please let me know.


Assume $f$ (and $g$) is $\mathcal C^{\infty}$ (we will see that $\mathcal C^p$ with $p$ large enough is sufficient to get the same result).

Differentiating the equality $g(\alpha x) = \beta g(x)$ $k$ times gives $\alpha^k g^{(k)}(\alpha x) = \beta g^{(k)}(x)$.

Let's assume first that $\alpha, \beta > 1$. Replacing $x$ by $x/\alpha$ in the last equality and iterating $n$ times yields

$$g^{(k)}(x) = \frac{\beta}{\alpha^k}g^{(k)}\left(\frac x{\alpha}\right) = \ldots = \left(\frac{\beta}{\alpha^k}\right)^n g^{(k)}\left(\frac x{\alpha^n}\right) \tag{$\phi$}$$

Now let's pick $k = \left\lceil \frac{\ln \beta}{\ln \alpha} \right\rceil$, so that $\frac{\beta}{\alpha^k} \le 1$ and $\frac{\beta}{\alpha^{k-1}} > 1$. Since $g^{(k)}$ is continuous at $0$, taking the limit in $(\phi)$ as $n \to \infty$ forces $g^{(k)}(x) = 0$ for all $x$ if $\frac{\beta}{\alpha^k} < 1$, and $g^{(k)}(x) = g^{(k)}(0)$ (a constant) if $\beta = \alpha^k$. Either way, $g$ must be a polynomial of degree $\le k$.

If we replace $k$ with $p \le k-1$ in $(\phi)$, taking the limit as $n \to \infty$ now forces $g^{(p)}(0) = 0$, since $\frac{\beta}{\alpha^{p}} \ge \frac{\beta}{\alpha^{k-1}} > 1$ while the left-hand side stays fixed. That implies that $g$ is of the form $g(x) = \lambda x^k$. But $g(\alpha x) = \lambda \alpha^k x^k = \beta g(x) = \beta \lambda x^k$ forces $\beta = \alpha^k$ (unless $\lambda = 0$).
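The differentiated relation can be checked numerically on the question's $\mathcal C^2$ example, $g(x)=x^2$ with $\alpha=\sqrt2$ and $\beta=\alpha^2=2$ (so $k=2$ here), using a central second difference as a stand-in for $g''$:

```python
import math

# Finite-difference illustration of alpha^k g^(k)(alpha x) = beta g^(k)(x)
# for g(x) = x^2, alpha = sqrt(2), beta = alpha^2 = 2 (so k = 2).
alpha = math.sqrt(2)
beta = alpha ** 2
g = lambda x: x * x

def d2(fun, x, h=1e-4):
    # central second-difference approximation of the second derivative
    return (fun(x + h) - 2 * fun(x) + fun(x - h)) / (h * h)

for x in [-1.0, 0.5, 2.0]:
    assert math.isclose(alpha**2 * d2(g, alpha * x), beta * d2(g, x), rel_tol=1e-6)
```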

Now if we assume $\alpha < 1$ and $0 < \beta < 1$, it is obvious that $(\Xi)$ is equivalent to $g(\tfrac1{\alpha}x) = \tfrac1{\beta}g(x)$, and since $\tfrac1{\alpha}, \tfrac1{\beta} > 1$, we are back in the former case.

Wrap-up: if $\alpha, \beta>1$ or $\alpha, \beta<1$, writing $k = \frac{\ln \beta}{\ln \alpha}$, $(\Xi)$ has nonconstant $\mathcal C^{\lceil k \rceil}$ solutions only if $k \in \Bbb N$, and in this case the solutions are of the form $f(x) = e^{\lambda x^k}$.

Let's now assume $\alpha >1$ and $0 < \beta <1$, with $f$ (and $g$) merely $\mathcal C^0$ (continuous). Rewriting $(\Xi)$ as $g(x) = \beta g(\tfrac x{\alpha})$ and iterating, we get $g(x) = \beta^n g(\tfrac x{\alpha^n})$ for all $n$; since $\beta^n \to 0$ and, by continuity of $g$ at $0$, $g(\tfrac x{\alpha^n}) \to g(0)$, this gives $g(x) = 0$. As earlier, the case $\alpha < 1$ and $\beta > 1$ reduces to this one.
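The contraction argument can be illustrated numerically: for any bounded $g$, the iterates $\beta^n g(x/\alpha^n)$ collapse to $0$ (the particular bounded function and the point $x=3$ below are arbitrary):

```python
import math

# Illustrate the contraction argument for alpha > 1, 0 < beta < 1:
# beta^n * g(x / alpha^n) -> 0 for any bounded g, so a continuous
# solution must satisfy g == 0, i.e. f == 1.
alpha, beta = 2.0, 0.5
g = lambda x: math.sin(x) + 2.0   # any bounded function

x = 3.0
vals = [beta**n * g(x / alpha**n) for n in range(60)]
assert abs(vals[-1]) < 1e-15      # the iterates vanish
```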

Wrap-up 2: if $\alpha >1, 0<\beta<1$ or $\alpha <1, \beta>1$, the only $\mathcal C^0$ (indeed, the only bounded) solution to $(\Xi)$ is $f \equiv 1$.

