Is g with $\cosh(xg(x))=x\cosh(g(x))$ decreasing?


Let $g:\mathbb{R}_{>0} \to \mathbb{R}_{>0}$ be defined implicitly by $\cosh(xg(x))=x\cosh(g(x))$ and $g(1)\sinh(g(1))=\cosh(g(1))$. How to show that $g$ is differentiable? Furthermore, is it true that $g$ is monotone decreasing?

Assuming that $g$ is differentiable, I showed that $g'\mid_{]0,1[}<0$ is equivalent to $g\mid_{]1,\infty[}<g(1)$ but I wasn't able to show much more.
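For what it's worth, the equation is easy to probe numerically. The sketch below (my own illustration, not part of the question) solves $\cosh(xg)=x\cosh(g)$ for a few values of $x$ by bisection; this is justified because $F(y)=\cosh(xy)-x\cosh(y)$ is strictly monotone in $y$ for fixed $x\neq 1$ (its $y$-derivative is $x(\sinh(xy)-\sinh(y))$, of constant sign). The printed values decrease, in line with the conjecture.

```python
import math

def g(x):
    """Solve cosh(x*y) = x*cosh(y) for y > 0 by bisection (requires x != 1).

    F(y) = cosh(x*y) - x*cosh(y) is strictly monotone in y: it goes from
    1 - x at y = 0 to +inf for x > 1 (and to -inf for 0 < x < 1), so the
    root is unique and a sign-change bracket always exists.
    """
    F = lambda y: math.cosh(x * y) - x * math.cosh(y)
    a, b = 1e-12, 1.0
    while F(a) * F(b) > 0:   # expand the bracket until a sign change
        b *= 2.0
    for _ in range(200):     # plain bisection
        m = 0.5 * (a + b)
        if F(a) * F(m) <= 0:
            b = m
        else:
            a = m
    return 0.5 * (a + b)

for x in (0.5, 0.8, 1.5, 2.0, 4.0):
    print(x, g(x))
```

Incidentally, uniqueness of the root gives the identity $g(1/x)=x\,g(x)$ (if $y$ solves the equation for $x$, then $xy$ solves it for $1/x$), which the solver reproduces.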


There are 4 best solutions below

---

Consider the function $f(x,y) = \cosh(xy) - x \cosh(y)$. Then $g$ is the implicit function defined by $f(x,g(x)) = 0$. Now you have to show that $D_2 f(x,g(x)) \neq 0$ to apply the implicit function theorem; locally you can then conclude that $g$ is differentiable, and using the formula $g'(x) = -[D_2 f(x,g(x))]^{-1} D_1 f(x,g(x))$ you can check whether $g'(x) < 0$.

I've not done all calculations, so this is only a rough hint.
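To make the hint concrete, here is a small numerical sanity check (my own sketch, assuming bisection is an acceptable way to evaluate $g$): it compares the implicit-function-theorem slope $-D_1 f/D_2 f$ at a sample point on the curve with a central finite difference of the solver.

```python
import math

def solve_y(x):
    # bisection for cosh(x*y) = x*cosh(y), y > 0 (x != 1); the residual
    # F is strictly monotone in y, so the root is unique
    F = lambda y: math.cosh(x * y) - x * math.cosh(y)
    a, b = 1e-12, 1.0
    while F(a) * F(b) > 0:
        b *= 2.0
    for _ in range(100):
        m = 0.5 * (a + b)
        a, b = (a, m) if F(a) * F(m) <= 0 else (m, b)
    return 0.5 * (a + b)

x0 = 2.0
y0 = solve_y(x0)
# partial derivatives of f(x,y) = cosh(xy) - x*cosh(y)
D1 = y0 * math.sinh(x0 * y0) - math.cosh(y0)
D2 = x0 * math.sinh(x0 * y0) - x0 * math.sinh(y0)
ift = -D1 / D2                                        # IFT slope g'(x0)
h = 1e-6
fd = (solve_y(x0 + h) - solve_y(x0 - h)) / (2 * h)    # central difference
print(ift, fd)
```

Both numbers agree and are negative at $x_0=2$, consistent with $g$ decreasing there.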

---

Here, in an approach complementary to the "classical tools" of analysis that @eldering has used, I propose a geometric view of the defining relation of $g$, which provides a graphically intuitive proof of the fact that $g$ is a decreasing function.

Let us begin with the observation that for any $x,y >0$ :

$$\cosh(xy) = x \cosh(y) \ \ \Leftrightarrow \ \ \dfrac{y}{\cosh(y)}=\dfrac{xy}{\cosh(xy)} \ \ \ (1)$$

(divide $xy$ by both sides of the equality on the left). Thus, if we set $\varphi(t):=\dfrac{t}{\cosh(t)}$, (1) can be written

$$\varphi(y)=\varphi(xy) \ \ \ (2)$$

The curve of $\varphi$ is strictly increasing from the origin up to a maximum $M(1.1997, 0.6627)$, then decreases asymptotically to $0$ as $t \rightarrow \infty$. Hence every level $L \in (0, 0.6627)$ is attained at exactly two abscissae $t_1 < 1.1997 < t_2$, one on each branch. Relation (2) says that $y$ and $xy$ lie at the same level; for $x>1$ this forces $y=t_1$ and $xy=t_2$, hence $x=t_2/t_1$. As the level $L$ decreases from the maximum to $0$, $t_1$ decreases and $t_2$ increases, so the ratio $t_2/t_1$ increases continuously from $1$ to $\infty$; each $x>1$ therefore corresponds to a unique level, and $g(x):=t_1$ is well defined. The case $0<x<1$ is symmetric, with $y=t_2$, $xy=t_1$ and $x=t_1/t_2$.

Now take $1<x_1<x_2$. The level attached to $x_2$ is lower than the one attached to $x_1$, so

$$1<x_1<x_2 \ \Rightarrow \ y_1=t_1(L_1)>t_1(L_2)=y_2>0,$$

proving that $g$ is decreasing on $]1,\infty[$. For $0<x_1<x_2<1$, the ratio $t_1/t_2$ increases with the level, so $L_1<L_2$ and $y_1=t_2(L_1)>t_2(L_2)=y_2$: $g$ is also decreasing on $]0,1[$. Finally, $g>1.1997$ on $]0,1[$, $g(1)=1.1997$ and $g<1.1997$ on $]1,\infty[$, so $g:x \rightarrow y$ is decreasing on all of $\mathbb{R}_{>0}$.

The curve of $\varphi(t):=\dfrac{t}{\cosh(t)}$ used to define the function $g$: *(figure omitted)*
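The level-set construction above can be turned into a short numerical sketch (my own illustration): pick a level $L$ below the maximum of $\varphi$, find the two abscissae $t_1<t_2$ at that level, one on each branch, and read off $x=t_2/t_1$, $g(x)=t_1$.

```python
import math

phi = lambda t: t / math.cosh(t)

def bisect(f, a, b, n=100):
    # generic bisection; assumes f changes sign on [a, b]
    for _ in range(n):
        m = 0.5 * (a + b)
        a, b = (a, m) if f(a) * f(m) <= 0 else (m, b)
    return 0.5 * (a + b)

# maximum of phi: phi'(t) = 0  <=>  t*tanh(t) = 1, i.e. coth(t) = t
tmax = bisect(lambda t: t * math.tanh(t) - 1.0, 0.1, 3.0)

pairs = []
for L in (0.66, 0.6, 0.5, 0.4, 0.3):            # levels below phi(tmax) ~ 0.6627
    t1 = bisect(lambda t: phi(t) - L, 1e-9, tmax)   # rising branch
    t2 = bisect(lambda t: phi(t) - L, tmax, 50.0)   # falling branch
    x, y = t2 / t1, t1                               # x = t2/t1 > 1, g(x) = t1
    # the pair really solves the original equation (relative check)
    assert abs(math.cosh(x * y) - x * math.cosh(y)) < 1e-9 * math.cosh(x * y)
    pairs.append((x, y))
print(tmax, pairs)
```

As the level drops, $x$ grows and $y$ shrinks, which is exactly the claimed monotonicity.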

---

Here's an answer, if we suppose continuity of $g$ at $1$. Also, I assume that you have already checked that $g$ is well-defined, i.e. that for every $x\neq 1$ there is a unique $g(x)$ fulfilling the equality. If not, consider the functions $$ \cosh ay - a\cosh y, \quad y>0 $$ with parameter $a\neq 1$ and examine their derivative and their limits at $0$ and at $\infty$; this is pretty straightforward. You might need to include it in your solution, depending on the formulation of the original problem.

Back to the point. By the implicit function theorem, $g$ is $C^\infty$ for $x\neq 1$. The derivative is (I often write $g$ instead of $g(x)$ for brevity) $$ g'(x) = \frac {\cosh g - g\sinh xg} {x(\sinh xg - \sinh g)} = \frac {\sinh xg}{x^2} \cdot \frac{\coth xg - xg} {\sinh xg - \sinh g} $$ The first fraction is always positive. For the second:

  • $\sinh xg - \sinh g > 0$ for $x>1$, and $<0$ for $x<1$;
  • $\coth xg - xg > 0$ for $x$ such that $xg < g(1)$, and respectively $<0$, $=0$ for $xg > g(1)$, $xg = g(1)$, because $g(1)$ is the solution to $\coth y = y$ and $\coth y - y$ is decreasing.
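These sign facts are easy to confirm numerically; the sketch below (assuming a bisection evaluation of $g$, which is my own device) computes $g(1)$ as the root of $\coth y = y$ and checks both bullet points for a few $x>1$.

```python
import math

def bisect(f, a, b, n=100):
    # generic bisection; assumes f changes sign on [a, b]
    for _ in range(n):
        m = 0.5 * (a + b)
        a, b = (a, m) if f(a) * f(m) <= 0 else (m, b)
    return 0.5 * (a + b)

# g(1) is the root of coth(y) = y, equivalently y*sinh(y) - cosh(y) = 0
g1 = bisect(lambda y: y * math.sinh(y) - math.cosh(y), 0.5, 2.0)
print(g1)  # the tangency value, about 1.19968

def g(x):
    # cosh(x*y) - x*cosh(y) is strictly monotone in y for x != 1,
    # so bisection on a sign-change bracket finds the unique root
    return bisect(lambda y: math.cosh(x * y) - x * math.cosh(y),
                  1e-9, 60.0 / max(x, 1.0))

for x in (1.2, 2.0, 5.0):
    y = g(x)
    assert math.sinh(x * y) > math.sinh(y)        # first bullet, x > 1
    assert 1.0 / math.tanh(x * y) < x * y         # coth(xg) < xg, i.e. xg > g(1)
```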

Consider $x>1$. Then $g'(x)<0$ if $xg(x)>g(1)$ and vice versa. We prove that $g'(x)<0$ for $x>1$ in two steps.

  1. If $g'(x_0)<0$, then $g'(x)<0$ for $x>x_0$.

Let's take the longest interval possible with $g'<0$: $$ z:= \sup \{y: g'(x)< 0 \text{ for } x\in [x_0, y)\} $$ If $z=\infty$, then we're done here. Suppose not: then $g'(z)=0$, because $g'$ is continuous. Therefore

  • $xg(x)>g(1)$, for $x\in [x_0, z)$,
  • $zg(z)=g(1)$,

so the function $xg(x)$ is decreasing at least just before $z$. But let's calculate the derivative: $$ (xg(x))'\mid_{x=z} = (g(x) + xg'(x))\mid_{x=z} = g(z) + 0 = g(z)>0, $$ and we arrive at a contradiction.

This could be reasoned more neatly using the mean value theorem: $(xg(x))'\mid_{x=z} > 0$, so $(xg(x))' >0$ on some $(z',z)$ with $z'>x_0$, which contradicts the theorem for $xg(x)$ on $(z',z)$.

Edit: It follows that $g$ on $(1, \infty)$ is either decreasing or increasing up to some point and then decreasing. This means that $g$ has right-sided limit in $1$, but possibly infinite. Similarly for $(0,1)$. If you use remark 3. below, it turns out that to show continuity, it suffices for $g$ and $g'$ to be bounded around $1$.

  2. If $g'(x_0)>0$, then $g'(x)<0$ for some $x\in(1,x_0)$.

Then $x_0 g(x_0) <g(1)$, so $g(x_0) < g(1)/x_0 < g(1)$. By the mean value theorem, there is $x\in(1,x_0)$, for which $g'(x) = (g(x_0) - g(1))/(x_0-1) < 0$. (This is where I use continuity.)

So if there was $g'(x_0)>0$, then $g'(x)<0$ for some $x\in(1,x_0)$, but then by (1.) $g'(x_0)<0$.

Similarly, it can be shown that for $x_0\in (0,1)$:

  1. If $g'(x_0)<0$, then $g'(x)<0$ for $x<x_0$.
  2. If $g'(x_0)>0$, then $g'(x)<0$ for some $x\in(x_0,1)$.

The proof is finished.

As for the continuity:

  1. For what it's worth, Wolfram|Alpha shows that it is continuous: solution of $\cosh(xy) = x \cosh y$ in $(0,2)$, solution around $(1,g(1))$ (some artifacts arise in these pictures, but I don't think we should be worried).
  2. Maybe start with showing that $g$ is bounded around $1$ and then that any sequence $g(x_n)$ for $x_n\to 1$ has subsequence convergent to $g(1)$; then this must be the limit.
  3. Try to examine the function $g(1+x)$ near $x=0$, i.e. start with the equality $\cosh ((1+x)g(1+x)) = (1+x)\cosh(g(1+x))$. Using the formula for $\cosh(x+y)$, I transformed it into: $\coth(g(1+x)) = \frac {-\sinh(xg(1+x))}{\cosh(xg(1+x)) - 1 - x}$. Then, using L'Hôpital's rule, I got that if $l = \lim_{x\to 0} g(1+x)$ exists and $g'(1+x)$ is bounded around $0$, then it satisfies $\coth(l)= l$.
  4. What put me on the right track was approximating $\cosh y$ with $e^y$ for large $y$. This doesn't work well around $1$, but maybe some finer approximation? Here's what I got:
    Since $\frac 1 2 e^y < \cosh y < e^y$, for $x>1$: \begin{gather} \frac 1 2 e^{xg} < xe^g\\ xg < \log 2 + \log x + g\\ g < \frac {\log 2x}{x-1}. \end{gather} On the other hand: \begin{gather} e^{xg} > \frac 1 2 xe^g\\ xg > \log x - \log 2 + g\\ g > \frac {\log x/2}{x-1}. \end{gather} Unfortunately, as $x\to 1$ these bounds tend to $+\infty$ and $-\infty$ respectively, so they say nothing about the limit.
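The two-sided bound can be checked numerically for $x>1$ (a quick sketch of my own, with a bisection evaluation of $g$):

```python
import math

def g(x):
    # bisection for cosh(x*y) = x*cosh(y), y > 0, used here for x > 1, where
    # F(y) = cosh(x*y) - x*cosh(y) increases from 1 - x < 0 to +infinity
    F = lambda y: math.cosh(x * y) - x * math.cosh(y)
    a, b = 1e-12, 1.0
    while F(b) < 0:          # expand until the sign changes
        b *= 2.0
    for _ in range(100):
        m = 0.5 * (a + b)
        a, b = (a, m) if F(m) >= 0 else (m, b)
    return 0.5 * (a + b)

for x in (1.5, 2.0, 5.0, 20.0):
    lo = math.log(x / 2) / (x - 1)   # from e^{xg} > (x/2) e^{g}
    hi = math.log(2 * x) / (x - 1)   # from (1/2) e^{xg} < x e^{g}
    assert lo < g(x) < hi
    print(x, lo, g(x), hi)
```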
---

This is a global problem, insofar as we also have to establish the existence of the function $$y=g(x)\qquad(x>0)\tag{1}$$ solving the given equation $$f(x,y):=\cosh (xy)-x\cosh y=0\qquad(x>0, \ y>0)\ .\tag{2}$$ Note that the equation $(2)$ is solved not only by some "interesting" function $(1)$, but also by all points on the half-line $y\mapsto(1,y)$ $(y>0)$. As a consequence $g(1)$ is only obtained via some exception handling.

Consider the family of (half-)catenaries $$\gamma_x:\quad u={1\over x}\cosh (x\,y)\qquad(x>0)$$ in the first quadrant of an auxiliary $(y,u)$-plane. These $\gamma_x$ are all similar to one another and similarly situated with respect to the origin. They are all tangent to a certain line $u=my$, which is at the same time the envelope of the family. The following figure shows three of these catenaries; a larger $x$ corresponds to a narrower catenary.

*(figure: three catenaries of the family and their common tangent line $u=my$; omitted)*

Inspection of the figure shows that any two different catenaries $\gamma_a$, $\gamma_b$ intersect transversally in exactly one point, which is obtained by solving $${1\over a}\cosh(ay)={1\over b}\cosh(by)\tag{3}$$ for $y$. In addition the implicit function theorem, applied to $(3)$, guarantees that for $a_0\ne b_0$ the obtained $y$ is a differentiable function of $(a,b)$ in the neighborhood of $(a_0,b_0)$.

Now the given equation $(2)$ means that we should intersect the catenary $\gamma_x$ with the "reference catenary" $\gamma_1$. Solving $(3)$ with $(a,b):=(x_0,1)$ gives a unique $y_0=:g(x_0)$; furthermore the function $x\mapsto g(x)$ thus constructed is undefined at $x=1$, and is differentiable in ${\mathbb R}_{>0}\setminus\{1\}$. But there is more in the above figure! Assume that the red catenary is $\gamma_1$, and that $1<x_1<x_2$. Then the blue catenary is $\gamma_{x_1}$, and the green catenary is $\gamma_{x_2}$. From the figure it is then obvious that $g(x_2)<g(x_1)$. Similarly: If we let the green catenary be $\gamma_1$ and assume that $1>x_1>x_2$ then we can see that $g(x_1)<g(x_2)$. This shows that $g$ is monotonically decreasing both on $]0,1[$ and on $]1,\infty[$.

It remains to consider the value $x=1$. Interpreting the figure dynamically we see that when $x\to1$ the point $\gamma_x\wedge\gamma_1$ converges to the point where $\gamma_1$ touches the envelope $u=my$. Doing the calculations one should find the value $g(1)$ defined in the question.
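The limiting behaviour at $x=1$ is easy to see numerically (a sketch of my own with a home-made bisection solver): evaluating $g$ slightly above and below $1$ gives values approaching the tangency value $g(1)\approx 1.19968$, the root of $\coth y = y$.

```python
import math

def g(x):
    # bisection for cosh(x*y) = x*cosh(y), y > 0; the residual is strictly
    # monotone in y for every fixed x != 1, so the root is unique
    F = lambda y: math.cosh(x * y) - x * math.cosh(y)
    a, b = 1e-12, 1.0
    while F(a) * F(b) > 0:
        b *= 2.0
    for _ in range(100):
        m = 0.5 * (a + b)
        a, b = (a, m) if F(a) * F(m) <= 0 else (m, b)
    return 0.5 * (a + b)

for eps in (1e-2, 1e-3, 1e-4):
    print(g(1 + eps), g(1 - eps))   # both columns approach ~1.19968
```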