Given $f>0$ can we find $g$ such that $\int g>0$ but $\int fg\le 0$


My question is inspired by this one : Integral of the multiplication of two funtions

I wanted to find a counterexample when $f>0$, since we have all the latitude we want for $g$.

When $f$ vanishes or is negative on a small interval, it is easier to find a counterexample by making $g^+,g^-,g^0$ disagree with the values of $f$.


So here is the precise question :

  • given a continuous function $f$ such that $f>0$ and $\displaystyle 0<\int_0^\infty f(t)dt<+\infty$
  • can we find $g$ such that $\displaystyle 0<\int_0^\infty g(t)dt<+\infty$ but $\displaystyle \int_0^\infty f(t)g(t)dt\le 0$ ?

There are no special requirements on $g$: it can be piecewise defined, and we can smooth it later.

My attempt at a solution is given below.

On BEST ANSWER

Sketch: Since $\int_0^\infty f(t)dt<+\infty$ and $f>0$, the set $[f\leq 1/n]$ has infinite measure for each $n$ (indeed $[f>1/n]$ has finite measure, because $\int_0^\infty f\geq \frac{1}{n}\big|[f>1/n]\big|$). So for any $n$ we can find a closed set $S_n$, disjoint from $[0,1]$, of measure $|S_n|=1$, such that $f\leq 1/n$ on $S_n$.

Let $m>0$ be the minimum of $f$ on $[0,1]$ and define $g(x)= -\frac{2}{m},\,\forall x \in [0,1]$. Then $\int_{0}^{1}fg \leq -2$.

Now define $g=\sqrt{n}$ on $S_n$ and zero elsewhere.

Claim: We can find $n$ such that $\int_{0}^{\infty}fg \leq -1$ and $0<\int_{0}^{\infty}g <\infty$.

Proof: First notice that $f\leq 1/n$ and $g=\sqrt{n}$ on $S_n$ with $|S_n|=1$, so $\int_{0}^{\infty}fg =\int _{0}^{1}fg +\int_{S_n}fg\leq -2+\frac{\sqrt{n}}{n}=-2+\frac{1}{\sqrt{n}}$

Also, $\int_{0}^{\infty}g =\int _{0}^{1}g +\int_{S_n}g=-\frac{2}{m}+\sqrt{n}$. Any $n$ with $\sqrt{n}>\frac{2}{m}$ therefore works. $\blacksquare$
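As a sanity check (my addition, not part of the answer), the construction can be carried out explicitly for the hypothetical choice $f(x)=e^{-x}$: here $[f\leq 1/n]=[\ln n,+\infty)$, so $S_n=[\ln n,\ln n+1]$ works for $n\geq 3$, and $m=e^{-1}$ on $[0,1]$.

```python
import math

# Sketch of the construction above for the assumed choice f(x) = exp(-x),
# using closed-form values of the integrals.
n = 36                       # chosen so that sqrt(n) = 6 > 2/m = 2e
m = math.exp(-1)             # minimum of f on [0, 1]

# g = -2/m on [0, 1], g = sqrt(n) on S_n = [ln n, ln n + 1], 0 elsewhere.
int_f_01 = 1 - math.exp(-1)          # integral of e^{-x} over [0, 1]
int_f_Sn = (1 - math.exp(-1)) / n    # integral of e^{-x} over S_n

int_fg = (-2 / m) * int_f_01 + math.sqrt(n) * int_f_Sn
int_g = (-2 / m) + math.sqrt(n)

print(int_fg, int_g)
```

With these values, $\int fg\approx-3.33\leq-1$ while $\int g\approx 0.56>0$, matching the claim.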


Here is my attempt :

$\displaystyle F(x)=\int_0^x f(t)\;dt$ is a continuous, strictly increasing function (since $f>0$) ranging between $0$ and $\displaystyle I=\int_0^\infty f(t)dt$.

Suppose $\frac{F(x)}{x}$ were constant on $\mathbb R^{+*}$; then there would exist some $a>0$ such that $F(x)=ax$.

But then $f(x)=F'(x)=a$, and $\displaystyle \int_0^\infty f(x)dx=+\infty$, a contradiction.

Thus there exists $(\alpha,\beta)\in(\mathbb R^{+*})^2$ such that

$\frac{F(\alpha)}{\alpha}\neq\frac{F(\beta)}{\beta}\tag{E}$


Assume $\alpha<\beta$ and let's define $g$ piecewise constant as follows:

$\begin{cases} \forall x\in[0,\alpha[, & g(x)=c \\ \forall x\in[\alpha,\beta[, & g(x)=-1 \\ \forall x\in[\beta,+\infty[, & g(x)=0 \end{cases}$

We want $\displaystyle \int_0^\infty f(x)g(x)dx= \int_0^{\alpha}c\,f(x)dx-\int_{\alpha}^{\beta}f(x)dx+\int_{\beta}^{\infty} 0\cdot f(x)dx = cF(\alpha)-\big(F(\beta)-F(\alpha)\big)\le 0$.

But we also want $\displaystyle \int_0^\infty g(x)dx=c\alpha-(\beta-\alpha)>0$.

This is possible if $\quad\big(\frac{\beta-\alpha}{\alpha}\big)<c\le\big(\frac{F(\beta)-F(\alpha)}{F(\alpha)}\big)$

The double condition on $c$ requires that $\frac{F(\alpha)}{\alpha}<\frac{F(\beta)}{\beta}$
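As an illustration (my addition, not part of the argument), this first case occurs for instance for the hypothetical choice $f(x)=x e^{-x}$, for which $F(x)=1-(1+x)e^{-x}$ and $\frac{F(1)}{1}<\frac{F(2)}{2}$:

```python
import math

# First case of the construction, for the assumed f(x) = x*exp(-x):
# F(x) = 1 - (1 + x)*exp(-x), and F(1)/1 < F(2)/2 holds, so
# g = c on [0, alpha), g = -1 on [alpha, beta), g = 0 afterwards.
def F(x):
    return 1 - (1 + x) * math.exp(-x)

alpha, beta = 1.0, 2.0
u = (beta - alpha) / alpha               # lower bound for c
v = (F(beta) - F(alpha)) / F(alpha)      # upper bound for c
c = (u + v) / 2                          # valid since u < v here

int_fg = c * F(alpha) - (F(beta) - F(alpha))   # integral of f*g
int_g = c * alpha - (beta - alpha)             # integral of g
print(int_fg, int_g)
```

Numerically $u=1$, $v\approx 1.248$, and the chosen $c$ gives $\int fg\le 0$ with $\int g>0$.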

In the opposite case, we reverse the signs of $g$ on the two intervals:

$\begin{cases} \forall x\in[0,\alpha[, & g(x)=-c \\ \forall x\in[\alpha,\beta[, & g(x)=1 \\ \forall x\in[\beta,+\infty[, & g(x)=0 \end{cases}$

The condition on $c$ becomes $\quad\big(\frac{F(\beta)-F(\alpha)}{F(\alpha)}\big)\le c < \big(\frac{\beta-\alpha}{\alpha}\big)$

And it requires that $\frac{F(\alpha)}{\alpha}>\frac{F(\beta)}{\beta}$


But according to $(E)$ one of these two situations always occurs, and choosing a convenient $c$ is not an issue, since $c=\frac{u+v}{2}$ satisfies $u<c<v$ whenever $u<v$.

So we have succeeded in building a function $g$ that answers the question.
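To double-check, here is a numerical sketch of the second case under the assumed choice $f(x)=e^{-x}$, i.e. $F(x)=1-e^{-x}$, with $\alpha=1$, $\beta=2$ (so $\frac{F(\alpha)}{\alpha}>\frac{F(\beta)}{\beta}$):

```python
import math

# Second case of the construction, for the assumed f(x) = exp(-x):
# F(x) = 1 - exp(-x), and F(1)/1 > F(2)/2, so
# g = -c on [0, alpha), g = 1 on [alpha, beta), g = 0 afterwards.
def F(x):
    return 1 - math.exp(-x)

alpha, beta = 1.0, 2.0
u = (F(beta) - F(alpha)) / F(alpha)   # lower bound for c
v = (beta - alpha) / alpha            # upper bound for c
c = (u + v) / 2                       # valid since u < v here

int_fg = -c * F(alpha) + (F(beta) - F(alpha))   # integral of f*g
int_g = -c * alpha + (beta - alpha)             # integral of g
print(int_fg, int_g)
```

Both requirements hold: $\int fg\approx-0.20\leq 0$ and $\int g\approx 0.32>0$, so this concrete $g$ answers the question for that $f$.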