Can $\log(1-U)-\log(U)+W$ be normally distributed, with $U$ uniform on $(0,1)$ and $W$ independent of $U$?


Assume that $U$ and $V$ are independent random variables with values in $(0,1)$ and that $U$ is uniformly distributed. Can it happen that $$L=\log\left(\frac{(1-U)V}{U(1-V)}\right)$$ is normally distributed?

As a motivation, note that $L$ is the log odds ratio of two binary random variables with Bernoulli distributions of random parameters $U$ and $V$, and that the question above arose from discussions here, where it was suggested that no such distribution of $V$ exists.

This can also be formulated in terms of PDFs or in terms of characteristic functions. First, computing the PDF of $\log((1-U)/U)$, one arrives at the equivalent formulation:

In terms of PDFs: Consider some random variable $X$ with PDF $$f_X(x)=\frac{e^x}{(e^x+1)^2}$$ on the real line, does there exist any random variable $Y$ independent of $X$ such that $$Z=X+Y$$ is normally distributed?
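(One way to see this: with $X=\log((1-U)/U)$, for every real $x$, $$P(X\leqslant x)=P\left(U\geqslant\frac1{1+e^x}\right)=\frac{e^x}{1+e^x}$$ and differentiating gives the stated PDF, which is that of the standard logistic distribution.)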

Finally, the characteristic function of $X$ is $$\varphi_X(t)=E(e^{itX})=\frac{\pi t}{\sinh(\pi t)}$$ hence one is also asking the following:

In terms of characteristic functions: Determine if there exists any positive $v$ such that $g_v$ is a characteristic function, where $$g_v(t)=\frac{\sinh(t)}t\,e^{-vt^2}$$
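(To see where $g_v$ comes from: if $Z=X+Y$ is normal with, say, mean $0$ and variance $\sigma^2$, then $$\varphi_Y(t)=\frac{\varphi_Z(t)}{\varphi_X(t)}=e^{-\sigma^2t^2/2}\,\frac{\sinh(\pi t)}{\pi t}$$ hence $g_v(t)=\varphi_Y(t/\pi)$ is the characteristic function of $Y/\pi$, with $v=\sigma^2/(2\pi^2)$; conversely, if $g_v$ is a characteristic function for some positive $v$, one recovers a suitable $Y$ by scaling.)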

Expansions at $t=0$ show that $g_v$ can be a characteristic function only if $v\geqslant\frac16$.
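(Indeed, $\frac{\sinh t}t=1+\frac{t^2}6+O(t^4)$ near $t=0$, hence $$\log g_v(t)=\left(\tfrac16-v\right)t^2+O(t^4)$$ and if $g_v$ were a characteristic function, the coefficient of $t^2$ in this expansion would equal $-\tfrac12$ times the variance of the corresponding distribution, hence would be nonpositive, forcing $v\geqslant\tfrac16$.)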

There are 3 answers below.

Accepted answer

Consider some solution $Z=X+Y$, then the identity $$e^Z=e^X\cdot e^Y$$ involves only positive random variables, hence the independence of $X$ and $Y$ implies the identity, valid in $(0,+\infty]$, $$E(e^Z)=E(e^X)\cdot E(e^Y)$$ Now, $c=E(e^Z)$ is finite since $Z$ is normal, and $E(e^X)=+\infty$ because $e^xf_X(x)\to1$ when $x\to+\infty$, hence $x\mapsto e^xf_X(x)$ is not integrable on the real line. But the equation $$c=+\infty\cdot b$$ has no solution $b$ in $(0,+\infty]$, hence there is no random variable $Y$ independent of $X$ such that $Z=X+Y$ is normal.
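(Explicitly, the substitution $u=e^x$ yields $$E(e^X)=\int_{\mathbb R}\frac{e^{2x}}{(e^x+1)^2}\,dx=\int_0^\infty\frac{u}{(u+1)^2}\,du=+\infty$$ since the last integrand behaves like $1/u$ at infinity.)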

This approach really proves a more general result:

Consider two distributions $\mu$ and $\nu$ such that $\int_\mathbb Re^xd\mu(x)$ is infinite and $\int_\mathbb Re^xd\nu(x)$ is finite. Then if $P_X=\mu$, there exists no $Y$ independent of $X$ such that $P_{X+Y}=\nu$.

Second answer

There is no random variable $Y$ on $\mathbb{R}$, independent of $X$, for which $Z:=Y+X$ is normally distributed.

Notation

Capitalization denotes Fourier transformation; e.g., the Fourier transform of $f(t)$ is $$ F(s) := \int_{-\infty}^\infty e^{ist} f(t) \,dt $$

$A\propto_+ B$ means "$A$ is proportional to $B$ with a positive constant."

$f_X(t)$ is the PDF of the random variable $X$.

Unless denoted otherwise, $\int=\int_{-\infty}^\infty$.

The Mellin transform of $f(t)$ is denoted $\mathcal{M}[f(t)]$.

Proof

Suppose to the contrary that there exists a random variable $Y$, independent of $X$, for which $Z$ is normally distributed, that is, for which the convolution $f_Y * f_X=f_Z$ is a Gaussian PDF. Without loss of generality, we may assume that $Z$ has mean 0. Applying the convolution property of the Fourier transform ($f*g\mapsto F\cdot G$) implies that $$ F_Y(s) \cdot F_X(s) \propto_+ e^{- as^2} $$ for some $a>0$. We will show that no function $F_Y$ satisfying this equation can be the Fourier transform of a non-negative function.

$F_X$ can be computed explicitly from a standard table of Mellin transforms. We have $$ F_X (s)=\int \frac{e^{t(1+is)}}{(e^{t}+1)^2} \,dt $$ The change of variables $u=e^t$ gives $$ F_X(s) =\int_0^\infty \frac{u^{is}}{(u+1)^2}\, du = \mathcal{M}[(t+1)^{-2}](z) $$ where $z=1+is$. The table gives the identity $$ \mathcal{M}[(t+1)^{-2}](z)=\frac{\Gamma(2-z)\Gamma(z)}{\Gamma(2)} $$ Applying common identities for the Gamma function gives $$ \Gamma(2-z)\Gamma(z) = \frac{\Gamma(2-z)}{\Gamma(1-z)}\cdot \Gamma(1-z)\Gamma(z)=(1-z)\cdot \pi\csc\pi z $$ Putting it all together gives $$ F_X(s)= -i\pi s\cdot \csc(\pi(1+is)) = \frac{2\pi s}{e^{\pi s} - e^{-\pi s}} $$ Define $$ G(s):= 2\pi \frac{e^{-as^2}}{F_X(s)} = e^{-as^2} \left(\frac{e^{\pi s} - e^{-\pi s}}{s}\right) $$
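As a sanity check on this closed form (not part of the proof), one can compare the defining integral with $\pi s/\sinh(\pi s)$ numerically; the sketch below assumes NumPy and SciPy are available.

```python
import numpy as np
from scipy.integrate import quad

def F_X_numeric(s):
    # f_X(t) = e^t/(e^t+1)^2 = (1/4) sech^2(t/2) is even, so the imaginary
    # (sine) part of the Fourier integral vanishes and only cos(st) remains.
    integrand = lambda t: np.cos(s * t) * 0.25 / np.cosh(t / 2.0) ** 2
    value, _ = quad(integrand, -60.0, 60.0, limit=200)
    return value

def F_X_closed(s):
    return np.pi * s / np.sinh(np.pi * s) if s != 0.0 else 1.0

for s in (0.5, 1.0, 2.0, 3.0):
    print(f"s={s}: numeric={F_X_numeric(s):.10f}, closed={F_X_closed(s):.10f}")
```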

We have shown that $F_Y(s)\propto_+ G(s)$. Since $G$ is an even function, its inverse transform is $$ g(t) = \frac{1}{2\pi} \int G(s) \cos st\, ds $$ We will now show that $g(t)$ cannot be non-negative, the desired result.

Expanding $s^{-1}(e^{\pi s} - e^{-\pi s})$ in a Taylor series gives $$ g(t) = \sum_{n=0}^\infty \frac{\pi^{2n}}{(2n+1)!} \int e^{-as^2} s^{2n} \cos st \, ds $$ Let $H_n$ denote the $n$th Hermite polynomial, $$ H_n(t):=(-1)^ne^{t^2}\left(\frac{d}{dt}\right)^n e^{-t^2} $$ Equation (3.952.9) of ref. 1 gives $$ \int_0^\infty s^{2n}e^{-b^2s^2} \cos as \, ds = (-1)^n\frac{\sqrt{\pi}}{(2b)^{2n+1}}e^{-a^2/(4b^2)} H_{2n}\left(\frac{a}{2b}\right) $$ With $w:=\frac{\pi}{2\sqrt{a}}$ and $x:=\frac{t}{2\sqrt{a}}$ we obtain $$ g(t)\propto_+ h(w,x):=\sum_{n=0}^\infty \frac{w^{2n+1} (-1)^n}{(2n+1)!} e^{-x^2} H_{2n} (x) $$ Choose $y\in\mathbb{R}$ and define
$$ p(x):=h(w,x)e^{2xy-y^2}= \sum_{n=0}^\infty \frac{w^{2n+1} (-1)^n}{(2n+1)!} e^{-(x-y)^2} H_{2n} (x) $$ Note that $p(x)\in L_1$. This follows from the inequality $|H_{2n}(x)|\leq e^{x^2/2} 2^{2n+1} n!$, which is implied by (22.14.15) of ref. 2. Indeed, $$ I(w,y):=\int p(x)\,dx \leq \int\sum_{n=0}^\infty \frac{(2w)^{2n+1}}{(2n+1)!} e^{-(x-y)^2} n! e^{x^2/2}\, dx $$ which converges by the Ratio Test. This inequality also implies that we may apply Fubini's theorem to interchange the sum and integral, which gives
$$ I(w,y)=\sum_{n=0}^\infty \frac{w^{2n+1} (-1)^n}{(2n+1)!} \int e^{-(x-y)^2} H_{2n} (x)\, dx $$

Equation (7.374.6) of ref. 1 gives $$ \int e^{-(x-y)^2} H_n(x)\,dx = \sqrt{\pi} (2y)^n $$ which implies that
$$ I(w,y) = \frac{\sqrt{\pi}}{2y} \sum_{n=0}^\infty \frac{(2yw)^{2n+1}(-1)^n}{(2n+1)!}=\frac{\sqrt{\pi}}{2y} \sin 2yw $$ Since $\sin 2yw<0$ for suitable choices of $y$, we have $I(w,y)<0$ for such $y$, so $p$, and hence $x\mapsto h(w,x)$, cannot be non-negative; the same therefore holds for $g(t)$. This completes the proof.
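For readers who want to double-check the series manipulations numerically, here is an illustrative sketch (assuming NumPy/SciPy, with the arbitrary choices $a=1$, $y=1.5$, and 25 series terms): it truncates the series defining $h(w,x)$, forms $p(x)$, and compares $\int p(x)\,dx$ with $\frac{\sqrt\pi}{2y}\sin 2yw$; since $\sin 2yw<0$ for this $y$, both values come out negative.

```python
import numpy as np
from math import factorial
from scipy.integrate import quad
from scipy.special import eval_hermite  # physicists' Hermite polynomials H_n

def p(x, w, y, n_terms=25):
    # Truncated series for p(x) = h(w, x) * exp(2*x*y - y**2)
    #   = sum_n (-1)^n w^(2n+1)/(2n+1)! * exp(-(x-y)^2) * H_{2n}(x)
    total = 0.0
    for n in range(n_terms):
        coeff = (-1.0) ** n * w ** (2 * n + 1) / factorial(2 * n + 1)
        total += coeff * np.exp(-(x - y) ** 2) * eval_hermite(2 * n, x)
    return total

a = 1.0                            # the Gaussian parameter in e^{-a s^2}
w = np.pi / (2.0 * np.sqrt(a))
y = 1.5                            # chosen so that sin(2*y*w) < 0

I_numeric, _ = quad(lambda x: p(x, w, y), -15.0, 15.0, limit=200)
I_closed = np.sqrt(np.pi) / (2.0 * y) * np.sin(2.0 * y * w)
print(I_numeric, I_closed)         # both about -0.59: p, hence h(w, .), takes negative values
```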

References

  1. I. S. Gradshteyn and I. M. Ryzhik, Table of Integrals, Series, and Products, 4th ed., Academic Press, New York, 1965

  2. M. Abramowitz and I. A. Stegun, Handbook of Mathematical Functions, Dover Publications Inc., New York, 1972

Third answer

Considering what the main answer here had to go through in order to prove the impossibility, what follows should look suspiciously simple, but I am going to try it anyway, and I trust somebody will spot any gap or mistake. Using probabilities-as-distribution-functions, we have

$$F_Z(z) = P[Z\leq z] = P[X\leq z -Y]=\int_{-\infty}^{\infty}f_Y(y)\int_{-\infty}^{z-y}f_X(t)dtdy$$

$$\implies F_Z(z) = \mathbb E[F_X(z-Y)]$$

$F_X(x)$ is the CDF of the "standard" logistic distribution, $F_X(x) = (1+e^{-x})^{-1}$, while we want $Z$ to be a normal random variable (with zero mean, without loss of generality). So we want, using the usual notation,

$$\Phi(z/\sigma) = \mathbb E\left[\frac{1}{1+e^{-z+Y}}\right]= \mathbb E\left[\frac{e^z}{e^z+e^{Y}}\right]$$

Differentiating with respect to $z$ to obtain the density, we require

$$\frac 1{\sigma\sqrt{2\pi}}\cdot \exp\{-0.5z^2/\sigma^2\} = \mathbb E\left[\frac{e^ze^Y}{(e^z+e^{Y})^2}\right]$$

$$\implies \sigma\sqrt{2\pi}\cdot \exp\{0.5z^2/\sigma^2\}\cdot e^z\cdot \mathbb E\left[\frac{e^Y}{(e^z+e^{Y})^2}\right] =1$$

and this should hold for all $z \in \mathbb R$.
It looks like it cannot: as $z\to-\infty$, the expected value remains finite while the remaining factor $\sigma\sqrt{2\pi}\cdot \exp\{0.5z^2/\sigma^2\}\cdot e^z$ increases without bound. This appears to show that we cannot find an r.v. $Y$ such that, when added to an independent r.v. following the standard logistic distribution, it gives a normal random variable.
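To illustrate (not prove) the point, one can plug in a concrete, hypothetical candidate for $Y$, say $Y\sim N(0,1)$, take $\sigma^2=\operatorname{Var}(X)+\operatorname{Var}(Y)=\pi^2/3+1$ (the only variance compatible with $Z=X+Y$), and evaluate the left-hand side of the last display at several $z$; if the identity could hold it would be identically $1$, but it grows as $z$ decreases. The sketch below assumes NumPy and SciPy.

```python
import numpy as np
from scipy.integrate import quad

# Hypothetical candidate: Y ~ N(0, 1).  If Z = X + Y were normal, its variance
# would have to equal Var(X) + Var(Y) = pi^2/3 + 1, which fixes sigma below.
sigma = np.sqrt(np.pi ** 2 / 3.0 + 1.0)
phi = lambda y: np.exp(-0.5 * y ** 2) / np.sqrt(2.0 * np.pi)  # N(0, 1) density

def lhs(z):
    # sigma*sqrt(2*pi) * exp(z^2/(2*sigma^2)) * e^z * E[ e^Y / (e^z + e^Y)^2 ]
    expectation, _ = quad(lambda y: phi(y) * np.exp(y) / (np.exp(z) + np.exp(y)) ** 2,
                          -40.0, 40.0)
    return sigma * np.sqrt(2.0 * np.pi) * np.exp(0.5 * z ** 2 / sigma ** 2) * np.exp(z) * expectation

for z in (0.0, -4.0, -8.0, -12.0):
    print(f"z = {z}: lhs = {lhs(z):.3f}")  # would be identically 1.0 if Y worked; instead it blows up
```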