Show that $\int_{\mathbb{R}^n} f_1 f_2 \cdots f_n \,\mathrm{d}x_1 \ldots \mathrm{d}x_n \le (I_1 \cdots I_n)^{1/(n-1)}$.


For $k = 1,2,\ldots,n$, let $\mathbb{R}_k = \mathbb{R}$ and let $f_k(x_1,\ldots,x_{k-1},x_{k+1},\ldots,x_n)$ be a nonnegative measurable function on $\mathbb{R}_1\times\cdots\times\mathbb{R}_{k-1}\times\mathbb{R}_{k+1}\times\cdots\times\mathbb{R}_n$. Let
$$ I_k = \int_{\mathbb{R}^{n-1}} f_k^{\,n-1}\,{\rm d}x_1\ldots{\rm d}x_{k-1}\,{\rm d}x_{k+1}\ldots{\rm d}x_n, \qquad k = 1,2,\ldots,n. $$
Show that
$$ \int_{\mathbb{R}^{n}} f_1\,f_2\cdots f_n\,{\rm d}x_1\ldots{\rm d}x_n \ \leq\ \left(I_1\cdots I_n\right)^{1/(n-1)}. $$

Also, show

  1. Let $V$ be a bounded closed domain in $\mathbb{R}^3$, and let $S_1$, $S_2$, $S_3$ be the areas of the projections of $V$ onto the three coordinate planes. Show that $m\left(V\right) \leq \left(S_1 S_2 S_3\right)^{1/2}$.

I am really not sure what the trick is. I tried using Hölder's inequality, but this gives weird powers in the wrong places. Obviously we need to use Fubini somewhere, and of course the case $n=2$ is obvious. I even tried the case $n=3$ to no avail, and I believe that case may be the key to the general method. Any help would be great. Thanks.


There are 2 best solutions below


Hölder alone may not be delicate enough to handle the inequality you describe. The Loomis-Whitney inequality would suffice, but then you have to prove Loomis-Whitney.

Here is an outline from a similar question I answered, "Why is Volume^2 at most product of the 3 projections?". It answers your question in $\mathbb{R}^3$.


Here is the proof taken from the question I linked. Partition your object into small cubes (this requires something like outer measure or Lebesgue measure), and let $(X_1, X_2, X_3)$ be the coordinates of a cube chosen uniformly at random:

$$ 2\log \mathrm{Vol}(P) = 2 H(X_1, X_2, X_3) \leq H(X_1, X_2) + H(X_2,X_3) + H(X_3, X_1) \leq \sum_{i=1}^3 \log \mathrm{Area}(P_i)$$
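As a sanity check (my own worked example, not from the linked answer), an axis-aligned box $P = [0,a]\times[0,b]\times[0,c]$ attains equality throughout:

```latex
2\log \mathrm{Vol}(P) = 2\log(abc)
  = \log(ab) + \log(bc) + \log(ca)
  = \sum_{i=1}^{3} \log \mathrm{Area}(P_i).
```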

In fact the entropy inequality is a special case of Han's inequality. Write $H(X_S) = H(X_s : s \in S)$. One finds that

$$ h(k) = \frac{1}{n-k}\binom{n}{k}^{-1}\sum_{|S|=n-k} H(X_S) $$

is a nondecreasing function of $k$. In particular, comparing $k=0$ and $k=1$ gives $2H(X_1,X_2,X_3) \leq H(X_1,X_2)+H(X_2,X_3)+H(X_3,X_1)$ when $n=3$.
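To make the monotonicity concrete, here is a small numerical check (my own sketch; the distribution and the helper names are illustrative) of Han's inequality, in the normalized form where the average of $H(X_S)/|S|$ over subsets $S$ of size $n-k$ is nondecreasing in $k$:

```python
import itertools
import math
import random

random.seed(0)

# A random joint distribution of n = 3 binary random variables.
n = 3
outcomes = list(itertools.product([0, 1], repeat=n))
weights = [random.random() for _ in outcomes]
total = sum(weights)
p = {o: w / total for o, w in zip(outcomes, weights)}

def subset_entropy(S):
    """Shannon entropy H(X_S) of the marginal on the coordinates in S."""
    marginal = {}
    for o, q in p.items():
        key = tuple(o[i] for i in S)
        marginal[key] = marginal.get(key, 0.0) + q
    return -sum(q * math.log(q) for q in marginal.values() if q > 0)

def h(k):
    """Average of H(X_S) / |S| over all subsets S with |S| = n - k."""
    m = n - k
    subs = list(itertools.combinations(range(n), m))
    return sum(subset_entropy(S) for S in subs) / (m * len(subs))

values = [h(k) for k in range(n)]  # k = 0, 1, 2
assert all(values[k] <= values[k + 1] + 1e-12 for k in range(n - 1))
print(values)
```

The comparison of $k=0$ with $k=1$ is exactly the inequality used for the volume bound above.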


For the uniform distribution on a set, the entropy of a marginal is at most the log of the number of values it can take, $H(X) \leq \log |\{ X(x): x \in P\} |$, with equality when the marginal is itself uniform.

The uniform distribution supported on a set $P$ has constant density $f(x) = \frac{1}{\mu(P)}$, where $\mu$ is Lebesgue measure. The formula for differential entropy then gives

$$ h(X) = - \int_P f \log f \,{\rm d}\mu = \log \mu(P) \int_P f \,{\rm d}\mu = \log \mu(P). $$
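As a minimal numerical illustration (my own snippet; the interval length $L$ and the step count are arbitrary), a Riemann sum of $-f \log f$ for the uniform density $f = 1/L$ on $[0, L]$ reproduces $\log L$:

```python
import math

# Uniform density f = 1/L on [0, L]; its differential entropy is log(L).
L = 2.5
N = 1000
dx = L / N
f = 1.0 / L

# Riemann sum of -f log f over the interval (exact here, since f is constant).
h = sum(-f * math.log(f) * dx for _ in range(N))
print(h, math.log(L))
```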


Or see the nice article "Hypergraphs, Entropy and Inequalities" by Ehud Friedgut.


A discrete variant, in which one considers a finite subset of $\mathbb{R}^3$, was asked at the IMO in 1992; see http://www.artofproblemsolving.com/Forum/viewtopic.php?t=60719. The IMO problem gives some hint of how to prove the inequality in any dimension, and the proofs can be extended to functions or measures.

For the inductive proof of the discrete problem, the key is to find the case of equality. Obviously, equality holds if the set is of the form $A\times B\times C$ with finite sets $A,B,C\subset\mathbb{R}$. For the induction step, you can split one of the sets into two smaller subsets, apply the induction hypothesis, and then apply Cauchy-Schwarz or Hölder to finish.

(I have seen this continuous form too, but could not find it.)
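As a quick brute-force check of the discrete variant (my own illustration; the set sizes and the helper name are arbitrary), one can test $|A|^2 \le |A_{xy}|\,|A_{yz}|\,|A_{zx}|$ on random finite subsets of $\mathbb{Z}^3$:

```python
import random

random.seed(1)

def projections_bound_holds(A):
    """Check |A|^2 <= |A_xy| * |A_yz| * |A_zx| for a finite set A in Z^3."""
    xy = {(x, y) for x, y, z in A}  # projection onto the xy-plane
    yz = {(y, z) for x, y, z in A}  # projection onto the yz-plane
    zx = {(z, x) for x, y, z in A}  # projection onto the zx-plane
    return len(A) ** 2 <= len(xy) * len(yz) * len(zx)

# Try many random finite sets of lattice points in a 4x4x4 cube.
ok = all(
    projections_bound_holds({
        (random.randrange(4), random.randrange(4), random.randrange(4))
        for _ in range(random.randrange(1, 30))
    })
    for _ in range(1000)
)
print(ok)  # True, by the IMO 1992 inequality
```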


An inductive solution using Hölder goes as follows. For $n=2$ the statement is trivial: $$ \int_{x_1} \int_{x_2} f_1(x_2)\, f_2(x_1) \,{\rm d}x_2\,{\rm d}x_1 = \left(\int_{x_2} f_1\right) \cdot \left(\int_{x_1} f_2\right) = I_1 I_2. $$

Now suppose that $n\ge 3$ and the statement holds true for $n-1$. Apply Hölder twice as \begin{gather*} \int_{x_1}\ldots\int_{x_n} f_1\ldots f_n = \int_{x_1}\ldots\int_{x_{n-1}} \left( f_n \int_{x_n} f_1\dots f_{n-1} \right) \le \\ \le \left(\int_{x_1}\ldots\int_{x_{n-1}} f_n^{n-1} \right)^{\frac1{n-1}} \left(\int_{x_1}\ldots\int_{x_{n-1}} \left(\int_{x_n} f_1\dots f_{n-1}\right)^{\frac{n-1}{n-2}}\right)^{\frac{n-2}{n-1}} \le \\ \le I_n^{\frac1{n-1}} \left(\int_{x_1}\ldots\int_{x_{n-1}} \prod_{i=1}^{n-1} \left(\int_{x_n} f_i^{n-1}\right)^{\frac1{n-2}}\right)^{\frac{n-2}{n-1}}. \end{gather*} Applying the induction hypothesis to the functions $F_i(x_1,\ldots,x_{i-1},x_{i+1},\ldots,x_{n-1})=\left(\int_{x_n}f_i^{n-1}(x_1,\ldots,x_{i-1},x_{i+1},\ldots,x_{n-1},x_n)\mathrm{d}x_n\right)^{\frac1{n-2}}$, we get \begin{gather*} \int_{x_1}\ldots\int_{x_{n-1}} \prod_{i=1}^{n-1} \left(\int_{x_n} f_i^{n-1}\right)^{\frac1{n-2}} = \int_{x_1}\ldots\int_{x_{n-1}} F_1\ldots F_{n-1} \le\\\le \prod_{i=1}^{n-1} \left( \int_{x_1}\ldots\int_{x_{i-1}}\int_{x_{i+1}}\ldots\int_{x_{n-1}} F_i^{n-2} \right)^{\frac1{n-2}} = \prod_{i=1}^{n-1} \left( \int_{x_1}\ldots\int_{x_{i-1}}\int_{x_{i+1}}\ldots\int_{x_n} f_i^{n-1} \right)^{\frac1{n-2}} =\\= (I_1\ldots I_{n-1})^{\frac1{n-2}}. \end{gather*} Hence, \begin{gather*} \int_{x_1}\ldots\int_{x_n} f_1\ldots f_n \le I_n^{\frac1{n-1}} \left(\int_{x_1}\ldots\int_{x_{n-1}} \prod_{i=1}^{n-1} \left(\int_{x_n} f_i^{n-1}\right)^{\frac1{n-2}}\right)^{\frac{n-2}{n-1}} \le (I_1\ldots I_n)^{\frac1{n-1}}. \end{gather*} Done.
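To see the final inequality in action (my own sketch; the grid size and the random sample functions are arbitrary), here is a discrete $n=3$ check: replacing integrals by sums over a finite grid, the analogue $\sum f_1 f_2 f_3 \le (I_1 I_2 I_3)^{1/2}$ holds exactly, so the comparison below always succeeds:

```python
import math
import random

random.seed(2)
N = 6  # grid size; each f_k is a nonnegative function of two variables

f1 = [[random.random() for _ in range(N)] for _ in range(N)]  # f1(y, z)
f2 = [[random.random() for _ in range(N)] for _ in range(N)]  # f2(x, z)
f3 = [[random.random() for _ in range(N)] for _ in range(N)]  # f3(x, y)

# Discrete analogue of the left-hand side: sum of f1 f2 f3 over the grid.
lhs = sum(
    f1[y][z] * f2[x][z] * f3[x][y]
    for x in range(N) for y in range(N) for z in range(N)
)

# Discrete analogues of I_k = integral of f_k^{n-1} = f_k^2.
I1 = sum(v * v for row in f1 for v in row)
I2 = sum(v * v for row in f2 for v in row)
I3 = sum(v * v for row in f3 for v in row)
rhs = math.sqrt(I1 * I2 * I3)

print(lhs <= rhs + 1e-9)  # True: the n = 3 case of the inequality
```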