For $k = 1,2,\ldots,n$, let $\mathbb{R}_k = \mathbb{R}$ and let $f_k(x_1,\ldots,x_{k-1},x_{k+1},\ldots,x_n)$ be a nonnegative measurable function on $\mathbb{R}_1\times\cdots\times \mathbb{R}_{k-1}\times\mathbb{R}_{k+1}\times\cdots\times\mathbb{R}_n$. Let
$$ I_k = \int_{\mathbb{R}^{n-1}} f_k^{\,n-1}\,{\rm d}x_1\cdots{\rm d}x_{k-1}\,{\rm d}x_{k+1}\cdots{\rm d}x_n, \qquad k = 1,2,\ldots,n. $$
Show that
$$ \int_{\mathbb{R}^{n}} f_1\,f_2\cdots f_n\,{\rm d}x_1\cdots{\rm d}x_n \;\leq\; \left(I_1\cdots I_n\right)^{1/(n-1)}. $$
Also, show the following:
- Let $V$ be a bounded closed domain in $\mathbb{R}^3$, and let $S_1, S_2, S_3$ be the areas of the projections of $V$ onto the three coordinate planes. Show that $m(V) \leq \left(S_1S_2S_3\right)^{1/2}$.
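For what it's worth, the second statement survives a quick discrete sanity check: model $V$ as a union of voxels and compare the voxel count with the projection areas. This is only a sketch, assuming NumPy; the grid size and occupancy pattern are arbitrary.

```python
import numpy as np

rng = np.random.default_rng(1)
# Model the solid V as a random union of voxels in an 8x8x8 grid
V = rng.random((8, 8, 8)) < 0.3  # boolean occupancy array

vol = int(V.sum())  # volume = number of occupied voxels
# Areas of the projections onto the three coordinate planes:
# a cell of a projection is occupied if any voxel above it is
S1 = int(V.any(axis=0).sum())
S2 = int(V.any(axis=1).sum())
S3 = int(V.any(axis=2).sum())

assert vol**2 <= S1 * S2 * S3  # m(V) <= sqrt(S1*S2*S3)
```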
I really am not sure what the trick is. I tried using Hölder's inequality, but that puts weird powers in the wrong places. Obviously we need to use Fubini somewhere, and the case $n=2$ is immediate. I even tried the case $n=3$ to no avail, though I believe that case may be the key to the general method. Any help would be great. Thanks.
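In the meantime, the target inequality can at least be checked numerically for $n = 3$ in the discrete setting, with sums in place of integrals (the same inequality holds there, by the same argument). A minimal sketch assuming NumPy, with arbitrary random grids:

```python
import numpy as np

rng = np.random.default_rng(0)
m = 8  # grid points per axis (arbitrary)

# For n = 3: f1 depends on (x2, x3), f2 on (x1, x3), f3 on (x1, x2)
f1, f2, f3 = (rng.random((m, m)) for _ in range(3))

# Left side: sum over the full grid of f1(x2,x3) * f2(x1,x3) * f3(x1,x2)
lhs = float(np.einsum('jk,ik,ij->', f1, f2, f3))

# I_k = sum of f_k^(n-1) = f_k^2 over its (n-1)-dimensional domain
I1, I2, I3 = (float(np.sum(f**2)) for f in (f1, f2, f3))
rhs = (I1 * I2 * I3) ** 0.5  # (I1 * I2 * I3)^(1/(n-1)) with n = 3

assert 0 < lhs <= rhs
```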
Hölder alone may not be delicate enough to handle the inequality you describe. The Loomis–Whitney inequality would suffice, but then you have to prove Loomis–Whitney.
Here is an outline from a similar question I answered, "Why is Volume^2 at most product of the 3 projections?"; it answers your question in $\mathbb{R}^3$.
Here is the proof taken from the question I linked. Partition your object $P$ into small cubes (this requires something like outer measure or Lebesgue measure), pick one of the cubes uniformly at random, and let $X_i$ be its $i$-th coordinate:
$$ 2\log \mathrm{Vol}(P) = 2 H(X_1, X_2, X_3) \leq H(X_1, X_2) + H(X_2,X_3) + H(X_3, X_1) \leq \sum_{i=1}^3 \log \mathrm{Area}(P_i)$$
where $P_i$ denotes the projection of $P$ onto the $i$-th coordinate plane. (The last step is an inequality, not an equality: each pair marginal is supported on the cells of a projection but need not be uniform there.)
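The middle step is Shearer's inequality, which holds for an arbitrary joint distribution, not just the uniform one. It can be checked numerically on a random pmf; a sketch assuming NumPy, with an arbitrary alphabet size:

```python
import numpy as np

rng = np.random.default_rng(2)
p = rng.random((4, 4, 4))
p /= p.sum()  # random joint pmf of (X1, X2, X3)

def H(q):
    """Shannon entropy of a pmf given as an array."""
    q = q[q > 0]
    return float(-(q * np.log(q)).sum())

lhs = 2 * H(p)
# Pairwise marginals: sum out the remaining coordinate
rhs = H(p.sum(axis=2)) + H(p.sum(axis=0)) + H(p.sum(axis=1))
assert lhs <= rhs + 1e-12
```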
In fact the entropy inequality is more general (this is Han's inequality). For $S \subseteq \{1,\ldots,n\}$, write $h(X_S) = h(X_s : s \in S)$. One finds that
$$ h(k,n) = \frac{1}{n-k} \binom{n}{k}^{-1} \sum_{|S|=n-k} h(X_S)$$
is an increasing function of $k$. In particular, comparing $k=0$ with $k=1$ (for $n=3$) recovers the inequality above.
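Han's monotonicity claim can likewise be checked numerically for a random joint pmf. A sketch assuming NumPy; $n$ and the alphabet size are arbitrary:

```python
import itertools
from math import comb

import numpy as np

rng = np.random.default_rng(3)
n = 4
p = rng.random((3,) * n)
p /= p.sum()  # random joint pmf of (X_1, ..., X_n)

def H(q):
    """Shannon entropy of a pmf given as an array."""
    q = q[q > 0]
    return float(-(q * np.log(q)).sum())

def marginal(S):
    """Marginal pmf of the coordinates in S."""
    drop = tuple(i for i in range(n) if i not in S)
    return p.sum(axis=drop)

def h(k):
    # average of H(X_S) / (n - k) over the subsets S with |S| = n - k
    subsets = itertools.combinations(range(n), n - k)
    return sum(H(marginal(S)) for S in subsets) / ((n - k) * comb(n, k))

vals = [h(k) for k in range(n)]  # k = n would divide by zero; stop at n - 1
assert all(vals[i] <= vals[i + 1] + 1e-12 for i in range(n - 1))
```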
For the uniform distribution on a set, the entropy of a single random variable is at most the log of the number of its possible values, $H(X) \leq \log |\{ X(x): x \in P\} |$, with equality when $X$ itself is uniform on that set.
The uniform distribution supported on a set $P$ has constant density $f(x) = \frac{1}{\mu(P)}$, where $\mu$ is Lebesgue measure. The formula for differential entropy then gives
$$ h(X) = - \int_P f \log f \, {\rm d}\mu = - \log \frac{1}{\mu(P)} \int_P f \, {\rm d}\mu = \log \mu(P). $$
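This computation is easy to confirm numerically: for $X$ uniform on an interval of length $L$, the differential entropy should come out to $\log L$. A sketch assuming NumPy; the length $L$ is arbitrary:

```python
import numpy as np

L = 2.5            # X ~ Uniform[0, L]; any length works
N = 1_000_000      # number of Riemann cells
dx = L / N
density = np.full(N, 1.0 / L)  # constant density 1/L on [0, L]
# h(X) = -integral of f * log(f), approximated by a Riemann sum
h = -float(np.sum(density * np.log(density))) * dx
assert abs(h - np.log(L)) < 1e-9
```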
Or see the nice article "Hypergraphs, Entropy and Inequalities" by Ehud Friedgut.