There is a function $f : [0;1]^n \rightarrow \mathbb{R}$ that is Lipschitz and strictly decreasing in each variable. Is it possible to prove either of the following two statements?
There exists an $M > 0$ such that for all $a < b$ $$ \lambda (\{X \in [0;1]^n \mid f(X) \in [a;b] \}) < M(b-a), $$
where $\lambda$ denotes the Lebesgue measure.
or
If $X_1, \dots , X_n$ are independent uniformly distributed random variables on $[0;1]$, then the probability density function $f_d$ of the random variable $f (X_1, \dots , X_n)$ is bounded.
My thoughts so far:
I know that there cannot be any $c \in \mathbb{R}$ for which $P[f (X_1, \dots , X_n) = c] > 0$: since $f$ is strictly decreasing in each variable, the level set $\{X \mid f(X) = c\}$ meets every line parallel to a coordinate axis in at most one point, so by Fubini it has Lebesgue measure zero. This gives me the intuition that the statements are true.
However I do not know how to make the transition to the proof.
I think it is not true; here is a counterexample with $n = 1$.
Let $f(x)=(1-x)^2$, which is Lipschitz and strictly decreasing on $[0,1]$. Then, for $b \in [0,1]$:
$$\lambda(\{x \in [0,1] \mid f(x) \in [0,b]\}) = \lambda(\{x \in [0,1] \mid x \in [1-\sqrt b,\,1]\}) = \sqrt b$$
So, is there an $M$ such that $\sqrt b < M b$ for all $b \in (0,1]$? That would require $M > \frac{1}{\sqrt b}$, which is unbounded as $b \to 0^+$. No such $M$ exists.
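To see the blow-up numerically, here is a quick sketch (the helper `measure_preimage` is just an illustrative grid approximation of the Lebesgue measure, not part of the argument):

```python
def f(x):
    # The counterexample: Lipschitz and strictly decreasing on [0, 1]
    return (1.0 - x) ** 2

def measure_preimage(b, n=200_000):
    """Grid approximation of lambda({x in [0,1] : f(x) in [0, b]})."""
    return sum(1 for i in range(n) if f((i + 0.5) / n) <= b) / n

for b in [1e-1, 1e-2, 1e-3, 1e-4]:
    m = measure_preimage(b)
    # m is close to sqrt(b), so the ratio m / b ~ 1 / sqrt(b) grows
    # without bound, ruling out a uniform constant M with m <= M * b.
    print(f"b = {b:.0e}  measure = {m:.5f}  measure / b = {m / b:8.1f}")
```

The printed ratios grow roughly by a factor of $\sqrt{10}$ per row, matching $1/\sqrt b$; equivalently, the density of $f(X)$ for uniform $X$ is $\frac{1}{2\sqrt y}$, unbounded near $0$.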