I have the following equality about some function $F$, where $n$ is any positive integer:
$$ \int_0^\infty [1 - F(x)]^n \mathrm{d}x = \frac{1}{n} \int_0^\infty [1 - F(x)] \mathrm{d}x $$
In this context, $F$ is defined for $x \geq 0$ and is the cdf of a random variable; in particular $0 \leq F(x) \leq 1$, $F(x) \to 1$ as $x \to \infty$, and (in my setting) $F$ is strictly increasing: $a < b \implies F(a) < F(b)$.
I think that the only $F$ that satisfies the above equality is the cdf of the exponential distribution, $F(x) = 1 - e^{-\lambda x}$ where $\lambda > 0$. I'm having a hard time showing this, though. Does anyone have any pointers?
EDIT: as pointed out in a comment, this is equivalent to showing that if $\int_0^\infty G(x)^n \mathrm{d}x = \frac{1}{n} \int_0^\infty G(x) \mathrm{d}x$ holds for every positive integer $n$, then $G(x)$ must be of the form $e^{-\lambda x}$.
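Not a proof, but a quick numerical sanity check that the exponential case does satisfy the identity (the rate $\lambda = 0.7$, the truncation point, and the grid size are arbitrary choices):

```python
import math

# For G(x) = exp(-lam*x) (i.e. F the exponential cdf), check numerically that
#   integral of G(x)^n over [0, inf) = (1/n) * integral of G(x) over [0, inf)
# for several n.  The integral is truncated at b = 50, where the tail of
# exp(-lam*x) is negligible.

def integral(f, a=0.0, b=50.0, steps=100_000):
    """Midpoint-rule approximation of the integral of f over [a, b]."""
    h = (b - a) / steps
    return sum(f(a + (i + 0.5) * h) for i in range(steps)) * h

lam = 0.7
G = lambda x: math.exp(-lam * x)

base = integral(G)                       # should be close to 1/lam
for n in (1, 2, 3, 5, 10):
    lhs = integral(lambda x: G(x) ** n)  # should be close to 1/(n*lam)
    assert abs(lhs - base / n) < 1e-6
print("identity verified numerically for the exponential cdf")
```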
Following Cameron Williams's line of thought.
Let $G(x)=1-F(x)$; the function $G:[0,\infty)\to[0,1]$ is decreasing, and (for the problem to be interesting) we may assume $\int_0^\infty G(x)dx<\infty$.
Define the probability measure $\gamma$ on $[0,\infty)$ by $\gamma(A)=\int_A G(x)dx/\int_0^\infty G(x)dx$. Then the condition reduces to $\mathbb E[ T^{n-1}] = 1/n$ for all $n$, where $T$ is the random variable $G(X)$ when $X$ is picked according to $\gamma$, so that $P(T\le t)=P(G(X)\le t)=\gamma(\{x:G(x)\le t\})$.
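As a Monte Carlo illustration of this reduction (not part of the argument): in the exponential case $G(x)=e^{-x}$ we have $\int_0^\infty G = 1$, so $\gamma$ is the standard exponential law, $X\sim\mathrm{Exp}(1)$, and $T=G(X)=e^{-X}$; the sample size and seed below are arbitrary choices.

```python
import math
import random

random.seed(0)

# With G(x) = exp(-x), the normalizing constant is 1, so gamma is the
# standard exponential distribution; draw X ~ Exp(1) and set T = exp(-X).
xs = [random.expovariate(1.0) for _ in range(200_000)]
ts = [math.exp(-x) for x in xs]

# E[T^(n-1)] should be close to 1/n for each n.
for n in (1, 2, 3, 5):
    moment = sum(t ** (n - 1) for t in ts) / len(ts)
    assert abs(moment - 1 / n) < 0.01
print("E[T^(n-1)] is close to 1/n, as predicted")
```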
On the one hand, by the theory of the Hausdorff Moment Problem, the distribution of $T$ must be uniform on $[0,1]$, that is, $P(T\le t)=t$ for $t\in[0,1]$: the uniform distribution has the right moments, since $\int_0^1 t^{n-1}dt=1/n$, and it is the only such distribution, since a distribution on a compact interval is determined by its moments.
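The moment computation behind this step, checked numerically (midpoint rule; the step count is an arbitrary choice): the uniform distribution on $[0,1]$ indeed satisfies $\mathbb E[U^{n-1}]=\int_0^1 t^{n-1}dt=1/n$.

```python
def uniform_moment(n, steps=100_000):
    """Midpoint-rule approximation of E[U^(n-1)] = integral of t^(n-1) on [0,1]."""
    h = 1.0 / steps
    return sum(((i + 0.5) * h) ** (n - 1) for i in range(steps)) * h

for n in (1, 2, 3, 5, 10):
    assert abs(uniform_moment(n) - 1 / n) < 1e-6
print("uniform moments match 1/n")
```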
On the other hand, writing $c=\int_0^\infty G(x)dx$, we can write $$ P(T\le t)=\frac{1}{c}\int_{\{x: G(x)\le t\}} G(x)dx \tag{*}.$$ Putting the two together we have $$t=\frac{1}{c}\int_{\{x: G(x)\le t\}} G(x)dx$$ (from which one can check that $G$ is continuous). Since $G$ is decreasing, the set $\{x: G(x)\le t\}$ is a half-line $[u,\infty)$ with $G(u)=t$, and hence $$ G(u)=\frac{1}{c}\int_u^\infty G(x)dx.$$ Differentiating gives the differential equation $G'(u)=-G(u)/c$, whose solutions are $G(u)=G(0)e^{-u/c}$; since $\sup T=1$ forces $G(0)=1$, this is exactly $F(x)=1-e^{-\lambda x}$ with $\lambda=1/c$.
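A numerical check of $(*)$ in the exponential case (the truncation point and grid are arbitrary choices): with $G(x)=e^{-x}$, the normalizing constant is $1$, the set $\{x: G(x)\le t\}$ is $[-\ln t,\infty)$, and the integral evaluates to $t$, so $T$ is indeed uniform on $[0,1]$.

```python
import math

# For G(x) = exp(-x), the set {x : G(x) <= t} is the half-line [-ln t, inf),
# and (*) predicts that the integral of exp(-x) over it equals t.

def tail_integral(u, b=60.0, steps=100_000):
    """Midpoint-rule approximation of the integral of exp(-x) over [u, b],
    which approximates the integral over [u, inf) since exp(-60) is negligible."""
    h = (b - u) / steps
    return sum(math.exp(-(u + (i + 0.5) * h)) for i in range(steps)) * h

for t in (0.1, 0.25, 0.5, 0.9):
    u = -math.log(t)  # boundary of {x : G(x) <= t}
    assert abs(tail_integral(u) - t) < 1e-6
print("P(T <= t) = t: T is uniform on [0,1]")
```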