Maximum Entropy Distribution When Mean and Variance are Not Fixed with Positive Support

Asked by Bumbble Comm (https://math.techqa.club/user/bumbble-comm/detail)

I know that when the mean and variance of $\ln x$ are both fixed, the maximum entropy probability distribution (MEPD) is the lognormal. When the mean of a random variable supported on $[0,\infty)$ is fixed, the MEPD is the exponential distribution. My question is: what is the MEPD in the continuous case, with support on $[0, \infty)$, when neither the mean nor the variance is fixed?

1 Answer
In the discrete case, where normalization is the single constraint, you need to consider the functional
$$H[p]=-\sum_{i=1}^n p_i \ln(p_i)+\lambda\left(\sum_{i=1}^n p_i-1\right).$$
Setting $\frac{\partial H[p]}{\partial p_i}=0$ for all $i=1,\dots,n$ we arrive at
$$-\ln(p_i)-1+\lambda=0\Leftrightarrow p_i=e^{\lambda-1}.$$
Imposing $\sum_{i=1}^n p_i-1=0$, one gets
$\lambda=1-\ln(n)$, hence $p_i=e^{1-\ln(n)-1}=\frac{1}{n}$.
In summary, the desired distribution is the uniform probability distribution.
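As a quick numerical sanity check, here is a minimal sketch that maximizes the discrete entropy under the normalization constraint alone; the choice $n=5$ and the use of `scipy.optimize` are illustrative assumptions, not part of the derivation above.

```python
# Sketch: numerically maximize H(p) = -sum_i p_i ln(p_i) subject to sum_i p_i = 1.
import numpy as np
from scipy.optimize import minimize

n = 5  # arbitrary illustrative choice

def neg_entropy(p):
    # Negative Shannon entropy; minimizing it maximizes the entropy.
    return float(np.sum(p * np.log(p)))

constraints = [{"type": "eq", "fun": lambda p: np.sum(p) - 1.0}]
bounds = [(1e-9, 1.0)] * n            # keep p_i > 0 so ln(p_i) is defined
p0 = np.random.dirichlet(np.ones(n))  # random feasible starting point

res = minimize(neg_entropy, p0, bounds=bounds, constraints=constraints)
print(res.x)                # ~ [0.2, 0.2, 0.2, 0.2, 0.2]: the uniform distribution
print(-res.fun, np.log(n))  # maximum entropy agrees with ln(n)
```

The optimizer lands on $p_i=1/n$ with entropy $\ln(n)$, matching the Lagrange-multiplier calculation.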
The continuous case needs more care, due to the nontrivial integration range. We want to maximize the functional
$$H[p]=-\int_{0}^{\infty}p(x)\ln(p(x))\,dx+\lambda\left(\int_{0}^{\infty}p(x)\,dx-1\right), $$
where $p$ has support $[0,\infty)$ and $p(0)=p(\infty)=0$. We apply the calculus of variations by considering any variation $\phi$ such that $\phi(0)=\phi(\infty)=0$, so that $p+\epsilon\phi$ satisfies the same boundary conditions. We compute the variation
$$\left.\frac{\delta H}{\delta\phi}\right|_{p}=\lim_{\epsilon\rightarrow 0} \frac{H[p+\epsilon\phi]-H[p]}{\epsilon}=\lim_{\epsilon\rightarrow 0}\frac{1}{\epsilon}\left[\int_{0}^{\infty}\left(F(p+\epsilon\phi,x)-F(p,x)\right)dx+\lambda\int_{0}^{\infty}\epsilon\phi\,dx\right],$$
where $F(p,x)=-p(x)\ln(p(x))$ and $F(p+\epsilon\phi,x)=-(p(x)+\epsilon\phi)\ln(p(x)+\epsilon\phi)$.
Using
$$F(p+\epsilon\phi,x)-F(p,x)=\epsilon\phi\frac{\partial F}{\partial p}(p,x)+O(\epsilon^2)$$
we have
$$\left.\frac{\delta H}{\delta\phi}\right|_{p}=\int_{0}^{\infty}\left(\frac{\partial F}{\partial p}(p,x)+\lambda\right)\phi\,dx,$$
where $\frac{\partial F}{\partial p}(p,x)=-\ln(p(x))-1$. Since the variation must vanish for every admissible $\phi$, the fundamental lemma of the calculus of variations gives
$$-\ln(p(x))-1+\lambda=0, $$
i.e. $p(x)=e^{\lambda-1}$, a constant. But then $\int_0^{\infty}e^{\lambda-1}\,dx=1$ is impossible: a positive constant has infinite integral over $[0,\infty)$. Hence no maximum entropy distribution exists under the normalization constraint alone.
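One way to see the failure concretely is to truncate the support to $[0,L]$: there the normalization-only maximizer is the uniform density $1/L$, whose differential entropy $\ln(L)$ grows without bound as $L\to\infty$, so no maximizer survives on all of $[0,\infty)$. A small numerical sketch (the grid of $L$ values is arbitrary):

```python
# Sketch: on a truncated support [0, L] the normalization-only maximizer is the
# uniform density p(x) = 1/L, with differential entropy -∫ p ln(p) dx = ln(L),
# which diverges as L grows.
import numpy as np

for L in [1.0, 10.0, 100.0, 1000.0]:
    p = 1.0 / L             # uniform density on [0, L]
    H = -p * np.log(p) * L  # equals ln(L)
    print(f"L = {L:7.1f}   entropy = {H:8.4f}   ln(L) = {np.log(L):8.4f}")
```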
Roughly speaking, the absence of additional constraints, such as the fixed-mean constraint
$$\int_{0}^{\infty} xp(x)\,dx=\mu,$$
does not allow one to arrive at "more interesting" equations for $p(x)$. Note that $F$ does not depend on $p'(x)$: this leads to the simplified Euler–Lagrange equation
$$\frac{\partial F}{\partial p}(p,x)+\lambda=0.$$
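For completeness, here is a sketch of how the fixed-mean constraint changes the picture (a standard computation, not part of the original answer). Adding a second multiplier $\beta$ for the mean constraint, the stationarity condition becomes
$$-\ln(p(x))-1+\lambda+\beta x=0 \quad\Longrightarrow\quad p(x)=e^{\lambda-1}e^{\beta x}.$$
Integrability on $[0,\infty)$ forces $\beta<0$, and imposing $\int_{0}^{\infty}p(x)\,dx=1$ together with $\int_{0}^{\infty}xp(x)\,dx=\mu$ yields $\beta=-1/\mu$ and
$$p(x)=\frac{1}{\mu}\,e^{-x/\mu},$$
the exponential distribution mentioned in the question.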