Maximum entropy distribution with given mean, variance and skewness?
Asked by Bumbble Comm (https://math.techqa.club/user/bumbble-comm/detail) on 2026-03-26 · 135 views · 1 answer below

Is there a way to find the maximum entropy distribution with given values for the first three moments, when the support is the set of real numbers? Or, without loss of generality, with mean zero, unit variance and skewness $\gamma$? How can I do that?
When the support is $S=\mathbb{R}$, there is no maximum entropy distribution with mean $\mu\in\mathbb{R}$, variance $\sigma^2>0$ and skewness $\gamma\neq 0$. You can always find distributions satisfying those constraints, but their entropies can be made arbitrarily close to the entropy of $\mathcal{N}(\mu,\sigma^2)$ without ever reaching it, so the supremum is not attained and there is no maximum.
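One way to make this concrete is to exhibit a family of distributions with mean $0$, variance $1$ and skewness $\gamma$ whose entropy climbs toward the Gaussian bound $\tfrac12\ln(2\pi e)\approx 1.4189$ without reaching it. The sketch below (pure Python, all constants such as $\gamma=0.5$, the spike width $\varepsilon=0.1$ and the quadrature windows are illustrative assumptions) uses a two-component Gaussian mixture: a bulk near the origin plus a tiny spike of mass at distance $d$ that carries the third moment.

```python
import math

def gauss(x, m, s):
    # Gaussian density N(m, s^2) evaluated at x
    return math.exp(-0.5*((x - m)/s)**2) / (s*math.sqrt(2*math.pi))

def mix_pdf(x, p, mu0, s0, d, eps):
    """Bulk N(mu0, s0^2) with weight 1-p, plus a far spike N(d, eps^2) with weight p."""
    return (1 - p)*gauss(x, mu0, s0) + p*gauss(x, d, eps)

def solve_params(gamma, d, eps=0.1, iters=50):
    """Fixed-point iteration for (p, mu0, s0) enforcing mean 0, variance 1, skewness gamma."""
    p = gamma/d**3                      # initial guess: the spike carries the third moment
    for _ in range(iters):
        mu0 = -p*d/(1 - p)                                  # mean = 0
        s0sq = (1 - p*(eps**2 + d**2))/(1 - p) - mu0**2     # variance = 1
        m3_bulk = mu0**3 + 3*mu0*s0sq                       # E[X^3] of the bulk component
        p = (gamma - (1 - p)*m3_bulk)/(d**3 + 3*d*eps**2)   # third moment = gamma
    mu0 = -p*d/(1 - p)
    s0sq = (1 - p*(eps**2 + d**2))/(1 - p) - mu0**2
    return p, mu0, math.sqrt(s0sq)

def mix_entropy(p, mu0, s0, d, eps):
    """Differential entropy by midpoint quadrature over the two relevant windows."""
    H, n = 0.0, 20000
    for lo, hi in [(mu0 - 8*s0, mu0 + 8*s0), (d - 8*eps, d + 8*eps)]:
        h = (hi - lo)/n
        for i in range(n):
            fx = mix_pdf(lo + (i + 0.5)*h, p, mu0, s0, d, eps)
            H -= fx*math.log(fx)*h
    return H

H = {}
for d in (10.0, 30.0):
    p, mu0, s0 = solve_params(0.5, d)
    H[d] = mix_entropy(p, mu0, s0, d, 0.1)
    print(d, H[d])
# As d grows, the entropy rises toward, but stays below, 0.5*ln(2*pi*e) ~ 1.4189.
```

As $d\to\infty$ the spike's weight $p\sim\gamma/d^3$ vanishes, the bulk converges to $\mathcal{N}(0,1)$, and the entropy converges to the bound from below, which is exactly why no maximizer exists on $\mathbb{R}$.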
For $S=[-a,+a]$ with $a$ sufficiently large, there is a maximum entropy distribution $\mathcal{P}(a,\mu,\sigma^2,\gamma)$ with mean $\mu$, variance $\sigma^2$ and skewness $\gamma$, and its probability density function has the form $f(x) = Z^{-1}e^{\lambda_1 x + \lambda_2 x^2 + \lambda_3 x^3}$. Assuming $\gamma\neq 0$, this distribution has strictly smaller entropy than the maximum entropy distribution with the same mean and variance but zero skewness, which is $\mathcal{N}(\mu,\sigma^2)$. For $a - |\mu| \gg \sigma$, $\mathcal{P}(a,\mu,\sigma^2,\gamma)$ looks practically like $\mathcal{N}(\mu,\sigma^2)$ with a small amount of its probability mass pressed against one of the interval endpoints, and in the limit $a\to\infty$ it converges to $\mathcal{N}(\mu,\sigma^2)$, so its entropy approaches that of $\mathcal{N}(\mu,\sigma^2)$.
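A density of this exponential-cubic form is easy to explore numerically. The sketch below (pure Python; the multipliers $\lambda_i$ are illustrative assumptions rather than values solved for particular target moments, which would require a nonlinear root-find over $(\lambda_1,\lambda_2,\lambda_3)$) builds such a density on $[-a,a]$, computes its moments and entropy by midpoint quadrature, and checks it against the matched-variance Gaussian entropy bound.

```python
import math

# A density of the maximum-entropy form f(x) = Z^{-1} exp(l1*x + l2*x^2 + l3*x^3)
# on [-a, a]; the small cubic term induces a nonzero skewness.
a = 6.0
l1, l2, l3 = 0.0, -0.5, 0.02   # illustrative assumptions

n = 20000
h = 2*a/n
xs = [-a + (i + 0.5)*h for i in range(n)]      # midpoint rule
w = [math.exp(l1*x + l2*x*x + l3*x**3) for x in xs]
Z = sum(w)*h                                    # normalizing constant
f = [wi/Z for wi in w]

mean = sum(fi*x for fi, x in zip(f, xs))*h
var  = sum(fi*(x - mean)**2 for fi, x in zip(f, xs))*h
skew = sum(fi*(x - mean)**3 for fi, x in zip(f, xs))*h / var**1.5
H    = -sum(fi*math.log(fi) for fi in f)*h

# Among all densities with this variance, the Gaussian maximizes entropy,
# so this skewed density must land strictly below the bound.
H_gauss = 0.5*math.log(2*math.pi*math.e*var)
print(skew, H, H_gauss)
```

The positive cubic tilt produces a positive skewness, and the computed entropy falls strictly below $\tfrac12\ln(2\pi e\,\mathrm{var})$, as the argument above predicts.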
A probability distribution that achieves its skewness by taking a small amount of probability mass and moving it far from the center was not what I had in mind, though. (Bad distribution!) Even though the skewness of this distribution has the specified value, its higher-order moments diverge to $\infty$ or $-\infty$ as $a\to\infty$, so to get something more reasonable, some of the higher-order moments need to be constrained as well. Constraining the kurtosis (the fourth standardized moment) to a maximum value makes the problem solvable for $S=\mathbb{R}$: it introduces an $x^4$ term with a negative coefficient in the exponent, which keeps the probability density function normalizable and makes all higher-order moments well defined.
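With the kurtosis also constrained, the maximum entropy density on $\mathbb{R}$ takes the form $f(x) = Z^{-1}e^{\lambda_1 x + \lambda_2 x^2 + \lambda_3 x^3 + \lambda_4 x^4}$ with $\lambda_4 < 0$. A quick pure-Python check (the $\lambda$ values are illustrative assumptions, not solved for particular target moments) that such a density is normalizable, genuinely skewed, and has finite standardized moments:

```python
import math

# Illustrative (assumed) multipliers; lam4 < 0 makes the density integrable on all of R.
lam1, lam2, lam3, lam4 = 0.0, -0.5, 0.15, -0.05

def logw(x):
    # unnormalized log-density: lam1*x + lam2*x^2 + lam3*x^3 + lam4*x^4
    return lam1*x + lam2*x**2 + lam3*x**3 + lam4*x**4

# The negative quartic term dominates far out: at |x| = 10 the exponent is
# already below -300, so truncating the quadrature there loses essentially nothing.
a, n = 10.0, 20000
h = 2*a/n
xs = [-a + (i + 0.5)*h for i in range(n)]      # midpoint rule
w = [math.exp(logw(x)) for x in xs]
Z = sum(w)*h                                    # normalizing constant
f = [wi/Z for wi in w]

mu   = sum(fi*x for fi, x in zip(f, xs))*h
var  = sum(fi*(x - mu)**2 for fi, x in zip(f, xs))*h
skew = sum(fi*(x - mu)**3 for fi, x in zip(f, xs))*h / var**1.5
kurt = sum(fi*(x - mu)**4 for fi, x in zip(f, xs))*h / var**2
print(mu, var, skew, kurt)
```

Unlike the cubic-only case, nothing here depends on a finite support: the $\lambda_4 x^4$ term tames the tails by itself, so every standardized moment is finite. Matching prescribed values of the first four moments would again come down to a four-dimensional nonlinear root-find over the $\lambda_i$.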