Gaussian distribution determined by first two moments

2.2k Views. Asked on 2026-04-28 by Bumbble Comm (https://math.techqa.club/user/bumbble-comm/detail)

It is often said that the Gaussian distribution is determined by its mean and variance. How is that different from other distributions? Almost every distribution I can think of has this property. For example, if we know the mean of an exponential or a Poisson distribution, then we know the whole distribution.
There is 1 answer below.
I agree with the comment by @eigenchris for 'well-known' distributions encountered early on in a probability course. However, one does not have to venture too far into the study of probability distributions to find examples in which knowing the population mean and variance does not easily specify the distribution.
It is useful to make a distinction between the parameters of the distribution of a random variable $X$ and other quantities such as $\mu = E(X),$ $\sigma^2 = V(X),$ and $E(X^2)$ (sometimes called 'moments'), which in some sense might be said to 'determine' the distribution. Here are a few examples:
UNIFORM: If $X \sim Unif(\alpha_1, \alpha_2)$, then the endpoints $\alpha_1$ and $\alpha_2$ of the support interval are usually taken as the parameters of the distribution. However, if specified, the mean $\mu = E(X) = (\alpha_1 + \alpha_2)/2$ and variance $\sigma^2 = V(X) = (\alpha_2 - \alpha_1)^2/12$ could be used to find the parameters $\alpha_1$ and $\alpha_2$ in terms of $\mu$ and $\sigma^2.$
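Concretely, solving those two equations gives $\alpha_1 = \mu - \sqrt{3}\,\sigma$ and $\alpha_2 = \mu + \sqrt{3}\,\sigma.$ A minimal Python sketch of this inversion (the function name is my own, not a standard API):

```python
import math

def uniform_params(mu, var):
    """Recover the Unif(a1, a2) endpoints from the mean and variance:
    the interval width is sqrt(12*var) and its midpoint is mu."""
    half_width = math.sqrt(3.0 * var)  # sqrt(12*var) / 2
    return mu - half_width, mu + half_width

# Unif(0, 1) has mu = 1/2 and var = 1/12:
a1, a2 = uniform_params(0.5, 1/12)  # recovers (0.0, 1.0)
```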
GAMMA: If $X \sim Gamma(\alpha, \theta)$, then $\alpha$ is the shape parameter and $\theta$ is the scale parameter. Again, if $\mu = \alpha\theta$ and $\sigma^2 = \alpha\theta^2$ are known, then we could easily solve to find the parameters $\alpha$ and $\theta$ in terms of $\mu$ and $\sigma^2.$
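Here the inversion is a one-liner: dividing the two moment equations gives $\theta = \sigma^2/\mu,$ and then $\alpha = \mu^2/\sigma^2.$ A short sketch (function name is mine):

```python
def gamma_params(mu, var):
    """Solve mu = alpha*theta and var = alpha*theta**2 for (alpha, theta)."""
    theta = var / mu         # ratio of the two moment equations
    alpha = mu ** 2 / var    # equivalently mu / theta
    return alpha, theta

# Gamma(3, 2) has mu = 6 and var = 12:
alpha, theta = gamma_params(6.0, 12.0)  # recovers (3.0, 2.0)
```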
In both uniform and gamma distributions, it is usually more natural or intuitive to think in terms of the parameters, even though they are straightforwardly determined by $\mu$ and $\sigma^2.$
In some other families of distributions, the relationship between moments $\mu$ and $\sigma^2$ and the more natural parameters is not expressed so transparently.
BETA. This family of distributions has two parameters $\alpha$ and $\beta.$ These distributions have support $(0, 1)$. Very roughly speaking, $\alpha$ controls the 'shape' of the distribution near $0$ and $\beta$ controls the shape near $1$. Here $\mu = \alpha/(\alpha + \beta)$ and $\sigma^2 = \frac{\alpha\beta}{(\alpha + \beta)^2(\alpha + \beta + 1)}.$ It is possible, but not really easy or intuitive, to use the moments to determine the parameters.
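The method-of-moments inversion does exist in closed form, though it is less obvious: writing $s = \alpha + \beta,$ the variance formula collapses to $\sigma^2 = \mu(1-\mu)/(s+1),$ so $s = \mu(1-\mu)/\sigma^2 - 1,$ and then $\alpha = \mu s,$ $\beta = (1-\mu)s.$ A sketch under that algebra (function name is mine):

```python
def beta_params(mu, var):
    """Invert the beta moments: with s = alpha + beta, the variance
    formula reduces to var = mu*(1 - mu)/(s + 1)."""
    s = mu * (1.0 - mu) / var - 1.0  # alpha + beta
    return mu * s, (1.0 - mu) * s

# Beta(2, 3) has mu = 2/5 and var = 1/25:
a, b = beta_params(0.4, 0.04)  # recovers (2.0, 3.0)
```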
WEIBULL. This family of distributions has a shape parameter $\kappa$ and a scale parameter $\lambda.$ It is often used in reliability theory and economics. Here $\mu = \lambda \Gamma(1 + 1/\kappa),$ where $\Gamma$ is the gamma function; $\sigma^2$ is expressed in terms of a somewhat more complex formula involving two $\Gamma$ functions and the parameters (see Wikipedia).
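There is no closed-form inversion here: the squared coefficient of variation $\sigma^2/\mu^2 = \Gamma(1 + 2/\kappa)/\Gamma(1 + 1/\kappa)^2 - 1$ depends only on $\kappa$ and is strictly decreasing in it, so $\kappa$ must be found numerically, after which $\lambda = \mu/\Gamma(1 + 1/\kappa).$ A sketch using only the standard library (the bisection bracket $[0.05, 50]$ is my own choice, not canonical):

```python
import math

def weibull_params(mu, var, lo=0.05, hi=50.0):
    """Numerically invert mu = lam*Gamma(1 + 1/k) and
    var = lam**2 * (Gamma(1 + 2/k) - Gamma(1 + 1/k)**2).
    The squared coefficient of variation var/mu**2 depends only on k
    and is decreasing in k, so bisection applies."""
    target = var / mu ** 2

    def cv2(k):
        g1 = math.gamma(1.0 + 1.0 / k)
        return math.gamma(1.0 + 2.0 / k) / g1 ** 2 - 1.0

    for _ in range(200):  # bisect until the bracket is negligible
        mid = 0.5 * (lo + hi)
        if cv2(mid) > target:
            lo = mid  # cv2 too large => shape must be larger
        else:
            hi = mid
    k = 0.5 * (lo + hi)
    lam = mu / math.gamma(1.0 + 1.0 / k)
    return k, lam

# Weibull with k = 1 is exponential with scale lam, so mu = lam, var = lam**2:
k, lam = weibull_params(2.0, 4.0)  # recovers roughly (1.0, 2.0)
```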
In applications of these last two families of distributions, it is much more natural to think in terms of the parameters than in terms of the moments, even though numerical methods can be used to find the parameters if specific values of the moments are given.