Mean and variance of a probability distribution

I am confused about the mean and variance of statistical data versus those of a probability distribution. Are they different from each other? In its simple form, the mean is given as the sum of each value times its frequency, divided by the total frequency: $\bar{x} = \sum_i f_i x_i / \sum_i f_i$. But for a probability distribution, the mean is computed in a seemingly different way, and there is no concept of frequency at all. If both means represent the same thing, can we conclude that the statistical variable is replaced by a random variable and its frequency is substituted by the probability of occurrence of the random variable?
I am not sure whether this answers your question exactly, but here is an explanation.
There are two different things here. First, for a random variable $X$ taking values in a set $\chi$ and following a distribution $P$, you can compute $\mathbb{E}[X]$, the expected value of $X$, given by the following equations according to whether $P$ is discrete (1) or continuous (2): $$ \mathbb{E}[X] = \sum_{x \in \chi} x\,P(X=x) \quad (1), $$
$$ \mathbb{E}[X] = \int_{\chi} x\,p(x)\,dx \quad (2), $$ where $p$ is the density function of $X$ when $P$ is continuous. This value represents what you would observe on average if you made many observations of $X$: for $n \in \mathbb{N}$ sufficiently large, if $X_1, \ldots, X_n$ are observations of $X$, then $\sum_{i=1}^n X_i/n \approx \mathbb{E}[X]$.
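To make equations (1) and (2) concrete, here is a minimal Python sketch (my own illustration, not part of the original answer; the fair die, the Uniform(0, 1) density, and the use of `scipy.integrate.quad` are all assumed choices):

```python
# Illustrative sketch of equations (1) and (2); the fair die and
# Uniform(0, 1) examples are arbitrary choices, not from the answer.
from scipy.integrate import quad

# (1) Discrete case: a fair six-sided die, P(X = x) = 1/6 for x in {1, ..., 6}.
values = [1, 2, 3, 4, 5, 6]
probs = [1 / 6] * 6
mean_discrete = sum(x * p for x, p in zip(values, probs))
print(mean_discrete)  # 3.5 -- the probability-weighted sum from (1)

# (2) Continuous case: X ~ Uniform(0, 1), with density p(x) = 1 on [0, 1].
mean_continuous, _abs_err = quad(lambda x: x * 1.0, 0.0, 1.0)
print(mean_continuous)  # 0.5 -- the integral from (2)
```

Note how the probabilities in the discrete case play exactly the role that relative frequencies play in the statistical mean you describe.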
From this approximation, it seems natural to define the estimator $\hat{\mathbb{E}[X]}$ of $\mathbb{E}[X]$ as follows: $$ \hat{\mathbb{E}[X]} = \frac{1}{n}\sum_{i=1}^n X_i. $$ This is exactly your frequency-based mean, since $\frac{1}{n}\sum_{i=1}^n X_i = \sum_x x \cdot \frac{\#\{i : X_i = x\}}{n}$: the probability $P(X=x)$ plays the role of the relative frequency of the value $x$. The same correspondence holds for the variance, $\operatorname{Var}(X) = \mathbb{E}\big[(X - \mathbb{E}[X])^2\big]$, which is estimated by the sample variance.
You can have a look at the Wikipedia page on the mean, under the section "Mean of a probability distribution": https://en.wikipedia.org/wiki/Mean#Mean_of_a_probability_distribution.
Another reason to be confident that $\hat{\mathbb{E}[X]}$ is a good estimator of the mean is the law of large numbers; see https://en.wikipedia.org/wiki/Law_of_large_numbers.
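As a rough illustration of that convergence, here is a small simulation sketch (again my addition; the fair-die example, the seed, and the sample sizes are arbitrary choices): the sample mean drifts toward $\mathbb{E}[X] = 3.5$ as $n$ grows.

```python
# Law-of-large-numbers sketch: the sample mean of fair-die rolls
# approaches the true mean E[X] = 3.5 as the sample size grows.
import random

random.seed(0)  # fixed seed so the run is reproducible
true_mean = 3.5  # from equation (1) for a fair die

for n in [10, 100, 10_000, 1_000_000]:
    rolls = [random.randint(1, 6) for _ in range(n)]
    estimate = sum(rolls) / n  # the estimator \hat{E[X]} from above
    print(f"n = {n:>9}: sample mean = {estimate:.4f} (true mean = {true_mean})")
```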