How do I find the variance of an estimator?

If the estimator is simply the sample mean $s=\frac{\sum{x}}{n}$ taken from a binomial distribution (a random example), how would I calculate its variance? I am trying to use the difference of expectations, $V(s) = E(s^2) - [E(s)]^2,$ but I am not sure what the expectation of the sum would be.
If $X_1, X_2, \dots, X_n$ is a random sample from a population with mean $\mu$ and variance $\sigma^2,$ let $T = \sum_{i=1}^n X_i.$
Then
$$E(T) = E\left(\sum_{i=1}^n X_i\right) = \sum_{i=1}^n E(X_i) = \sum_{i=1}^n \mu = n\mu.$$
Also, the elements of a random sample are independent, so we have
$$V(T) = V\left(\sum_{i=1}^n X_i\right) = \sum_{i=1}^n V(X_i) = \sum_{i=1}^n \sigma^2 = n\sigma^2.$$
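If you want a numerical sanity check of $E(T) = n\mu$ and $V(T) = n\sigma^2,$ here is a minimal simulation sketch in Python/NumPy. The Binomial$(m, p)$ population and the values of $m,$ $p,$ $n,$ and the replication count are illustrative assumptions, not anything fixed by the question:

```python
import numpy as np

# Simulate many samples of size n from a Binomial(m, p) population and
# compare the empirical mean/variance of T = sum(X_i) with n*mu and n*sigma^2.
# All parameter values here are illustrative choices.
rng = np.random.default_rng(0)
m, p, n, reps = 10, 0.3, 25, 200_000

samples = rng.binomial(m, p, size=(reps, n))  # reps independent samples of size n
T = samples.sum(axis=1)                       # one value of T per sample

mu, sigma2 = m * p, m * p * (1 - p)           # Binomial(m, p): mu = mp, sigma^2 = mp(1-p)
print(T.mean(), n * mu)        # both close to 75.0
print(T.var(), n * sigma2)     # both close to 52.5
```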
Also, $\bar X = \frac{1}{n}\sum_{i=1}^n X_i = \frac{1}{n}T,$ so that
$$E(\bar X) = E\left(\frac{1}{n}T\right) = \frac{1}{n}E(T) = \frac{1}{n}n\mu = \mu.$$
Thus, the expected value of the sample mean $\bar X$ is the population mean $\mu.$ (We say that $\bar X$ is an unbiased estimator of $\mu$.)
Moreover,
$$V(\bar X) = V\left(\frac{1}{n}T\right) = \left(\frac{1}{n}\right)^2 V(T) = \frac{1}{n^2}\,n\sigma^2 = \frac{\sigma^2}{n}.$$
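The same kind of check works for $\bar X$ itself; again a sketch under the same illustrative binomial assumptions:

```python
import numpy as np

# Empirical check that E(Xbar) = mu and V(Xbar) = sigma^2 / n.
# Same illustrative Binomial(m, p) population as above.
rng = np.random.default_rng(1)
m, p, n, reps = 10, 0.3, 25, 200_000

xbar = rng.binomial(m, p, size=(reps, n)).mean(axis=1)

mu, sigma2 = m * p, m * p * (1 - p)
print(xbar.mean(), mu)           # both close to 3.0
print(xbar.var(), sigma2 / n)    # both close to 0.084
```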
Notes: (1) In the first displayed equation, the expected value of a sum of random variables is the sum of the expected values, whether or not the random variables are independent.
(2) However, the variance of a sum of random variables is not, in general, the sum of the variances; it is when the random variables are independent (uncorrelated is enough).
[As a trivial case, if $X_i = X$ for every $i$ with $n \ge 2,$ then the $X_i$ are not independent and we have $V\left(\sum_{i=1}^n X_i\right) = V(nX) = n^2V(X) \ne nV(X).$ As another example, if $X_1 = -X_2$ with $V(X_1) = V(X_2) > 0,$ then $V(X_1 + X_2) = V(0) = 0 \ne V(X_1) + V(X_2).$]
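Both bracketed counterexamples are easy to see by simulation as well (same illustrative binomial population; this is only a sketch of the dependence effect):

```python
import numpy as np

# Dependent variables: the variance of a sum need not be the sum of variances.
rng = np.random.default_rng(2)
x = rng.binomial(10, 0.3, size=200_000)   # illustrative Binomial(10, 0.3) draws

# Case 1: X_1 = X_2 = X, so V(X_1 + X_2) = V(2X) = 4 V(X), not 2 V(X).
print(np.var(2 * x), 4 * np.var(x), 2 * np.var(x))

# Case 2: X_1 = -X_2, so V(X_1 + X_2) = V(0) = 0, not V(X_1) + V(X_2).
print(np.var(x + (-x)))                   # exactly 0
```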
(3) For the standard deviation of the mean of a random sample, we can take square roots to get $SD(\bar X) = \sigma/\sqrt{n}.$ (Sometimes this is called the 'standard error' of $\bar X$.)
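Varying $n$ shows the $1/\sqrt{n}$ shrinkage of this standard error directly (again a sketch with illustrative values):

```python
import numpy as np

# SD(Xbar) = sigma / sqrt(n): the standard error shrinks like 1/sqrt(n).
rng = np.random.default_rng(3)
m, p = 10, 0.3                            # illustrative Binomial(m, p) population
sigma = np.sqrt(m * p * (1 - p))

for n in (5, 20, 80):
    xbar = rng.binomial(m, p, size=(100_000, n)).mean(axis=1)
    print(n, xbar.std().round(4), (sigma / np.sqrt(n)).round(4))
```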