Why does the jackknife reduce bias?

Given a sample $\textbf{x} = (x_1, \ldots, x_n)$, define $\textbf{x}_{(-i)}$ as the sample with observation $x_i$ removed. That is, $$ \textbf{x}_{(-i)} = (x_1, \ldots, x_{i-1}, x_{i+1}, \ldots, x_n). $$ Now, given an estimator $T(\textbf{x})$ of a parameter $\tau(\theta)$ based on the sample $\textbf{x}$, Lehmann and Casella (in the 2nd edition of Theory of Point Estimation) define the jackknifed version of $T(\textbf{x})$ as $$ T_J(\textbf{x}) = nT(\textbf{x}) - \frac{n -1}{n}\sum_{i=1}^n T\left(\textbf{x}_{(-i)}\right). $$ They claim that if $E[T(\textbf{x})] = \tau(\theta) + O\left(\frac{1}{n}\right)$, then $E[T_J(\textbf{x})] = \tau(\theta) + O\left(\frac{1}{n^2}\right)$. Can someone explain why the jackknifed version of $T(\textbf{x})$ has lower asymptotic bias than the original estimator?
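For concreteness, the definition translates directly into code. Here is a minimal Python sketch; the `jackknife` helper and the variance example are my own illustration, not from Lehmann and Casella:

```python
import numpy as np

def jackknife(T, x):
    """Jackknife bias correction per the definition above:
    T_J(x) = n*T(x) - (n-1)/n * sum_i T(x_{(-i)})."""
    x = np.asarray(x)
    n = len(x)
    # Leave-one-out estimates T(x_{(-i)}) for i = 1, ..., n
    loo = [T(np.delete(x, i)) for i in range(n)]
    return n * T(x) - (n - 1) / n * sum(loo)

# Classic example: the plug-in variance (1/n) * sum (x_i - xbar)^2 has bias
# exactly -sigma^2/n, i.e. b_1 = -sigma^2 and b_i = 0 for i >= 2.
# Jackknifing it recovers the unbiased sample variance with the 1/(n-1) factor.
rng = np.random.default_rng(0)
x = rng.normal(size=20)
plug_in_var = lambda s: np.mean((s - np.mean(s)) ** 2)
print(jackknife(plug_in_var, x))  # equals...
print(np.var(x, ddof=1))          # ...the unbiased sample variance
```

In this example the entire bias sits in the $\frac 1n$ term, so the jackknife removes it completely rather than merely reducing it to $O\left(\frac{1}{n^2}\right)$.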
I figured this out, but it requires a constraint slightly stronger than the one in the original problem statement. In the original statement, they claim the result holds whenever $E[T(\textbf{x})] = \tau(\theta) + O\left(\frac{1}{n}\right)$. I was getting hung up on pathological cases like $$ E[T(\textbf{x})] = \tau(\theta) + \frac{1}{n} + \frac{1}{n^{1.5}}, $$ and I don't believe the result is actually true in such cases: there the same calculation as below gives $E[T_J(\textbf{x})] = \tau(\theta) + n^{-1/2} - (n-1)^{-1/2} = \tau(\theta) + O\left(n^{-3/2}\right)$, which improves on $O\left(\frac{1}{n}\right)$ but falls short of $O\left(\frac{1}{n^2}\right)$.

If you assume the slightly stronger constraint $$ E[T(\textbf{x})] = \tau(\theta) + \sum_{i=1}^\infty\frac{b_i}{n^i}, $$ the result becomes provable. Note that in many practical cases the bias does admit such a power-series expansion in $\frac 1n$, so this extra constraint does not dilute the power of the jackknife method in reducing bias.

Here's the proof. Since the $x_i$ are i.i.d., each $\textbf{x}_{(-j)}$ is a sample of size $n-1$, so $E\left[T\left(\textbf{x}_{(-j)}\right)\right] = \tau(\theta) + \sum_{i=1}^\infty \frac{b_i}{(n-1)^i}$ for every $j$, and the sum over $j$ collapses. Manipulating the series term by term (valid when it converges for all $n$ large enough):
$$ \begin{align} E[T_J(\textbf{x})] &= E\left[nT(\textbf{x}) - \frac{n -1}{n}\sum_{j=1}^n T\left(\textbf{x}_{(-j)}\right)\right]\\ &= nE[T(\textbf{x})] - \frac{n -1}{n}\sum_{j=1}^n E\left[T\left(\textbf{x}_{(-j)}\right)\right] \\ &= n\left(\tau(\theta) + \sum_{i=1}^\infty\frac{b_i}{n^i} \right) - \frac{n -1}{n}\sum_{j=1}^n \left(\tau(\theta) + \sum_{i=1}^\infty \frac{b_i}{(n-1)^i}\right) \\ &=n\tau(\theta) + \sum_{i=1}^\infty\frac{b_i}{n^{i-1}} - (n-1)\left(\tau(\theta) + \sum_{i=1}^\infty \frac{b_i}{(n-1)^i}\right) \\ &= \tau(\theta) + \sum_{i=1}^\infty \left(\frac{b_i}{n^{i-1}} - \frac{b_i}{(n-1)^{i-1}}\right) \\ &= \tau(\theta) + \left(b_1 - b_1\right) + \sum_{i=2}^\infty \left(\frac{b_i}{n^{i-1}} - \frac{b_i}{(n-1)^{i-1}}\right) \\ &= \tau(\theta)+ \sum_{i=2}^\infty b_i\frac{(n-1)^{i-1} - n^{i-1}}{n^{i-1}(n-1)^{i-1}} \end{align} $$
Letting $$ c_i = \frac{(n-1)^{i-1} - n^{i-1}}{n^{i-1}(n-1)^{i-1}}, $$ a binomial expansion of the numerator gives $(n-1)^{i-1} - n^{i-1} = -(i-1)n^{i-2} + O\left(n^{i-3}\right)$, while the denominator is of order $n^{2i-2}$, so $c_i = O\left(\frac{1}{n^i}\right)$. The dominant term is $i = 2$, which contributes $-\frac{b_2}{n(n-1)}$, so $$ E[T_J(\textbf{x})] = \tau(\theta) + O\left(\frac{1}{n^2}\right) $$
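If it helps, the cancellation of the $b_1$ term and the $O\left(\frac{1}{n^2}\right)$ remainder can be checked symbolically. Below is a short sympy sketch of my own, assuming the bias expansion is truncated after three terms; as in the proof, the sum over $j$ collapses to $(n-1)$ copies of the size-$(n-1)$ expectation:

```python
import sympy as sp

n = sp.symbols('n', positive=True)
tau, b1, b2, b3 = sp.symbols('tau b1 b2 b3', real=True)

# Truncated bias expansion: E[T] = tau + b1/m + b2/m^2 + b3/m^3 for sample size m
def ET(m):
    return tau + b1 / m + b2 / m**2 + b3 / m**3

# E[T_J] = n * E[T on n points] - (n-1) * E[T on n-1 points]
ETJ = sp.expand(n * ET(n) - (n - 1) * ET(n - 1))

# Expand the residual bias around n = infinity:
# b1 cancels, and the leading surviving term is -b2/n**2.
print(sp.series(sp.simplify(ETJ - tau), n, sp.oo, 4))
```

The printed series has no $\frac 1n$ term, matching the $b_1 - b_1$ cancellation above, and its leading term $-\frac{b_2}{n^2}$ is exactly the $i=2$ contribution $-\frac{b_2}{n(n-1)}$ to first order.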