I'm working on a problem involving two linear unbiased estimators $T$ and $T'$ of a parameter $\theta$, built from an i.i.d. sample $\{X_1, \dots, X_n\}$ in which each $X_i$ has mean $\theta$ and finite variance $\sigma^2$. I aim to prove that $\text{Cov}_\theta(T, T') = \text{Var}_\theta(T)$, given that:
- $T = \sum_{i=1}^{n} \alpha_i X_i$, and $T' = \sum_{i=1}^{n} \beta_i X_i$ is any other linear unbiased estimator of $\theta$.
- Both $T$ and $T'$ are unbiased ($\mathbb{E}[T] = \mathbb{E}[T'] = \theta$), which forces $\sum_{i=1}^{n} \alpha_i = \sum_{i=1}^{n} \beta_i = 1$.
- $T$ has minimum variance among all such estimators.
Through my derivations, I reached $\text{Cov}_\theta(T, T') = \sigma^2 \sum_{i=1}^{n} \alpha_i \beta_i$ and $\text{Var}_\theta(T) = \sigma^2 \sum_{i=1}^{n} \alpha_i^2$. From these, the claim $\text{Cov}_\theta(T, T') = \text{Var}_\theta(T)$ reduces to showing $\sum_{i=1}^{n} \alpha_i \beta_i = \sum_{i=1}^{n} \alpha_i^2$, but I don't see how to derive this from unbiasedness alone, and I'm honestly not even sure it holds in general. I would need to prove it in order to conclude my proof.
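As a sanity check on those two closed forms (not on the equality I'm stuck on), I ran a quick Monte Carlo simulation; the values of $n$, $\theta$, $\sigma$ and the weight vectors below are illustrative choices of mine, with each weight vector summing to 1 so both estimators are unbiased:

```python
import numpy as np

rng = np.random.default_rng(0)

# Illustrative setup: n = 4 i.i.d. normal draws with mean theta, variance sigma^2.
n, theta, sigma = 4, 2.0, 1.5
reps = 200_000
X = rng.normal(theta, sigma, size=(reps, n))

# Two arbitrary linear unbiased weight vectors (each sums to 1).
alpha = np.array([0.1, 0.2, 0.3, 0.4])
beta = np.array([0.4, 0.3, 0.2, 0.1])

T = X @ alpha    # realizations of T
Tp = X @ beta    # realizations of T'

# Empirical covariance and variance vs. the derived closed forms.
cov_emp = np.cov(T, Tp)[0, 1]
var_emp = T.var(ddof=1)
print(cov_emp, sigma**2 * (alpha @ beta))    # both close to sigma^2 * sum(a_i b_i)
print(var_emp, sigma**2 * (alpha @ alpha))   # both close to sigma^2 * sum(a_i^2)
```

Both empirical moments match the formulas to within Monte Carlo error, so I believe the derivation up to this point is correct.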
I'm unsure whether what I derived is universally valid or whether I've overlooked critical assumptions about the relationship between $T$ and $T'$. Here's a summary of my steps:
- Established the unbiasedness of $T$ and $T'$.
- Calculated $\text{Cov}_\theta(T, T')$ using the expectation of their product minus the product of their expectations.
- Arrived at the conclusion based on the expressions for covariance and variance.
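For concreteness, the second step expands as follows, assuming the $X_i$ are uncorrelated with common variance $\sigma^2$ (so $\mathbb{E}[X_i X_j] = \theta^2$ for $i \neq j$ and $\mathbb{E}[X_i^2] = \sigma^2 + \theta^2$) and using $\sum_i \alpha_i = \sum_i \beta_i = 1$ from unbiasedness:

$$
\begin{aligned}
\text{Cov}_\theta(T, T') &= \mathbb{E}[TT'] - \theta^2
= \sum_{i,j} \alpha_i \beta_j\, \mathbb{E}[X_i X_j] - \theta^2 \\
&= \sum_i \alpha_i \beta_i (\sigma^2 + \theta^2) + \sum_{i \neq j} \alpha_i \beta_j\, \theta^2 - \theta^2 \\
&= \sigma^2 \sum_i \alpha_i \beta_i + \theta^2 \Big(\sum_i \alpha_i\Big)\Big(\sum_j \beta_j\Big) - \theta^2
= \sigma^2 \sum_{i=1}^{n} \alpha_i \beta_i .
\end{aligned}
$$

The same expansion with $\beta_i = \alpha_i$ gives $\text{Var}_\theta(T) = \sigma^2 \sum_{i=1}^{n} \alpha_i^2$.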
Is there a flaw in my reasoning, or are there specific conditions under which this relationship holds true? I'm particularly interested in understanding the assumptions required for $\text{Cov}_\theta(T, T') = \text{Var}_\theta(T)$ to be valid. Any insights or references to similar proofs would be greatly appreciated!