Lower bound of the expectation of the squared difference of a function of random variables


Suppose we have a vector of centered random variables $X=(X_1,\dots,X_p)^T$ with a $p\times p$ covariance matrix $\Sigma$.

Let $\beta =(\beta_1,\dots, \beta_p)^T$ be a vector of coefficients and let $f:\mathbb{R}\to\mathbb{R}$ be an invertible (monotone) and smooth function with bounded derivatives.

Suppose that for every other coefficient vector $\hat{\beta} = (\hat{\beta}_1,\dots, \hat{\beta}_p)^T$ with $\hat{\beta}\neq \beta$ we have $$E((X'\beta - X'\hat{\beta})^2)>0.$$

May we then conclude that $$E((f(X'\beta) - f( X'\hat{\beta}))^2)>0$$ as well, whenever $\hat{\beta}\neq \beta$?


I think this should be the case: since $f$ is invertible, it should suffice to compare $X'\beta$ and $X'\hat{\beta}$ directly. However, I am missing an exact argument.

Note that, since $f$ has bounded derivatives and is therefore Lipschitz, there is a constant $M$ such that $$E((f(X'\beta) - f( X'\hat{\beta}))^2) \leq M\, E((X'\beta-X'\hat{\beta})^2);$$ however, this inequality points in the wrong direction.
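A quick Monte Carlo sanity check of that Lipschitz bound, using the hypothetical choices $f = \tanh$ (so $|f'|\leq 1$ and one may take $M=1$), a randomly generated full-rank $\Sigma$, and arbitrary $\beta \neq \hat{\beta}$:

```python
import numpy as np

rng = np.random.default_rng(0)

p = 3
# Hypothetical example: centered Gaussian X with a full-rank covariance matrix.
A = rng.normal(size=(p, p))
Sigma = A @ A.T
X = rng.multivariate_normal(np.zeros(p), Sigma, size=100_000)

beta = np.array([1.0, -2.0, 0.5])
beta_hat = np.array([0.5, -1.0, 1.0])

f = np.tanh  # invertible, smooth, |f'| <= 1, hence Lipschitz with M = 1

a = X @ beta      # samples of X'beta
b = X @ beta_hat  # samples of X'beta_hat

lhs = np.mean((f(a) - f(b)) ** 2)  # estimate of E((f(X'beta) - f(X'beta_hat))^2)
rhs = np.mean((a - b) ** 2)        # estimate of M * E((X'beta - X'beta_hat)^2), M = 1

# Because |tanh(x) - tanh(y)| <= |x - y| holds sample by sample,
# the bound holds for the averages as well.
print(lhs <= rhs)  # True
print(lhs > 0)     # True: the quantity in question is strictly positive here
```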

If it is of any help: writing $v = \beta - \hat{\beta}$, we have $$E((X'\beta - X'\hat{\beta})^2) = v^T \Sigma v,$$ so the condition $E((X'\beta - X'\hat{\beta})^2)>0$ for all $\hat{\beta}\neq\beta$ says that $v^T \Sigma v > 0$ for every $v\neq 0$; that is, $\Sigma = E(XX^T)$ is positive definite and in particular invertible.

I think I got my argument.

Suppose $$E((f(X'\beta) - f( X'\hat{\beta}))^2)=0.$$ Then $f(X'\beta)$ and $f( X'\hat{\beta})$ must agree almost surely, i.e. $P(f(X'\beta) = f( X'\hat{\beta}))=1$. Since $f$ is invertible (in particular injective), $f(x)=f(y)$ holds only when $x=y$, so $$P(f(X'\beta) = f( X'\hat{\beta}))=1 \quad\Longleftrightarrow\quad P(X'\beta = X'\hat{\beta})=1.$$ But $P(X'\beta = X'\hat{\beta})=1$ would give $E((X'\beta - X'\hat{\beta})^2)=0$, contradicting the assumption that this expectation is strictly positive whenever $\hat{\beta}\neq \beta$.
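A complementary quantitative remark, not needed for the proof above: when $f'$ is also bounded away from zero, say $|f'|\geq m > 0$, the mean value theorem gives $(f(a)-f(b))^2 \geq m^2 (a-b)^2$ pointwise, so the desired expectation is bounded below by $m^2\, E((X'\beta - X'\hat{\beta})^2) > 0$ directly. A minimal numeric illustration with the hypothetical choice $f(x) = x + \tfrac{1}{2}\sin x$ (strictly increasing, $f'(x) \in [1/2, 3/2]$):

```python
import numpy as np

rng = np.random.default_rng(1)

p = 3
# Hypothetical full-rank covariance and arbitrary beta != beta_hat.
A = rng.normal(size=(p, p))
Sigma = A @ A.T
X = rng.multivariate_normal(np.zeros(p), Sigma, size=100_000)

beta = np.array([1.0, -2.0, 0.5])
beta_hat = np.array([0.5, -1.0, 1.0])

# f(x) = x + 0.5*sin(x) is smooth and strictly increasing with f'(x) in [0.5, 1.5],
# so by the mean value theorem (f(a) - f(b))^2 >= 0.25 * (a - b)^2 pointwise.
f = lambda x: x + 0.5 * np.sin(x)
m = 0.5  # lower bound on f'

a = X @ beta
b = X @ beta_hat

lhs = np.mean((f(a) - f(b)) ** 2)          # E((f(X'beta) - f(X'beta_hat))^2)
rhs = m ** 2 * np.mean((a - b) ** 2)       # m^2 * E((X'beta - X'beta_hat)^2)

print(lhs >= rhs)  # True: the lower bound holds sample by sample
print(rhs > 0)     # True: so the expectation of interest is strictly positive
```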

Is this argument correct?