Use Jensen's inequality to show $\underset {\theta}{\operatorname {max}} \mathbb E[L(\theta, \delta(X))]\ge ...$


Consider a binary random vector $X \in \{0, 1\}^n$ and the most general model for such a vector, $\Omega = \{ \theta=(\theta_x)_{x \in\{0,1\}^n} \mid \theta_x\ge0 \ \forall x \in \{0, 1\}^n,\ \sum_x \theta_x =1\}$. Each $\theta \in \Omega$ can be viewed as a vector in $\mathbb R^{2^n}$ with nonnegative entries summing to 1. Consider the loss function $L(\theta, a)=\max_x|\theta_x - a_x|=\Vert \theta-a \Vert_\infty$.

Show that for any decision rule $\delta:\{0,1\}^n \rightarrow\Omega$,

$$\underset {\theta}{\operatorname {max}}\, \mathbb E[L(\theta, \delta(X))]\ge \underset {\theta'}{\operatorname{min}}\, \underset {\theta}{\operatorname{max}}\,\Vert\theta-\theta'\Vert_\infty$$

Hint: use Jensen's inequality and note that $z \mapsto \max_i|z_i|$ is convex.
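Spelling out the hint: convexity of $z\mapsto\max_i|z_i|=\Vert z\Vert_\infty$ is just the triangle inequality for the sup norm, and it is preserved under the affine shift $z\mapsto\theta-z$:

$$\Vert\theta-(\lambda z+(1-\lambda)w)\Vert_\infty = \Vert\lambda(\theta-z)+(1-\lambda)(\theta-w)\Vert_\infty \le \lambda\Vert\theta-z\Vert_\infty+(1-\lambda)\Vert\theta-w\Vert_\infty.$$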

Current work: Jensen's inequality states that $f(\mathbb EX)\le \mathbb Ef(X)$ for convex $f$.

$\displaystyle \max_{\theta\in\Omega}EL(\theta,\delta(X))=\max_{\theta\in\Omega}E\max_{x\in\{0,1\}^n}|\theta_x-\delta(X)_x|\ge\max_{\theta\in\Omega}\max_{x\in\{0,1\}^n}|\theta_x-E\delta(X)_x|\ge\min_{\theta'\in\Omega}\max_{\theta\in\Omega}\max_{x\in\{0,1\}^n}|\theta_x-\theta'_x|=\min_{\theta'\in\Omega}\max_{\theta\in\Omega}\Vert \theta-\theta'\Vert_\infty$
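As a minimal numerical sanity check of the first (Jensen) inequality, here is the case $n = 1$, so $X \in \{0, 1\}$ and $\Omega$ is the set of probability vectors in $\mathbb R^2$. The particular $\theta$ and decision rule $\delta$ below are arbitrary illustrative choices, not part of the problem:

```python
import numpy as np

# n = 1: X takes values in {0, 1}; Omega = probability vectors in R^2.
# theta is the true parameter; delta maps each observation x to a point
# of Omega. Both are arbitrary illustrative choices.
theta = np.array([0.3, 0.7])
delta = {0: np.array([0.1, 0.9]), 1: np.array([0.6, 0.4])}

# Under this model, theta_x is directly P(X = x).
p = theta

# Left side of the Jensen step: E[ ||theta - delta(X)||_inf ]
lhs = sum(p[x] * np.max(np.abs(theta - delta[x])) for x in (0, 1))

# Right side: ||theta - E[delta(X)]||_inf
e_delta = sum(p[x] * delta[x] for x in (0, 1))
rhs = np.max(np.abs(theta - e_delta))

# Jensen: E f(Z) >= f(E Z) for convex f(z) = ||theta - z||_inf
assert lhs >= rhs
print(lhs, rhs)
```

Note that $E_\theta[\delta(X)]$ lies in $\Omega$ because $\Omega$ is convex, which is what justifies the second inequality (bounding by the minimum over $\theta' \in \Omega$).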

Are these steps valid? I'm mainly unsure about the first inequality, which is where Jensen's inequality is applied.