Relation Between Subgradients of a Random Function and Its Expectation


Suppose $\mathcal{X}\subset\mathbb{R}^n$ is a convex set. Let $f:\mathcal{X}\times\mathbb{R}^m\to\mathbb{R}$ be a function such that for every $y\in\mathbb{R}^m$, the function $f(\cdot,y)$ is convex, that is, $$ f(\lambda x_1+(1-\lambda)x_2,y)\le \lambda f(x_1,y) + (1-\lambda)f(x_2,y),\quad\forall x_1,x_2\in\mathcal{X},\quad\forall \lambda\in[0,1]. $$ Then there exists $g:\mathcal{X}\times\mathbb{R}^m\to\mathbb{R}^n$ such that for all $(x,y)\in\mathcal{X}\times\mathbb{R}^m$, $g(x,y)$ is a subgradient of $f(\cdot,y)$ at $x$.

Now suppose $Y$ is a random variable taking values in $\mathbb{R}^m$. Then the function $F(x)=\mathbb{E}_Y f(x,Y)$ is convex (assuming the expectation is finite), so there exists $G:\mathcal{X}\to\mathbb{R}^n$ such that for all $x\in\mathcal{X}$, $G(x)$ is a subgradient of $F$ at $x$.

My question: Can we say anything about the relation between $\mathbb{E}_Yg(x,Y)$ and $G(x)$?

In the special case where both $f(\cdot,y)$ and $F$ are differentiable, as long as we can exchange differentiation and integration, we have $\mathbb{E}_Y\nabla_x f(x,Y)=\nabla F(x)$. But what if one of $f(\cdot,y)$ and $F$ is non-differentiable?
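As a quick numerical check of the differentiable case (an illustration of mine, with $f(x,y)=(x-y)^2/2$ chosen for simplicity), the sample mean of $\nabla_x f(x,Y)$ matches a finite-difference gradient of $F$:

```python
import numpy as np

rng = np.random.default_rng(1)
Y = rng.normal(loc=2.0, size=5000)   # samples of the random variable Y

x = 0.7
# E_Y[grad_x f(x, Y)] for f(x, y) = (x - y)^2 / 2, whose gradient is x - y
grad_f_mean = (x - Y).mean()

# Finite-difference gradient of F(x) = E_Y f(x, Y) (sample-mean surrogate)
F = lambda x: 0.5 * ((x - Y) ** 2).mean()
h = 1e-6
grad_F = (F(x + h) - F(x - h)) / (2 * h)

# Exchanging differentiation and expectation: the two gradients agree
assert abs(grad_f_mean - grad_F) < 1e-4
```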

1 Answer

Since, for every realization $y$ of $Y$, $$f(u,y) \geq f(x,y) + g(x,y)^T(u-x),$$ taking expectations and using linearity gives $$\mathbb{E}_Y[f(u,Y)] \geq \mathbb{E}_Y[f(x,Y)] + \mathbb{E}_Y[g(x,Y)]^T(u-x),$$ that is, $$F(u) \geq F(x) + \mathbb{E}_Y[g(x,Y)]^T(u-x).$$ So $\mathbb{E}_Y[g(x,Y)]$ is a subgradient of $F$ at $x$.
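A numerical sanity check of this inequality (my own illustration, not from the answer; the selector $g(x,y)=\operatorname{sign}(x-y)$ is a standard subgradient choice for $f(x,y)=|x-y|$):

```python
import numpy as np

rng = np.random.default_rng(0)
Y = rng.normal(size=1000)           # samples of the random variable Y

def f(x, y):                        # f(., y) = |x - y| is convex in x
    return np.abs(x - y)

def g(x, y):                        # sign(x - y) selects a subgradient of f(., y) at x
    return np.sign(x - y)

def F(x):                           # F(x) = E_Y f(x, Y), here a sample-mean surrogate
    return f(x, Y).mean()

x = 0.3
s = g(x, Y).mean()                  # E_Y[g(x, Y)], the candidate subgradient of F

# Check the subgradient inequality F(u) >= F(x) + s*(u - x) on a grid of u.
# It holds sample-by-sample, so it survives averaging (small float tolerance).
for u in np.linspace(-2.0, 2.0, 81):
    assert F(u) >= F(x) + s * (u - x) - 1e-12
```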

If the subgradient of $F$ at $x$ is unique (i.e., $F$ is differentiable at $x$), then it has to be the case that $\mathbb{E}_Y[g(x,Y)] = G(x) = \nabla F(x)$.

On the other hand, if the subgradient of $F$ at $x$ is not unique, then it is not clear what $G(x)$ is, i.e., which of the possible subgradients you are referring to as $G(x)$. In that case $\mathbb{E}_Y[g(x,Y)]$ is guaranteed to be one element of the subdifferential $\partial F(x)$, but it need not coincide with any particular choice of $G(x)$.
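To make the non-uniqueness concrete (a toy example of mine, not from the answer): take $Y\equiv 0$ almost surely and $f(x,y)=|x-y|$, so $F(x)=|x|$ and $\partial F(0)=[-1,1]$; every selection from that interval is an equally valid $G(0)$:

```python
# Toy example (hypothetical): Y = 0 a.s. and f(x, y) = |x - y|, so F(x) = |x|.
# At x0 = 0 the subdifferential of F is the whole interval [-1, 1], and every
# choice of subgradient selector gives a different, equally valid G(0).
def F(x):
    return abs(x)

x0 = 0.0
for s in (-1.0, -0.5, 0.0, 0.5, 1.0):   # candidate subgradients at x0
    # each s satisfies the subgradient inequality F(u) >= F(x0) + s*(u - x0)
    assert all(F(u) >= F(x0) + s * (u - x0) for u in (-2.0, -0.5, 0.5, 2.0))
```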