Define the function $g(X) := \inf_{a \in \mathbb{R}} \{a + E[\max(X-a, 0)]\}$, where $X$ is a random variable with finite mean and variance and $\|\cdot\|_2$ denotes the norm on $\mathcal{L}^2$. It is already established that $g$ is convex in $X$.
I am hoping to show that if $(X_k)$ is a sequence of random variables with $g(X_k) \leq 0$ for each $k$ and \begin{align} \|X_k - X\|_2 \rightarrow 0, \end{align} then \begin{align} g(X) \leq 0. \end{align}
I assume that a proof would involve Hölder's inequality, or would use the fact that $E[|X_k - X|] \rightarrow 0$ together with Jensen's inequality and the convexity of $g$, but I can't seem to find the correct application.
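For what it's worth, the closest I have come is the following estimate. Since $x \mapsto \max(x-a, 0)$ is $1$-Lipschitz, for every fixed $a \in \mathbb{R}$ and every $k$,
\begin{align}
a + E[\max(X-a, 0)] &\leq a + E[\max(X_k-a, 0)] + E[|X - X_k|] \\
&\leq a + E[\max(X_k-a, 0)] + \|X - X_k\|_2,
\end{align}
where the second step is Jensen's inequality (equivalently, Cauchy–Schwarz with the constant function $1$). Taking the infimum over $a$ on both sides would then give
\begin{align}
g(X) \leq g(X_k) + \|X - X_k\|_2 \leq \|X - X_k\|_2,
\end{align}
and letting $k \rightarrow \infty$ would seem to finish the argument, but I am not certain this reasoning is correct.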