Here is the simplest statement of my question:
Let $Y$ be a centered real random variable and define $$\|Y\|_* = \sup \left\{ \mathbb{E}[X \cdot Y] ~:~ \forall t \in \mathbb{R} ~~ \mathbb{E}[e^{tX}] \le e^{t^2/2}\right\},$$ where the supremum is over real random variables $X$ that may depend on $Y$.
Is there a closed-form expression for $\|Y\|_*$? (Or a good closed-form approximation.)
Here is a more detailed statement of my question:
Define a norm on the space of random variables by $$\|X\| := \inf \left\{ \max\{|\mu|,|\sigma|\} : \mu,\sigma \in \mathbb{R},~~\forall t \in \mathbb{R} ~~~ \mathbb{E}\left[e^{tX}\right] \leq e^{t\mu+t^2\sigma^2/2} \right\}.$$
If $\|X\|$ is finite, then $X$ is said to be subgaussian. The norm is scaled to have the property $\|\mathcal{N}(\mu,\sigma^2)\|= \max\{|\mu|,|\sigma|\}$. By Hoeffding's lemma, we have $$\|X\| \leq \|X\|_\infty := \inf\{\tau:\mathbb{P}[|X|\leq\tau]=1\},$$ i.e., bounded random variables are also subgaussian.
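As a quick numerical sanity check on this scaling (a sketch I added; nothing below depends on it), one can verify Hoeffding's bound for a Rademacher variable $X$ uniform on $\{-1,+1\}$: here $\|X\|_\infty = 1$ and $\mathbb{E}[e^{tX}] = \cosh t \le e^{t^2/2}$, so $\|X\| \le 1 = \|X\|_\infty$ with $\mu = 0$, $\sigma = 1$:

```python
import math

# Rademacher variable X uniform on {-1, +1}: ||X||_inf = 1.
# Hoeffding's lemma predicts E[exp(tX)] = cosh(t) <= exp(t^2 / 2) for all t,
# i.e. X is subgaussian with mu = 0, sigma = 1, so ||X|| <= ||X||_inf = 1.
for i in range(-2000, 2001):
    t = i / 100.0                # grid of t values in [-20, 20]
    mgf = math.cosh(t)           # E[exp(tX)] for Rademacher X
    bound = math.exp(t * t / 2)  # subgaussian bound with sigma = 1
    assert mgf <= bound * (1 + 1e-12), (t, mgf, bound)
```

The inequality $\cosh t \le e^{t^2/2}$ is of course easy to prove directly by comparing Taylor series; the grid check is just a sanity test of the normalization.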
I'm interested in the dual norm, defined by $$\|Y\|_* := \sup \left\{ \mathbb{E}[X \cdot Y] : X \text{ is a random variable satisfying } \|X\| \leq 1 \right\}.$$ Of course, $X$ and $Y$ need not be independent in the above supremum.
Is there a simple expression for the dual norm $\|\cdot\|_*$? I would like to be able to calculate $\|\cdot\|_*$, and the definition above does not lend itself to direct computation. Even a good approximation to the dual norm would be helpful.
Below are various things I know or think about this question, which may be helpful for answering it.
My intuition is that $\|\cdot\|\approx\|\cdot\|_\infty$, as, in my experience, most properties of bounded random variables extend to subgaussian random variables. Since the $1$-norm is the dual of the $\infty$-norm, my intuition is that $\|\cdot\|_*\approx\|\cdot\|_1$.
This intuition can be made a bit more formal by looking at $p$-norms, as follows. It is easy to show that $$\|X\|_p := \mathbb{E}[|X|^p]^{1/p} \leq (\sqrt{p}+2) \cdot \|X\|$$ for all $p \in [1,\infty)$ and all subgaussian $X$. Thus, by Hölder's inequality, for all $p \in (1,\infty)$ and all random variables $X$ and $Y$, $$\mathbb{E}[X \cdot Y] \leq \|X\|_p \cdot \|Y\|_{1+\frac{1}{p-1}} \leq O(\sqrt{p}) \cdot \|X\| \cdot \|Y\|_{1+\frac{1}{p-1}}.$$ Hence $\|Y\|_* \leq O\left(\frac{1}{\sqrt{\varepsilon}}\right) \cdot \|Y\|_{1+\varepsilon}$ for all $\varepsilon > 0$. Since $\|X\| \leq \|X\|_\infty$, we also have $\|Y\|_* \geq \|Y\|_1$.
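For intuition on the moment bound $\|X\|_p \le (\sqrt{p}+2)\|X\|$, here is a small numerical check (my own sketch) on the standard Gaussian, for which $\|X\| = 1$ and the absolute moments are known in closed form: $\mathbb{E}[|X|^p] = 2^{p/2}\,\Gamma\!\left(\frac{p+1}{2}\right)/\sqrt{\pi}$.

```python
import math

def gaussian_p_norm(p):
    """E[|N(0,1)|^p]^(1/p), using E[|X|^p] = 2^(p/2) * Gamma((p+1)/2) / sqrt(pi).

    Computed via lgamma in log-space to avoid overflow for large p.
    """
    log_moment = (p / 2) * math.log(2) + math.lgamma((p + 1) / 2) - 0.5 * math.log(math.pi)
    return math.exp(log_moment / p)

# The standard Gaussian has ||X|| = 1, so the claim ||X||_p <= (sqrt(p) + 2) ||X||
# specializes to gaussian_p_norm(p) <= sqrt(p) + 2.
for p in [1, 2, 4, 8, 16, 64, 256, 1024]:
    assert gaussian_p_norm(p) <= math.sqrt(p) + 2
    # asymptotically gaussian_p_norm(p) ~ sqrt(p/e), comfortably below sqrt(p) + 2
```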
Here is an example that "breaks" this intuition, though only slightly, which is why I think the intuition is still essentially correct. Let $X$ be a standard Gaussian and $Y=\mathsf{sign}(X) \cdot e^{X^2/2}/(1+X^2)$. Then $\|X\|=1$, but $\|X\|_\infty = \infty$. And $\|Y\|_1 = \frac{1}{\sqrt{2\pi}} \int_{-\infty}^\infty \frac{1}{1+x^2} \mathrm{d}x = \sqrt{\frac{\pi}{2}}$, while $\|Y\|_* \geq \mathbb{E}[XY] = \frac{1}{\sqrt{2\pi}}\int_{-\infty}^\infty \frac{|x|}{1+x^2} \mathrm{d}x = \infty$. However, note that $\mathbb{E}[|Y|\log|Y|]=\infty$, so one only needs something "slightly larger" than the $1$-norm for this example. My intuition is that, in general, $\|\cdot\|_*$ is only slightly larger than $\|\cdot\|_1$.
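These two integrals can be checked numerically (a quadrature sketch I added). The first converges to $\sqrt{\pi/2} \approx 1.2533$, while the truncation of the second, $\frac{1}{\sqrt{2\pi}}\int_{-T}^T \frac{|x|}{1+x^2}\,\mathrm{d}x = \frac{\log(1+T^2)}{\sqrt{2\pi}}$, grows without bound in $T$:

```python
import math

def trapezoid(f, a, b, n):
    """Composite trapezoid rule for the integral of f over [a, b] with n subintervals."""
    h = (b - a) / n
    return h * (0.5 * (f(a) + f(b)) + sum(f(a + i * h) for i in range(1, n)))

SQRT_2PI = math.sqrt(2 * math.pi)

# E[|Y|]: the Gaussian density cancels exp(x^2/2), leaving 1 / ((1+x^2) sqrt(2 pi)).
e_abs_y = trapezoid(lambda x: 1 / ((1 + x * x) * SQRT_2PI), -2000, 2000, 400_000)
assert abs(e_abs_y - math.sqrt(math.pi / 2)) < 0.001  # tail beyond |x| = 2000 is tiny

# Truncated E[XY]: integrand |x| / ((1+x^2) sqrt(2 pi)); exact value log(1+T^2)/sqrt(2 pi).
for T in [10, 100, 1000]:
    trunc = trapezoid(lambda x: abs(x) / ((1 + x * x) * SQRT_2PI), -T, T, 100_000)
    exact = math.log(1 + T * T) / SQRT_2PI
    assert abs(trunc - exact) < 1e-3
    # grows like 2 log(T) / sqrt(2 pi): 1.84, 3.67, 5.51, ... diverging as T -> infinity
```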
My guess is that the answer is something asymptotically like $\|Y\|_* \overset{?}{=} \mathbb{E}\left[|Y|\sqrt{\log(1+|Y|)} \right]$. Has anyone seen a norm like this before? (I can show that, if $\mathbb{E}\left[|Y|\sqrt{\log(1+|Y|)} \right]=\infty$, then $\|Y\|_*=\infty$.)
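As a consistency check on this guess (my own numerical sketch), the candidate expression is indeed infinite for the example above: with $Y=\mathsf{sign}(X)\, e^{X^2/2}/(1+X^2)$, the integrand of $\mathbb{E}\left[|Y|\sqrt{\log(1+|Y|)}\right]$ behaves like $\frac{|x|}{\sqrt{2}(1+x^2)}$ for large $|x|$, so the truncated integrals grow by a roughly constant amount per decade of the truncation level:

```python
import math

def integrand(x):
    """Integrand of E[|Y| sqrt(log(1+|Y|))] for Y = sign(X) e^{X^2/2}/(1+X^2), X ~ N(0,1).

    |Y| times the Gaussian density equals 1 / ((1+x^2) sqrt(2 pi)); log(1+|Y|) is
    computed in log-space, since e^{x^2/2} overflows a float for |x| > ~38.
    """
    log_abs_y = x * x / 2 - math.log1p(x * x)
    if log_abs_y > 30:
        log1p_abs_y = log_abs_y  # 1 + |Y| ~= |Y| to machine precision
    else:
        log1p_abs_y = math.log1p(math.exp(log_abs_y))
    return math.sqrt(log1p_abs_y) / ((1 + x * x) * math.sqrt(2 * math.pi))

def trapezoid(f, a, b, n):
    h = (b - a) / n
    return h * (0.5 * (f(a) + f(b)) + sum(f(a + i * h) for i in range(1, n)))

# Truncated expectations over [-T, T] keep growing by roughly 2 log(10) / sqrt(4 pi)
# ~= 1.3 per decade of T, consistent with E[|Y| sqrt(log(1+|Y|))] = infinity.
values = [trapezoid(integrand, -T, T, 200_000) for T in [10, 100, 1000]]
assert values[0] < values[1] < values[2]
assert values[2] - values[1] > 1.0  # no flattening: the integral diverges
```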
I can prove the following upper bound on the dual norm. This is the strongest bound I have been able to prove so far. $$\|Y\|_* \leq \sqrt{2} \mathbb{E}[|Y|] + 4\sqrt{\mathbb{E}[|Y|] \cdot \left(\mathbb{E}[|Y|\log|Y|] - \mathbb{E}[|Y|]\log\mathbb{E}[|Y|]\right)}.$$ Note that by Jensen's inequality and the convexity of $x \mapsto x \log x$, we have $\mathbb{E}[|Y|\log|Y|] \geq \mathbb{E}[|Y|]\log\mathbb{E}[|Y|]$, so the right hand side of the above bound is well-defined and non-negative. Multiplying $Y$ by a constant $c$ multiplies the expression by $|c|$, so it is absolutely homogeneous. This makes the expression almost a norm, although I don't know whether it satisfies the triangle inequality.
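To see the bound in action (a numerical sketch I added, using the standard Gaussian as a test case): for $Y \sim \mathcal{N}(0,1)$ we have $\|Y\|_* \ge \mathbb{E}[Y \cdot Y] = 1$, since $X = Y$ itself satisfies $\|X\| \le 1$, while the bound evaluates to about $2.83$, so here it is loose by at most a small constant factor:

```python
import math

def trapezoid(f, a, b, n):
    h = (b - a) / n
    return h * (0.5 * (f(a) + f(b)) + sum(f(a + i * h) for i in range(1, n)))

def phi(x):
    """Standard Gaussian density."""
    return math.exp(-x * x / 2) / math.sqrt(2 * math.pi)

# E[|Y|] and E[|Y| log|Y|] for Y ~ N(0,1), by quadrature.
# (Closed forms: E[|Y|] = sqrt(2/pi); E[|Y| log|Y|] = (log 2 - gamma) / sqrt(2 pi).)
e_abs = trapezoid(lambda x: abs(x) * phi(x), -12, 12, 100_000)
e_abs_log = trapezoid(lambda x: abs(x) * math.log(abs(x)) * phi(x) if x != 0 else 0.0,
                      -12, 12, 100_000)

entropy_term = e_abs_log - e_abs * math.log(e_abs)  # >= 0 by Jensen
upper = math.sqrt(2) * e_abs + 4 * math.sqrt(e_abs * entropy_term)

assert entropy_term >= 0
assert 2.8 < upper < 2.86  # bound ~= 2.83, vs the easy lower bound ||Y||_* >= 1
```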
Furthermore, if $\mu=\mathbb{E}[Y]$, then $\|Y\|_* = |\mu| + \|Y-\mu\|_*$. This centering can also be combined with the above bound.
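For readers who want to verify the centering identity, here is the argument as I understand it (my own sketch, not spelled out above). For the upper bound: if $\|X\| \le 1$, witnessed by $(\mu_X, \sigma)$ with $\max\{|\mu_X|, |\sigma|\} \le 1$, then Jensen's inequality $\mathbb{E}[e^{tX}] \ge e^{t\mathbb{E}[X]}$ gives $t\,\mathbb{E}[X] \le t\mu_X + t^2\sigma^2/2$ for all $t$, which forces $\mathbb{E}[X] = \mu_X$, and then $\mathbb{E}[e^{t(X - \mathbb{E}[X])}] \le e^{t^2\sigma^2/2}$ gives $\|X - \mathbb{E}[X]\| \le \sigma \le 1$. Hence $$\mathbb{E}[XY] = \mathbb{E}\big[(X - \mathbb{E}[X])(Y - \mu)\big] + \mathbb{E}[X]\,\mu \le \|Y - \mu\|_* + |\mu|.$$ For the lower bound: when maximizing against the centered variable $Y - \mu$, we may assume $X$ is centered (recentering does not change $\mathbb{E}[X(Y-\mu)]$ and only shrinks the norm, by the above). Given such a near-optimal centered $X$ with $\|X\| \le 1$, the shifted variable $X' = X + \mathsf{sign}(\mu)$ satisfies $\mathbb{E}[e^{tX'}] \le e^{t\,\mathsf{sign}(\mu) + t^2/2}$, so $\|X'\| \le 1$, and $$\mathbb{E}[X'Y] = \mathbb{E}[X(Y - \mu)] + |\mu|,$$ giving the matching lower bound.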