Orlicz norms: equivalence and interpretation


I am studying Orlicz norms, and the only definition I know so far is the one in Vershynin's book "High-Dimensional Probability". I would like to understand whether another definition of the Orlicz norm is equivalent to it, or at least related in some sense. Specifically:

Def. A function $\psi: [0, +\infty) \rightarrow [0, +\infty)$ is called an Orlicz function if $\psi$ is convex, increasing, and satisfies $\psi(0)=0$ and $\psi(x) \rightarrow +\infty$ as $x \rightarrow +\infty$.

Then, Vershynin defines the Orlicz norm as:

Def. Given an Orlicz function $\psi$, the Orlicz norm of a random variable $X$ is defined as $\| X \|_{\psi} = \inf \left\{t>0 : \mathbb{E}\, \psi\left(\frac{|X|}{t}\right) \leq 1\right\}$.

This definition is clear to me, as is the way it covers subgaussian, subexponential, etc., random variables.
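For concreteness, the infimum in Vershynin's definition can be approximated numerically by bisection over $t$ on a sample of $X$. The sketch below is my own illustration (the function `orlicz_norm` and its tolerances are not from the book); it uses the subgaussian Orlicz function $\psi_2(x)=e^{x^2}-1$:

```python
import numpy as np

rng = np.random.default_rng(0)


def orlicz_norm(samples, psi, tol=1e-4):
    """Approximate ||X||_psi = inf{t > 0 : E psi(|X|/t) <= 1} by
    bisection, using the sample mean as a stand-in for the expectation.
    The map t -> E psi(|X|/t) is decreasing, so bisection applies."""
    lo, hi = tol, 1.0
    # grow hi until the constraint E psi(|X|/hi) <= 1 holds
    while np.mean(psi(np.abs(samples) / hi)) > 1:
        hi *= 2
    # shrink the bracket [lo, hi] around the infimum
    while hi - lo > tol:
        mid = (lo + hi) / 2
        if np.mean(psi(np.abs(samples) / mid)) <= 1:
            hi = mid
        else:
            lo = mid
    return hi


# psi_2(x) = exp(x^2) - 1, the subgaussian Orlicz function
psi2 = lambda x: np.expm1(x ** 2)

# Rademacher variable: |X| = 1 a.s., so ||X||_{psi_2} = 1/sqrt(ln 2) ~ 1.2011
x = rng.choice([-1.0, 1.0], size=1000)
print(orlicz_norm(x, psi2))
```

Since $|X|=1$ almost surely here, the constraint reduces to $\psi_2(1/t)\leq 1$, i.e. $t \geq 1/\sqrt{\ln 2}$, which gives a closed-form value to check the bisection against. (For heavy integrands like $e^{X^2}$ with Gaussian $X$, a plain Monte Carlo mean is unreliable, so I deliberately use a bounded example.)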

I have come across another definition of the Orlicz norm, though, and I don't understand whether the two are equivalent or completely different things. After some calculations, I would say that they are not, but that seems strange.

The definition is the following: Def. For $p>0$ define $\varphi_p(x) = |1+x|^p$. If $X$ is a random variable, we define $\phi_p(X) = \mathbb{E}\,\varphi_p(X)$.

Def. The Orlicz norm is defined as: $\|X\|_p = \inf \{ t>0: \ln \left(\phi_p\left(\frac{X}{t}\right)\right) \leq p \}$

I have found this definition in a paper by Latala (https://projecteuclid.org/journals/annals-of-probability/volume-25/issue-3/Estimation-of-moments-of-sums-of-independent-real-random-variables/10.1214/aop/1024404522.full)
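To build some intuition, the second infimum can be approximated numerically in the same bisection style. This is again my own sketch, not code from the paper; the helper name `latala_norm` is hypothetical, and the bisection assumes that once the constraint $\ln \phi_p(X/t) \leq p$ holds it keeps holding for larger $t$, which is true for the examples below:

```python
import numpy as np


def latala_norm(samples, p, tol=1e-4):
    """Approximate ||X||_p = inf{t > 0 : ln E|1 + X/t|^p <= p} by
    bisection, using the sample mean as a stand-in for the expectation."""
    def ok(t):
        return np.log(np.mean(np.abs(1 + samples / t) ** p)) <= p

    lo, hi = tol, 1.0
    while not ok(hi):          # grow hi until the constraint holds
        hi *= 2
    while hi - lo > tol:       # shrink the bracket around the infimum
        mid = (lo + hi) / 2
        if ok(mid):
            hi = mid
        else:
            lo = mid
    return hi


rng = np.random.default_rng(0)
x = rng.standard_normal(100_000)
# For a standard normal and p = 2, E(1 + X/t)^2 = 1 + 1/t^2, so the
# closed-form value is 1/sqrt(e^2 - 1), roughly 0.396.
print(latala_norm(x, p=2))
```

The $p=2$ Gaussian case is convenient because $\mathbb{E}(1+X/t)^2 = 1 + 1/t^2$ can be solved by hand, giving a cross-check for the numerics.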

Do you have any intuition for what the second definition represents, and whether it is equivalent to the first one? Thank you in advance!