Here is a statement that I would like to prove:
Let $X \sim \mathcal{N}(\mu, 1)$ and let $\nu > 0$. Show that $$ -\log \mathbb{E}\left[\exp\left(-|X|^\nu\right)\right] \quad\underset{\mu \to +\infty}{\sim}\quad \mu^\nu $$ where $f(x) \sim g(x)$ means $f(x) = g(x) + o(g(x))$.
I have checked numerically that it appears to hold for every $\nu>0$, but I cannot prove it.
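For reference, here is a minimal version of such a numerical check (pure Python, Simpson's rule on a window $[\mu - 15, \mu + 15]$, which carries essentially all of the relevant mass for these parameters; the function name is mine):

```python
import math

def neg_log_E(mu, nu, half_width=15.0, n=3001):
    """Compute -log E[exp(-|X|^nu)] for X ~ N(mu, 1) by Simpson's rule."""
    lo, hi = mu - half_width, mu + half_width
    h = (hi - lo) / (n - 1)
    def integrand(x):
        return math.exp(-abs(x) ** nu - 0.5 * (x - mu) ** 2) / math.sqrt(2 * math.pi)
    s = integrand(lo) + integrand(hi)
    for i in range(1, n - 1):
        s += (4 if i % 2 else 2) * integrand(lo + i * h)
    return -math.log(s * h / 3)

# The ratio should approach 1 as mu grows (slowly for larger nu).
for nu in (0.5, 1.0):
    for mu in (5.0, 10.0, 25.0):
        print(f"nu={nu}, mu={mu}: ratio = {neg_log_E(mu, nu) / mu ** nu:.4f}")
```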
Attempt of proof: Here is what I have tried: \begin{align} \mathbb{E}\left[\exp\left(-|X|^\nu\right)\right] = \sum_{k=0}^\infty \frac{(-1)^k}{k!}\mathbb{E}\left[|X|^{\nu k}\right] \end{align} We have, for any $p>0$ [see here], $$ \mathbb{E}\left[|X|^{p}\right] = \frac{2^{\frac{p}{2}} \Gamma\left[\frac{1}{2} +\frac{p}{2}\right]}{\sqrt{\pi}} M\left(-\frac{p}{2}, \frac{1}{2}, -\frac{\mu^2}{2}\right) $$ where $M$ is Kummer's confluent hypergeometric function. And we have [see here] $$ M\left(-\frac{p}{2}, \frac{1}{2}, -\frac{\mu^2}{2}\right) \quad\underset{|\mu| \to \infty}{\sim}\quad \frac{\Gamma(1/2) \left(\frac{\mu^2}{2}\right)^{\frac{p}{2}}}{\Gamma\left(\frac{1}{2} + \frac{p}{2}\right)} $$ Plugging these two together gives $$ \mathbb{E}\left[|X|^{p}\right] \quad\underset{\mu \to +\infty}{\sim}\quad \mu^p $$ Now I would like to conclude that \begin{align} \mathbb{E}\left[\exp\left(-|X|^\nu\right)\right] &\quad\underset{\mu \to +\infty}{\sim}\quad \sum_{k=0}^\infty \frac{(-1)^k}{k!}\mu^{k \nu} = \exp\left(-\mu^{\nu}\right) \end{align} but I cannot, because I have no control on the error term with respect to $k$ (which would be needed for a dominated or monotone convergence argument to justify taking the limit term by term). Maybe this reference could help: here.
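The intermediate asymptotic $\mathbb{E}[|X|^p] \sim \mu^p$ is at least easy to confirm numerically; a sketch using the same quadrature idea (helper name is mine):

```python
import math

def abs_moment(mu, p, half_width=15.0, n=3001):
    """E[|X|^p] for X ~ N(mu, 1), via Simpson's rule on [mu-hw, mu+hw]."""
    lo, hi = mu - half_width, mu + half_width
    h = (hi - lo) / (n - 1)
    def integrand(x):
        return abs(x) ** p * math.exp(-0.5 * (x - mu) ** 2) / math.sqrt(2 * math.pi)
    s = integrand(lo) + integrand(hi)
    for i in range(1, n - 1):
        s += (4 if i % 2 else 2) * integrand(lo + i * h)
    return s * h / 3

# E[|X|^p] / mu^p should tend to 1 as mu -> infinity, for each fixed p.
for p in (0.5, 1.0, 3.0):
    print(p, [round(abs_moment(mu, p) / mu ** p, 6) for mu in (5.0, 10.0, 20.0)])
```

(For integer $p$ one can cross-check against the exact Gaussian moments, e.g. $E[X^3] = \mu^3 + 3\mu$ for unit variance.)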
Note 1: I have also tried using the Delta method, but I did not succeed.
Note 2: In fact the statement that I need is a bit weaker: $$ \log\left[C_\nu-\log \mathbb{E}\left[\exp\left(-|X|^\nu\right)\right] \right]\quad\underset{\mu \to \infty}{\sim}\quad \nu \log \mu $$ where $C_\nu$ is a constant ensuring the quantity inside the outer log to be positive when $|\mu|>0$.
Here is an argument for $\leq$:
Fix $v>0$. Define $f:[0,\infty)\rightarrow\mathbb{R}$ by $f(y) = \exp(-y^v)$. Then: \begin{align} f'(y) &= -vy^{v-1} \exp(-y^v)\\ f''(y) &= \left[(vy^{v-1})^2 - v(v-1)y^{v-2}\right]\exp(-y^v) \end{align} Notice that if $0 < v \leq 1$ then $f$ is a convex function over the domain $y \geq 0$. Further, if $v>1$, since $f''(y) \geq 0 \iff v y^{v} \geq v-1$, there is a threshold $\theta = \left(\frac{v-1}{v}\right)^{1/v} > 0$ such that $f$ is convex over $[\theta, \infty)$.
Case $0 < v \leq 1$:
By convexity of $f$ for this case we get by Jensen's inequality: $$ E[\exp(-|X|^v)] = E[f(|X|)] \geq f(E[|X|]) = \exp(-E[|X|]^v) $$ Taking $-\log(\cdot)$ of both sides gives: $$ \boxed{-\log(E[\exp(-|X|^v)]) \leq E[|X|]^v} $$ In fact this holds for any random variable $X$ provided that $E[|X|]$ is finite. Notice that if $X$ is $N(\mu,1)$ and $\mu$ is large then $E[|X|]\approx \mu$. This is because: $$ |X| = X - 2X\mathbf{1}\{X<0\} \implies E[|X|] = \mu - 2E[X\mathbf{1}\{X<0\}] $$ and $E[X\mathbf{1}\{X<0\}] = \mu\Phi(-\mu) - \varphi(\mu)$ is $O(e^{-\mu^2/2})$ as $\mu\rightarrow\infty$.
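The claim $E[|X|]\approx\mu$ can be checked against the closed form $E[|X|] = \mu(1-2\Phi(-\mu)) + 2\varphi(\mu)$, which follows from the split above; a small script (function name is mine):

```python
import math

def E_abs(mu):
    """Closed form E|X| = mu*(1 - 2*Phi(-mu)) + 2*phi(mu) for X ~ N(mu, 1)."""
    Phi_neg = 0.5 * math.erfc(mu / math.sqrt(2))             # Phi(-mu)
    phi = math.exp(-0.5 * mu ** 2) / math.sqrt(2 * math.pi)  # phi(mu)
    return mu * (1 - 2 * Phi_neg) + 2 * phi

# The gap E|X| - mu decays like exp(-mu^2/2):
for mu in (1.0, 2.0, 5.0):
    print(mu, E_abs(mu) - mu)
```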
Case $v>1$:
Recall that $f(y)$ is convex over $y \in [\theta, \infty)$. Define the event $A=\{|X|\geq \theta\}$. Conditioned on $A$, the random variable $|X|$ is supported in $[\theta,\infty)$, where $f$ is convex, so the same Jensen argument gives: $$ \boxed{-\log(E[\exp(-|X|^v) \: |A]) \leq E[|X| \: | A]^v} $$ Now when $\mu\rightarrow \infty$ we get $P[A]\rightarrow 1$, and also $E[|X| \: | A] \approx E[|X|] \approx \mu$ for large $\mu$, so the right-hand side is again $\approx \mu^v$.
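One can sanity-check the boxed conditional bound numerically, e.g. for $v=2$ (so $\theta = \sqrt{1/2}$), by integrating against the density of $|X|$ restricted to $[\theta,\infty)$; this is a sketch with my own helper names:

```python
import math

def simpson(f, lo, hi, n=4001):
    """Simpson's rule for integrating f over [lo, hi]."""
    h = (hi - lo) / (n - 1)
    s = f(lo) + f(hi)
    for i in range(1, n - 1):
        s += (4 if i % 2 else 2) * f(lo + i * h)
    return s * h / 3

def conditional_bound(mu, v):
    """Return (lhs, rhs) of -log E[exp(-|X|^v) | A] <= E[|X| | A]^v,
    where A = {|X| >= theta} and theta = ((v-1)/v)^(1/v) (f convex there)."""
    theta = ((v - 1) / v) ** (1 / v)
    phi = lambda x: math.exp(-0.5 * (x - mu) ** 2) / math.sqrt(2 * math.pi)
    dens = lambda y: phi(y) + phi(-y)   # density of |X| on [theta, inf)
    hi = mu + 15.0                      # truncation; the tail beyond is negligible
    pA = simpson(dens, theta, hi)
    lhs = -math.log(simpson(lambda y: math.exp(-y ** v) * dens(y), theta, hi) / pA)
    rhs = (simpson(lambda y: y * dens(y), theta, hi) / pA) ** v
    return lhs, rhs

print(conditional_bound(3.0, 2.0))  # lhs should not exceed rhs
```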