Multivariate Hermite Polynomials and Covariance Matrix


Let $X=(X_1,\dots,X_n)$ be a centered Gaussian random vector, i.e. $X\sim \mathcal{N}(0,\Sigma)$, where $\Sigma$ is the covariance matrix. Let $P_{\Sigma}(x)= (2\pi)^{-n/2}|\Sigma|^{-1/2} \exp \{-\frac{1}{2} x^\ast \Sigma^{-1} x\}$ for $x\in \mathbb{R}^n$, where $|\Sigma|$ denotes the determinant and $x^\ast$ the transpose. Let $\alpha=(\alpha_1,\dots,\alpha_n)$, $\alpha_i\in \mathbb{N}$, be a multi-index, and denote $x^\alpha = x_1^{\alpha_1}\cdots x_n^{\alpha_n}$.

We define the Hermite polynomials of order $\alpha$ as $$H_{\alpha}(x,\Sigma)= P_{\Sigma}(x)^{-1} \left(-\frac{\partial}{\partial x}\right)^{\alpha} P_{\Sigma}(x),\quad x\in \mathbb{R}^n.$$
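As a concrete sanity check (not part of the question itself), here is a short SymPy sketch that evaluates $H_\alpha(x,\Sigma)$ directly from this definition in dimension $n=2$; the covariance matrix and the multi-indices are illustrative choices, not taken from the question.

```python
import sympy as sp

# Compute H_alpha(x, Sigma) = P_Sigma(x)^{-1} (-d/dx)^alpha P_Sigma(x)
# symbolically in dimension n = 2 for a concrete positive-definite Sigma.
x1, x2 = sp.symbols('x1 x2', real=True)
x = sp.Matrix([x1, x2])
n = 2

Sigma = sp.Matrix([[2, 1], [1, 3]])  # an arbitrary positive-definite example
SigmaInv = Sigma.inv()

P = (2 * sp.pi) ** (-sp.S(n) / 2) * Sigma.det() ** sp.Rational(-1, 2) \
    * sp.exp(-sp.Rational(1, 2) * (x.T * SigmaInv * x)[0, 0])

def hermite(alpha):
    """(-d/dx)^alpha P_Sigma, divided by P_Sigma and expanded to a polynomial."""
    D = P
    for var, k in zip((x1, x2), alpha):
        D = sp.diff(D, var, k) * (-1) ** k
    return sp.expand(sp.simplify(D / P))

print(hermite((1, 0)))  # degree-1 polynomial: the first entry of Sigma^{-1} x
print(hermite((1, 1)))  # degree-2 polynomial
```

For instance, $H_{(1,0)}(x,\Sigma)=(\Sigma^{-1}x)_1$ and $H_{(2,0)}(x,\Sigma)=(\Sigma^{-1}x)_1^2-(\Sigma^{-1})_{11}$, which the sketch reproduces.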

I would like to show that $$|H_{\alpha}(z,\Sigma)| \leq C |\Sigma^{-1}|^{k} |z|^k,$$ for some large enough $k=k(\alpha)\geq 0$ and a constant $C$ that may or may not depend on $\alpha$.

Another way to pose the question is whether one can find a constant such that $$\int_{\mathbb{R}^n} \left|\left(\frac{\partial}{\partial x}\right)^{\alpha} P_{\Sigma}(x)\right| dx \leq C |\Sigma^{-1}|^{k},$$ for some large enough $k=k(\alpha)\geq 0$ and a constant $C$ that may or may not depend on $\alpha$. (Note that by the definition above the integrand equals $|H_{\alpha}(x,\Sigma)|\, P_{\Sigma}(x)$.) In the one-dimensional case this is true.
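In one dimension the scaling can be seen by substituting $x=\sigma u$: this gives $\int |P_\sigma^{(k)}(x)|\,dx = \sigma^{-k}\int|\varphi^{(k)}(u)|\,du$, so the bound holds with $|\Sigma^{-1}|=\sigma^{-2}$ and exponent $k/2$. A quick numerical sketch of this (the derivative order, grid, and values of $\sigma$ are arbitrary choices):

```python
import numpy as np
import sympy as sp

# One-dimensional check: sigma^k * int |(d/dx)^k P_sigma(x)| dx
# should be a constant independent of sigma.
x = sp.Symbol('x', real=True)
s = sp.Symbol('sigma', positive=True)
P = sp.exp(-x**2 / (2 * s**2)) / (sp.sqrt(2 * sp.pi) * s)

k = 3  # an arbitrary derivative order for illustration
f = sp.lambdify((x, s), sp.Abs(sp.diff(P, x, k)), 'numpy')

grid = np.linspace(-40.0, 40.0, 400001)
dx = grid[1] - grid[0]
for sigma in (0.5, 1.0, 2.0):
    integral = np.sum(f(grid, sigma)) * dx  # simple Riemann sum
    print(sigma, integral * sigma**k)       # (nearly) the same number each time
```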

Is this true in the higher-dimensional case? In one dimension it seems true and easy to show. Thanks a lot for any hints or help!