Inequality about Hermite polynomial


Let $H_n(x)$ be the $n$-th Hermite polynomial, and let $k,m,n$ be any positive integers. Prove that
$$\int_0^\infty e^{-x^2}H_{2k+1}(x)H_{2m+1}(x)H_{2n+1}(x)\frac{dx}{x}\geq 0.$$

Recall the Rodrigues formula
$$ H_n\left( x \right) =\left( -1 \right) ^ne^{x^2}\frac{d^n}{dx^n}e^{-x^2}, $$
and the orthogonality relation
$$ \left( H_n,H_m \right) =\int_{-\infty}^{\infty}{e^{-x^2}H_n\left( x \right) H_m\left( x \right) dx}=\begin{cases} 0, &n\ne m,\\ 2^nn!\sqrt{\pi}, &n=m. \end{cases} $$

The following triple-product formula may be useful: if $a+b+c=2s$ is even and $s\geq a$, $s\geq b$, $s\geq c$, then
$$ \int_{-\infty}^{\infty}{e^{-x^2}H_a\left( x \right) H_b\left( x \right) H_c\left( x \right) dx}=\sqrt{\pi}\,\frac{2^sa!\,b!\,c!}{\left( s-a \right) !\left( s-b \right) !\left( s-c \right) !}, $$
and otherwise the integral equals $0$.
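Not a proof, but the triple-product formula and the claimed inequality can both be sanity-checked numerically for small indices. Below is a minimal pure-Python sketch (the names `hermite`, `triple_product`, and `odd_triple` are mine, not standard): it evaluates the physicists' Hermite polynomials by the three-term recurrence and approximates the integrals with Simpson's rule, using that $e^{-x^2}$ makes the tails beyond $|x|=10$ negligible for small degrees.

```python
import math

def hermite(n, x):
    # Physicists' Hermite polynomials via the recurrence
    # H_0 = 1, H_1 = 2x, H_{n+1} = 2x H_n - 2n H_{n-1}.
    h0, h1 = 1.0, 2.0 * x
    if n == 0:
        return h0
    for k in range(1, n):
        h0, h1 = h1, 2.0 * x * h1 - 2.0 * k * h0
    return h1

def triple_product(a, b, c, upper=10.0, steps=40000):
    # Simpson approximation of ∫_{-∞}^{∞} e^{-x^2} H_a H_b H_c dx,
    # truncated to [-upper, upper]; compare with the closed formula.
    def f(x):
        return math.exp(-x * x) * hermite(a, x) * hermite(b, x) * hermite(c, x)
    h = 2.0 * upper / steps
    s = f(-upper) + f(upper)
    for i in range(1, steps):
        s += f(-upper + i * h) * (4 if i % 2 else 2)
    return s * h / 3.0

def odd_triple(k, m, n, upper=10.0, steps=20000):
    # Simpson approximation of the integral in the problem:
    # ∫_0^∞ e^{-x^2} H_{2k+1} H_{2m+1} H_{2n+1} dx/x.
    # Each odd H vanishes at 0, so the integrand extends by 0 at x = 0.
    def f(x):
        if x == 0.0:
            return 0.0
        return (math.exp(-x * x)
                * hermite(2 * k + 1, x)
                * hermite(2 * m + 1, x)
                * hermite(2 * n + 1, x) / x)
    h = upper / steps
    s = f(0.0) + f(upper)
    for i in range(1, steps):
        s += f(i * h) * (4 if i % 2 else 2)
    return s * h / 3.0
```

For instance, `odd_triple(0, 0, 0)` integrates $8x^2e^{-x^2}$ over $(0,\infty)$ and should return $2\sqrt{\pi}$, while `triple_product(1, 1, 4)` should vanish because $s = 3 < 4$.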