Let $X:=(X_1,\dots, X_k)$ be a $k$-dimensional multivariate Gaussian random vector with mean $\mu$ and positive-definite covariance matrix $\Sigma.$ Consider its PDF
$$p_X(x):=\frac{1}{\sqrt{(2\pi)^k\det\Sigma}}\,\exp\Big(-\frac{1}{2}(x-\mu)^{T}\Sigma^{-1}(x-\mu)\Big).$$
We know that $\forall v\in \mathbb{R}^k, \forall b \in \mathbb{R},$ the random variable $W:=\langle X,v\rangle+b$ is a Gaussian random variable. So this means that the translated projections of multivariate Gaussian random vectors are Gaussian random variables - this is clear.
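This fact is easy to check numerically. The following sketch (with arbitrarily chosen $\mu,$ $\Sigma,$ $v,$ $b$; it is a sanity check, not an argument) samples $X$ and compares the sample mean and variance of $W=\langle X,v\rangle+b$ with the theoretical values $v^{T}\mu+b$ and $v^{T}\Sigma v$:

```python
import numpy as np

# Sanity check with arbitrarily chosen parameters: if X ~ N(mu, Sigma),
# then W = <X, v> + b should be Gaussian with mean v^T mu + b and
# variance v^T Sigma v.
rng = np.random.default_rng(0)
mu = np.array([1.0, 2.0])
Sigma = np.array([[2.0, 0.5],
                  [0.5, 1.0]])
v = np.array([3.0, -1.0])
b = 0.5

X = rng.multivariate_normal(mu, Sigma, size=200_000)
W = X @ v + b

print(W.mean(), v @ mu + b)    # sample mean vs. v^T mu + b
print(W.var(), v @ Sigma @ v)  # sample variance vs. v^T Sigma v
```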
My questions are:
What if instead of projections of $X$, we try to restrict the Gaussian PDF of $X$?
- Consider the function $\gamma: \mathbb{R}\to\mathbb{R}^k,$ $\gamma(t):=ta+b,$ where $a, b \in \mathbb{R}^k$ and $a\neq 0,$ which represents a straight line not necessarily passing through the origin. Now let's restrict the PDF of $X$ to this line: consider $q:\mathbb{R}\to (0, \infty),$ $q(t):=p_X(\gamma(t))=p_X(ta + b).$ Intuitively, looking at the graph of $q,$ it seems to me that $q$ should be a constant multiple of a Gaussian PDF in one dimension (thanks Kurt G. for his comment); see the screenshot of the drawing:
My calculations are as follows:
$$-\frac{1}{2}(ta+b-\mu)^{T}\Sigma^{-1}(ta+b-\mu)=-\frac{1}{2}\left(t^2a^{T}\Sigma^{-1}a + 2ta^{T}\Sigma^{-1}(b-\mu) + (b-\mu)^{T}\Sigma^{-1}(b-\mu)\right),$$ using the symmetry of $\Sigma^{-1}$ to combine the two cross terms.
So, calling $P_2(t):=(ta+b-\mu)^{T}\Sigma^{-1}(ta+b-\mu)=t^2a^{T}\Sigma^{-1}a + 2ta^{T}\Sigma^{-1}(b-\mu) + (b-\mu)^{T}\Sigma^{-1}(b-\mu),$
So we can always write $q(t)$ as $Ce^{-\frac{1}{2}P_2(t)},$ where $P_2(t)$ is a quadratic polynomial in $t$ with positive coefficient of $t^2$ (namely $a^{T}\Sigma^{-1}a$ above) and $C>0$ is a constant. Thus $q(t)$ should be a constant multiple of a Gaussian PDF; correct me if I'm wrong. P.S. I'm not saying that this new Gaussian PDF will have mean zero.
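The completing-the-square step can also be checked numerically. The sketch below (with arbitrary $\mu,$ $\Sigma,$ $a,$ $b$) writes $P_2(t)=\alpha t^2+2\beta t+\text{const}$ with $\alpha=a^{T}\Sigma^{-1}a$ and $\beta=a^{T}\Sigma^{-1}(b-\mu),$ so the restriction should be proportional to the 1-D Gaussian PDF with mean $-\beta/\alpha$ and variance $1/\alpha,$ and it verifies that the ratio of the two is constant on a grid:

```python
import numpy as np

def gaussian_pdf(x, mu, Sigma):
    """Density of N(mu, Sigma) at the point x in R^k."""
    k = len(mu)
    d = x - mu
    quad = d @ np.linalg.solve(Sigma, d)
    return np.exp(-0.5 * quad) / np.sqrt((2 * np.pi) ** k * np.linalg.det(Sigma))

# Arbitrary test data.
mu = np.array([1.0, -2.0])
Sigma = np.array([[2.0, 0.5],
                  [0.5, 1.0]])
a = np.array([1.0, 3.0])
b = np.array([0.5, 0.5])

# Complete the square: P2(t) = alpha t^2 + 2 beta t + const,
# so q(t) = C * exp(-alpha (t + beta/alpha)^2 / 2).
Si = np.linalg.inv(Sigma)
alpha = a @ Si @ a
beta = a @ Si @ (b - mu)
t0, var = -beta / alpha, 1.0 / alpha

ts = np.linspace(-2, 2, 50)
q = np.array([gaussian_pdf(t * a + b, mu, Sigma) for t in ts])
g = np.exp(-0.5 * (ts - t0) ** 2 / var) / np.sqrt(2 * np.pi * var)
ratio = q / g
print(ratio.max() / ratio.min())  # ~1: q is proportional to a 1-D Gaussian PDF
```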
- If $q$ does represent a constant multiple of a Gaussian, say $G,$ in one dimension, then is there a way to connect $G$ with $W=\langle X,v\rangle + c =X^{T}v + c$ for some $v\in \mathbb{R}^k, c\in \mathbb{R}?$ Put somewhat generally, is there a way to connect the two concepts: (i) restricting the PDF of a Gaussian random vector and (ii) projecting and translating that vector?
- Motivated by question 1, I'm tempted to define a multivariate Gaussian as a random vector in $\mathbb{R}^k$ whose PDF, restricted to any straight line in $\mathbb{R}^k,$ is a constant multiple of some real-valued Gaussian PDF. Would this be a wrong alternate definition? In essence, I'm thinking that this theorem is true: let $f:\mathbb{R}^k\to [0,\infty)$ be a PDF such that $\forall a,b\in \mathbb{R}^k$ with $a\neq 0,$ the function $q(t):=f(ta+b)$ is proportional to some $e^{-\frac{1}{2}{P_2(t)}},$ where $P_2(t)$ is a quadratic in $t$ with positive coefficient of $t^2.$ Then $f$ must be a Gaussian PDF in $\mathbb{R}^k.$ Is this true?
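As a small illustration that this hypothesis is genuinely restrictive (not a proof of the proposed theorem either way): for a non-Gaussian density such as the product-Laplace density $f(x,y)=\frac{1}{4}e^{-|x|-|y|},$ the logarithm of the restriction to a line is piecewise linear rather than quadratic, which can be detected from its second differences:

```python
import numpy as np

# Illustration (not a proof): for the non-Gaussian density
# f(x, y) = (1/4) * exp(-|x| - |y|), the restriction to the line
# gamma(t) = (t, 1) has log f(gamma(t)) = -|t| - 1 + log(1/4),
# which is piecewise linear, not quadratic.  A quadratic has
# constant second differences on an equally spaced grid; this
# log-restriction does not (there is a kink at t = 0).
f = lambda x, y: 0.25 * np.exp(-np.abs(x) - np.abs(y))

ts = np.linspace(-2, 2, 41)  # equally spaced grid through t = 0
logq = np.log(f(ts, 1.0))
second_diffs = np.diff(logq, n=2)
print(second_diffs.min(), second_diffs.max())  # not equal: log q is not quadratic
```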


Too long for a comment.
In my first point I meant $q(t_0-t)=q(\color{red}{t-t_0})\,.$ You need $q=p\circ\gamma$ to be a quadratic function of $t-t_0\,.$ Take $$ p(x,y)=\frac{1}{2\pi\sqrt{1-\rho^2}}\exp\Big(-\frac{x^2-2xy\rho+y^2}{2(1-\rho^2)}\Big)\,,\quad\gamma(t)={t\choose 1}\,. $$ Plugging $\gamma(t)$ into $p$ turns the numerator in the exponential into the polynomial $$ -t^2+2t\rho-1\,. $$ If there were a $t_0$ such that this takes the form $-(t-t_0)^2,$ then $t^2-2t\rho+1$ would have a double root, but that is the case only for $\rho=\pm 1\,.$
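This computation can be checked symbolically; the following sketch substitutes $\gamma(t)=\binom{t}{1}$ into the numerator of the exponent and asks when the resulting quadratic has a double root:

```python
import sympy as sp

t, rho = sp.symbols('t rho', real=True)

# Numerator of the exponent in p(x, y), evaluated along gamma(t) = (t, 1).
x, y = t, 1
numer = sp.expand(x**2 - 2*x*y*rho + y**2)  # t**2 - 2*rho*t + 1
print(numer)

# The quadratic t^2 - 2*rho*t + 1 takes the form (t - t0)^2,
# i.e. has a double root, iff its discriminant vanishes.
disc = sp.discriminant(numer, t)            # 4*rho**2 - 4
print(sp.solve(sp.Eq(disc, 0), rho))        # rho = -1 or rho = 1
```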