Let $X$ be a random variable such that $M_X(t)=e^t M_X(-t)$. Find $E(X)$ and $E\left(X^2\right)$.
I know the general procedure: to find the $n$th moment, take the $n$th derivative of the MGF and then set $t$ to $0$.
$$M_X^{\prime}(t)=e^t M_X(-t)-e^t M_X^{\prime}(-t)$$
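Setting $t=0$ and using $M_X(0)=1$ and $M_X^{\prime}(0)=E[X]$, this becomes
$$M_X^{\prime}(0)=M_X(0)-M_X^{\prime}(0) \implies 2E[X]=1.$$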
The solution is $E[X] = \frac{1}{2}$
Question 1:
For the second moment:
$$\begin{aligned} M_X^{(2)}(t)&=e^t\left(M_X^{(2)}(-t)-2 M_X^{(1)}(-t)+M_X(-t)\right) \\ M_X^{(2)}(0)&=M_X^{(2)}(0)-2 M_X^{(1)}(0)+M_X(0) \end{aligned}$$
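Substituting $M_X(0)=1$ and $M_X^{(1)}(0)=E[X]=\tfrac{1}{2}$, the terms $M_X^{(2)}(0)$ cancel and the second line reduces to the identity
$$0=-2\cdot\tfrac{1}{2}+1=0,$$
which places no constraint on $M_X^{(2)}(0)$.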
So differentiating twice yields no expression for the second moment. Does this mean that $E[X^2]$ is undefined?
Note that any random variable with a finite MGF in a neighborhood of $0$ must have finite moments of all orders, so $E(X^2)$ is certainly defined and finite here.
Let $Y=X-1/2$. Then the hypothesis is equivalent to $M_Y(t)=M_Y(-t)$ for all $t$, which in turn is equivalent to $Y$ and $-Y$ having the same law. Thus $E(Y)=-E(Y)=0$, but there is no information on the variance of $Y$ (which equals the variance of $X$) except that it is finite.
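To spell out the first equivalence:
$$M_Y(t)=E\!\left[e^{t(X-1/2)}\right]=e^{-t/2}M_X(t)=e^{-t/2}\,e^{t}M_X(-t)=e^{t/2}M_X(-t)=M_Y(-t).$$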
For example, $Y$ could have the $N(0,\sigma^2)$ distribution for any $\sigma \ge 0$. Thus all one can say about $E(X^2)$ is that $E(X^2) = \operatorname{Var}(X)+(E X)^2 \ge (E X)^2=1/4.$
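As a check that the normal example really satisfies the hypothesis, take $X=Y+\tfrac{1}{2}$ with $Y\sim N(0,\sigma^2)$; then
$$M_X(t)=e^{t/2+\sigma^2 t^2/2},\qquad e^{t}M_X(-t)=e^{t}\,e^{-t/2+\sigma^2 t^2/2}=e^{t/2+\sigma^2 t^2/2}=M_X(t).$$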