Suppose $X_1,X_2,\ldots,X_n \sim \mathcal N(0,\sigma^2)$. How can I calculate $$\mathbb E\left(\exp\left(\frac{1}{2}\sum_{i=1}^n X_i^2\right)\right)?$$
One of the comments already points to the solution, but I will outline the logic and show the direct computation, which is not immediate. You have not assumed anything about the $X_i$, $1 \leq i \leq n$, except that they are identically distributed as $N(0, \sigma^2)$. I will also assume they are independent, since the notation suggests an i.i.d. sample from a univariate normal distribution.
Note that by the properties of exponentials, $$e^{\frac{1}{2} \sum\limits_{i=1}^{n} X_i^2} = \prod\limits_{i=1}^{n} e^{\frac{1}{2}X_i^2}.$$ Recall that if $X, Y$ are independent random variables and $h$ is any (measurable) function, then $$\mathbb{E}[h(X)h(Y)]= \mathbb{E}[h(X)]\,\mathbb{E}[h(Y)],$$ which gives us $$\mathbb{E}\Big[e^{\frac{1}{2} \sum\limits_{i=1}^{n} X_i^2}\Big] = \mathbb{E}\Big[\prod\limits_{i=1}^{n} e^{\frac{1}{2}X_i^2}\Big] = \prod\limits_{i=1}^{n} \mathbb{E}\big[e^{\frac{1}{2} X_i^2}\big].$$
Therefore, we need only compute $\mathbb{E}\big[e^{\frac{1}{2} X_i^2}\big]$.
Note that if $Z \sim N(0,1)$ then $Z^2 \sim \chi_1^2$ and $\mathbb{E}[Z^2] = 1$. We can write $\frac{1}{2}X_i^2 = \big(\frac{X_i}{\sqrt{2}}\big)^2$.
Define $X_i' = \frac{X_i}{\sqrt{2}}$, so that $X_i' \sim N\big(0,\frac{\sigma^2}{2}\big)$. We can write $X_i' = \frac{\sigma}{\sqrt{2}} Z_i$ where $Z_i \sim N(0,1)$, and therefore $X_i'^2 = \frac{\sigma^2}{2} Z_i^2$. We know $Z_i^2 \sim \chi_1^2 = \Gamma\big(\frac{1}{2}, 2\big)$ in the shape–scale parameterization, and multiplying a Gamma random variable by a constant $\frac{\sigma^2}{2} > 0$ multiplies its scale parameter, so $\frac{\sigma^2}{2} Z_i^2 \sim \Gamma\big(\frac{1}{2},\sigma^2\big)$, a scaled chi-square distribution.
Recall that the moment generating function is defined as $M_X(t) := \mathbb{E}[e^{t X}]$. Since $X_i'^2 \sim \Gamma\big(\frac{1}{2}, \sigma^2\big)$, we have
$$M_{X_i'^2}(t) = (1 - \sigma^2 t)^{-\frac{1}{2}}, \qquad t < \frac{1}{\sigma^2},$$
which is the standard MGF of a Gamma distribution in the shape–scale parameterization.
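For completeness (this derivation is my addition, not part of the original answer), the Gamma MGF follows from the normalization of the Gamma density: for shape $k > 0$ and scale $\theta > 0$,
$$M(t) = \int_0^\infty e^{tx}\,\frac{x^{k-1}e^{-x/\theta}}{\Gamma(k)\,\theta^k}\,dx = \frac{1}{\Gamma(k)\,\theta^k}\int_0^\infty x^{k-1}e^{-x\left(\frac{1}{\theta}-t\right)}\,dx = \frac{\Gamma(k)\big(\frac{1}{\theta}-t\big)^{-k}}{\Gamma(k)\,\theta^k} = (1-\theta t)^{-k},$$
valid for $t < \frac{1}{\theta}$, using $\int_0^\infty x^{k-1}e^{-\lambda x}\,dx = \Gamma(k)\,\lambda^{-k}$. With $k = \frac{1}{2}$ and $\theta = \sigma^2$ this is the formula above.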
Letting $t = 1$, then,
$$M_{X_i'^2}(1) = \mathbb{E}\big[e^{X_i'^2}\big] = \mathbb{E}\big[e^{\frac{1}{2}X_i^2}\big] = (1 - \sigma^2)^{-\frac{1}{2}}.$$
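As an independent sanity check (my addition), the same value follows by integrating directly against the $N(0,\sigma^2)$ density, valid for $0 < \sigma^2 < 1$:
$$\mathbb{E}\big[e^{\frac{1}{2}X_i^2}\big] = \int_{-\infty}^{\infty} \frac{1}{\sqrt{2\pi}\,\sigma}\, e^{\frac{x^2}{2}}\, e^{-\frac{x^2}{2\sigma^2}}\,dx = \int_{-\infty}^{\infty} \frac{1}{\sqrt{2\pi}\,\sigma}\, e^{-\frac{x^2}{2}\cdot\frac{1-\sigma^2}{\sigma^2}}\,dx = \frac{1}{\sigma}\sqrt{\frac{\sigma^2}{1-\sigma^2}} = (1-\sigma^2)^{-\frac{1}{2}},$$
using $\int_{-\infty}^{\infty} e^{-a x^2/2}\,dx = \sqrt{2\pi/a}$ with $a = \frac{1-\sigma^2}{\sigma^2}$.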
This proves that $$\mathbb{E}\Big[e^{\frac{1}{2} \sum\limits_{i=1}^{n} X_i^2}\Big] = \prod\limits_{i=1}^{n} (1 - \sigma^2)^{-\frac{1}{2}} = (1 - \sigma^2)^{-\frac{n}{2}}.$$
Note that the Gamma MGF converges only for $t < \frac{1}{\sigma^2}$, so evaluating it at $t = 1$ requires $0 < \sigma^2 < 1$. In the derivation of the Gamma MGF one sees that this restriction is necessary, or else the integral diverges. That is, this restriction gives the range of $\sigma^2$ for which the expectation is finite.
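A quick numerical check (my addition; a minimal Monte Carlo sketch using NumPy, with illustrative choices $\sigma^2 = 0.3$ and $n = 3$ that are not from the original answer):

```python
import numpy as np

# Monte Carlo check of E[exp(0.5 * sum_i X_i^2)] = (1 - sigma^2)^(-n/2).
# Illustrative parameters (assumed): sigma^2 < 1 is required for the
# expectation to be finite, and sigma^2 < 1/2 additionally keeps the
# variance of exp(0.5 * sum_i X_i^2) finite, so the estimate converges
# at a reasonable rate.
rng = np.random.default_rng(seed=0)
sigma2, n, reps = 0.3, 3, 10**6

# reps independent samples of (X_1, ..., X_n) with X_i ~ N(0, sigma^2)
X = rng.normal(0.0, np.sqrt(sigma2), size=(reps, n))
mc_estimate = np.exp(0.5 * (X**2).sum(axis=1)).mean()
closed_form = (1.0 - sigma2) ** (-n / 2)

print(f"Monte Carlo: {mc_estimate:.4f}  closed form: {closed_form:.4f}")
```

For these parameters the closed form is $(0.7)^{-3/2} \approx 1.7075$, and the Monte Carlo average should land close to that value.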