Assume $X$ is a Gaussian random variable with pdf $$f_X(x)=\frac{1}{\sigma \sqrt{2 \pi}} \exp \left( -\frac{1}{2} \left(\frac{x-\mu}{\sigma}\right)^2 \right),$$ and let $Y=|X|$. Now $Y$ is not Gaussian and, as far as I know, its pdf is given by $$f_Y(y)=\begin{cases} \dfrac{2}{\sigma \sqrt{2 \pi}} \exp \left( -\dfrac{1}{2} \left(\dfrac{y-\mu}{\sigma}\right)^2 \right) & \text{if } y \ge 0,\\ 0 & \text{otherwise.} \end{cases}$$ Trying to calculate the expected value of $Y$ in the usual way, $$E(Y)=\int_{0}^{+\infty} y \,f_Y(y) \,dy,$$ I find $$E(Y)=\mu + \sigma \,\sqrt{\frac{2}{\pi}}.$$ I won't detail the calculations (if you allow me), but can you just tell me whether this result is correct?
Expected value of $|X|$ when $X$ is random Gaussian
Asked by Bumbble Comm (https://math.techqa.club/user/bumbble-comm/detail)
1 Answer
Your calculation of $E(Y)$ is incorrect because you've got the wrong density for $Y$: the factor-of-two folding formula only holds when $\mu = 0$. To calculate $E(|X|)$ you can work with the density of $X$ directly: $$ E|X|=\int _{-\infty}^\infty |x|\, f_X(x)\,dx, $$ and then split this integral into the two cases $x<0$ and $x>0$. When $x>0$ you've got the integral $$\int_0^\infty x\,f_X(x)\,dx. $$
When $x<0$, the integral you seek is $$ \int_{-\infty}^0 (-x)f_X(x)\,dx $$ which you can handle by the substitution $t:=-x$.
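Carrying those two pieces through yields the standard folded-normal mean, $E|X| = \sigma\sqrt{2/\pi}\,e^{-\mu^2/(2\sigma^2)} + \mu\,\mathrm{erf}\!\big(\mu/(\sigma\sqrt 2)\big)$, which reduces to $\sigma\sqrt{2/\pi}$ only when $\mu=0$. As a quick numerical sanity check (my own stdlib-only Python sketch, not part of the original answer), one can compare that closed form against direct numerical integration of $\int |x| f_X(x)\,dx$ and against the formula proposed in the question:

```python
import math

def folded_mean(mu, sigma):
    # Closed form for E|X| when X ~ N(mu, sigma^2) (folded-normal mean):
    # sigma*sqrt(2/pi)*exp(-mu^2/(2 sigma^2)) + mu*erf(mu/(sigma*sqrt(2)))
    return (sigma * math.sqrt(2.0 / math.pi) * math.exp(-mu ** 2 / (2 * sigma ** 2))
            + mu * math.erf(mu / (sigma * math.sqrt(2.0))))

def numeric_mean_abs(mu, sigma, n=200000, width=12.0):
    # Midpoint rule for the integral of |x| f_X(x) over mu +/- width*sigma
    a, b = mu - width * sigma, mu + width * sigma
    h = (b - a) / n
    total = 0.0
    for i in range(n):
        x = a + (i + 0.5) * h
        fx = math.exp(-0.5 * ((x - mu) / sigma) ** 2) / (sigma * math.sqrt(2 * math.pi))
        total += abs(x) * fx * h
    return total

mu, sigma = 1.0, 2.0
print(folded_mean(mu, sigma))               # closed form
print(numeric_mean_abs(mu, sigma))          # agrees with the closed form
print(mu + sigma * math.sqrt(2 / math.pi))  # the question's formula: noticeably larger
```

For $\mu=1$, $\sigma=2$ the first two values agree, while $\mu + \sigma\sqrt{2/\pi}$ overshoots them, confirming that the proposed formula is only valid at $\mu=0$.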
EDIT: To obtain the density $f_Y(y)$, notice that the answer https://math.stackexchange.com/a/2485173/215011 assumes $\mu=0$ and $\sigma=1$. If you want to follow the approach in that calculation, you'd expand $P(-x<X\le x)$ differently: $$\begin{aligned}P(-x<X\le x)& = P(X\le x) - P(X\le -x) \\&= P\left(\frac{X-\mu}\sigma \le \frac {x-\mu}\sigma\right)-P\left(\frac{X-\mu}\sigma\le\frac{-x-\mu}\sigma\right)\\ &= \Phi\left( \frac {x-\mu}\sigma\right)-\Phi\left(\frac{-x-\mu}\sigma\right) \end{aligned} $$ and differentiate that with respect to $x$.
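Differentiating that CDF expression gives, for $y \ge 0$, $$f_Y(y)=\frac1\sigma\left[\varphi\left(\frac{y-\mu}\sigma\right)+\varphi\left(\frac{y+\mu}\sigma\right)\right],$$ where $\varphi$ is the standard normal pdf; note the $y+\mu$ in the second term, which collapses to the question's factor-2 formula only when $\mu=0$. A small Python sketch (mine, not from the linked answer) checks this density against a finite difference of the CDF and verifies it integrates to one:

```python
import math

def phi(z):  # standard normal pdf
    return math.exp(-0.5 * z * z) / math.sqrt(2 * math.pi)

def Phi(z):  # standard normal cdf
    return 0.5 * (1 + math.erf(z / math.sqrt(2)))

def f_Y(y, mu, sigma):
    # Density of Y = |X| from differentiating Phi((y-mu)/s) - Phi((-y-mu)/s)
    return (phi((y - mu) / sigma) + phi((y + mu) / sigma)) / sigma if y >= 0 else 0.0

mu, sigma = 1.0, 2.0
cdf = lambda y: Phi((y - mu) / sigma) - Phi((-y - mu) / sigma)

# f_Y should match the numerical derivative of the CDF ...
h = 1e-6
diff = abs(f_Y(3.0, mu, sigma) - (cdf(3.0 + h) - cdf(3.0 - h)) / (2 * h))
print(diff)   # essentially zero

# ... and integrate to 1 over [0, 40] (about 19 sigmas past mu)
mass = sum(f_Y((i + 0.5) * 0.001, mu, sigma) * 0.001 for i in range(40000))
print(mass)   # close to 1
```

With this corrected density, $E(Y)=\int_0^\infty y\,f_Y(y)\,dy$ reproduces the folded-normal mean rather than $\mu+\sigma\sqrt{2/\pi}$.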