How can we derive the expectation of two dependent normal distributions?

Suppose $\mathbf{X}$ and $\mathbf{Y}$ are dependent normal random variables. How can we derive $$\mathbf{E}\{e^{\mathbf{X}}e^{\mathbf{Y}}\}?$$ I know that each first moment is $\mathbf{E}\{e^{\mathbf{X}}\}=\exp(m_{\mathbf{X}}+\frac12\sigma^2_{\mathbf{X}})$, so if they are independent (no correlation) the expectation factors as $$\mathbf{E}\{e^\mathbf{X}e^\mathbf{Y}\}=\mathbf{E}\{e^\mathbf{X}\}\mathbf{E}\{e^\mathbf{Y}\}=\exp(m_\mathbf{X}+m_\mathbf{Y})\exp(\tfrac12(\sigma^2_\mathbf{X}+\sigma^2_\mathbf{Y})).$$ But if they are dependent with correlation coefficient $\rho$, the expectation becomes $$\mathbf{E}\{e^\mathbf{X}e^\mathbf{Y}\}=\exp(m_\mathbf{X}+m_\mathbf{Y})\exp(\tfrac12(\sigma^2_\mathbf{X}+\sigma^2_\mathbf{Y}+2\rho\sigma_\mathbf{X}\sigma_\mathbf{Y})).$$ I don't know how to explain why the extra term $2\rho\sigma_\mathbf{X}\sigma_\mathbf{Y}$ appears, because the derivation of that equation is quite long. I would appreciate a recommendation of an appropriate textbook or paper that covers this problem. Thank you.
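(Not part of the original question, but a quick way to convince yourself that the correlated formula is right: a minimal Monte Carlo sketch in Python/NumPy, assuming $X$ and $Y$ are jointly normal. The parameter values below are arbitrary illustrative choices, not taken from the post.)

```python
import numpy as np

# Illustrative (hypothetical) parameters -- not taken from the question.
m_X, m_Y = 0.3, -0.5
s_X, s_Y = 1.2, 0.8
rho = 0.6

rng = np.random.default_rng(0)
cov = [[s_X**2,          rho * s_X * s_Y],
       [rho * s_X * s_Y, s_Y**2        ]]
X, Y = rng.multivariate_normal([m_X, m_Y], cov, size=1_000_000).T

mc_estimate = np.mean(np.exp(X) * np.exp(Y))                        # Monte Carlo E[e^X e^Y]
independent = np.exp(m_X + m_Y + 0.5 * (s_X**2 + s_Y**2))           # formula ignoring rho
correlated  = np.exp(m_X + m_Y + 0.5 * (s_X**2 + s_Y**2 + 2 * rho * s_X * s_Y))

print(mc_estimate, independent, correlated)  # mc_estimate should be close to 'correlated', not 'independent'
```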
Consider $Z = a X + b Y$ for some yet-unknown coefficients $a$ and $b$. Then $Z$ is a Gaussian random variable, being a linear combination of the jointly Gaussian random variables $X$ and $Y$.
Moreover $X$ and $Z$ are jointly Gaussian, since $(X,Z)^T = \begin{pmatrix} 1 & 0 \cr a & b \end{pmatrix} (X,Y)^T$. Since $X$ and $Z$ are jointly Gaussian, their distribution is determined by their means and covariance matrix. We will choose the coefficients $a$ and $b$ so that $X$ and $Z$ are independent, i.e. $\operatorname{Cov}(X,Z)=0$: $$ \operatorname{Cov}(X,Z) = a \sigma_X^2 + b \rho \sigma_X \sigma_Y = \sigma_X \left( a \sigma_X + b \rho \sigma_Y \right), $$ which vanishes exactly when $a \sigma_X + b \rho \sigma_Y = 0$. Pick a solution, for example $b = 1$ and $a = -\rho \frac{\sigma_Y}{\sigma_X}$.
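(A minimal numerical sketch of this decorrelation step, not from the original answer, using the same illustrative parameters as above: with $b=1$ and $a=-\rho\sigma_Y/\sigma_X$, the sample covariance of $X$ and $Z$ comes out close to zero.)

```python
import numpy as np

# Illustrative (hypothetical) parameters -- not taken from the post.
m_X, m_Y, s_X, s_Y, rho = 0.3, -0.5, 1.2, 0.8, 0.6

rng = np.random.default_rng(1)
cov = [[s_X**2, rho * s_X * s_Y],
       [rho * s_X * s_Y, s_Y**2]]
X, Y = rng.multivariate_normal([m_X, m_Y], cov, size=1_000_000).T

a, b = -rho * s_Y / s_X, 1.0     # the particular solution chosen above
Z = a * X + b * Y
print(np.cov(X, Z)[0, 1])        # sample Cov(X, Z): should be close to 0
```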
Then $Y = \frac{Z-a X}{b}$, and since $X$ and $Z$ are independent the expectation you seek factors nicely: $$ \mathbb{E}\left(\mathrm{e}^X \mathrm{e}^Y\right) = \mathbb{E}\left(\mathrm{e}^{X (1-a/b)} \mathrm{e}^{Z/b}\right) = \mathbb{E}\left(\mathrm{e}^{X (1-a/b)} \right) \mathbb{E}\left( \mathrm{e}^{Z/b}\right) = \exp\left( \mathbb{E}\left(X(1-a/b)\right) + \frac{1}{2} \operatorname{Var}\left(X(1-a/b)\right)\right) \cdot \exp\left( \mathbb{E}\left(Z/b\right) + \frac{1}{2} \operatorname{Var}\left(Z/b\right)\right). $$ It remains to fill in the rather tedious algebra: $$ \mathbb{E}\left(X(1-a/b)\right) = \mathbb{E}\left(X\right) (1-a/b) = m_X + \rho \frac{\sigma_Y}{\sigma_X} m_X, $$ $$ \mathbb{E}(Z/b) = \frac{a}{b} \mathbb{E}(X) + \mathbb{E}(Y) = m_Y - \rho \frac{\sigma_Y}{\sigma_X} m_X, $$ $$ \operatorname{Var}\left(X(1-a/b)\right) = (1-a/b)^2 \operatorname{Var}(X) = \sigma_X^2 \, \frac{(\sigma_X + \rho \sigma_Y)^2}{\sigma_X^2} = (\sigma_X + \rho \sigma_Y)^2, $$ $$ \operatorname{Var}\left(Z/b\right) = \frac{1}{b^2} \operatorname{Var}(Z) = \frac{1}{b^2} \left( a^2 \sigma_X^2 + b^2 \sigma_Y^2 + 2 \rho a b \sigma_X \sigma_Y \right) = \sigma_Y^2 \left(1 - \rho^2\right). $$
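(The tedious algebra can be checked symbolically. Here is a small SymPy sketch, not part of the original answer, verifying the four expressions above; the symbol names simply mirror the notation of the answer.)

```python
import sympy as sp

m_X, m_Y, rho = sp.symbols('m_X m_Y rho')
s_X, s_Y = sp.symbols('sigma_X sigma_Y', positive=True)

a, b = -rho * s_Y / s_X, 1                     # the choice b = 1, a = -rho*sigma_Y/sigma_X

mean1 = m_X * (1 - a / b)                      # E[X(1 - a/b)]
mean2 = (a / b) * m_X + m_Y                    # E[Z/b]
var1  = (1 - a / b)**2 * s_X**2                # Var[X(1 - a/b)]
var2  = (a**2 * s_X**2 + b**2 * s_Y**2 + 2 * rho * a * b * s_X * s_Y) / b**2   # Var[Z/b]

# Each difference simplifies to 0, confirming the four expressions in the answer.
print(sp.simplify(mean1 - (m_X + rho * s_Y / s_X * m_X)))
print(sp.simplify(mean2 - (m_Y - rho * s_Y / s_X * m_X)))
print(sp.simplify(var1  - (s_X + rho * s_Y)**2))
print(sp.simplify(var2  - s_Y**2 * (1 - rho**2)))
```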
Combining these terms gives the desired result: the exponents of the two factors add up to $$ \left(m_X + \rho \tfrac{\sigma_Y}{\sigma_X} m_X\right) + \left(m_Y - \rho \tfrac{\sigma_Y}{\sigma_X} m_X\right) + \tfrac{1}{2}\left[ (\sigma_X + \rho \sigma_Y)^2 + \sigma_Y^2\left(1-\rho^2\right) \right] = m_X + m_Y + \tfrac{1}{2}\left( \sigma_X^2 + \sigma_Y^2 + 2\rho \sigma_X \sigma_Y \right), $$ so $$ \mathbb{E}\left(\mathrm{e}^X \mathrm{e}^Y\right) = \exp\left( m_X + m_Y + \tfrac{1}{2}\left( \sigma_X^2 + \sigma_Y^2 + 2\rho \sigma_X \sigma_Y \right) \right). $$
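(As a final check, again not part of the original answer, a short SymPy sketch confirming that the four pieces combine to the exponent $m_X+m_Y+\tfrac12(\sigma_X^2+\sigma_Y^2+2\rho\sigma_X\sigma_Y)$.)

```python
import sympy as sp

m_X, m_Y, rho = sp.symbols('m_X m_Y rho')
s_X, s_Y = sp.symbols('sigma_X sigma_Y', positive=True)

# The four pieces derived above.
mean1 = m_X + rho * s_Y / s_X * m_X
mean2 = m_Y - rho * s_Y / s_X * m_X
var1  = (s_X + rho * s_Y)**2
var2  = s_Y**2 * (1 - rho**2)

exponent = (mean1 + mean2) + sp.Rational(1, 2) * (var1 + var2)
target   = m_X + m_Y + sp.Rational(1, 2) * (s_X**2 + s_Y**2 + 2 * rho * s_X * s_Y)
print(sp.simplify(exponent - target))   # 0, so E[e^X e^Y] = exp(target)
```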