Efficient numerical computation of expectation of a discontinuous function of a normally distributed random variable


I am aware of the fact that whenever $f$ is a smooth function, expectations of the form $$ \mathbb{E}[f(X)] $$ can be efficiently computed numerically with the help of Gauss-Hermite quadrature when $X$ is normally distributed. My question is whether there is an extension of Gauss-Hermite quadrature when there is a jump in $f$, so $$ f(x) = \begin{cases} f_1(x) &\quad \text{if } x<\bar{x},\\ f_2(x) &\quad \text{if } x\geq \bar{x},\\ \end{cases}$$ where $\bar{x}$ is known and $f_1$ and $f_2$ are both smooth functions of $x$. In other words: starting from a Gaussian quadrature rule, is it possible to add a node at an arbitrary location in a consistent way? If yes, how should the weights be adjusted? Of course I could just fall back on a Newton-Cotes sort of method and include an extra node at $\bar{x}$, but I would prefer a more efficient way if one exists.
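For reference, the smooth case can be sketched as follows (a minimal sketch, not from the question; NumPy's `hermegauss` uses the probabilists' weight $e^{-x^2/2}$, so the weights must be normalized by $\sqrt{2\pi}$ to form a probability rule):

```python
import numpy as np

# Sketch: standard Gauss-Hermite approximation of E[f(X)] for X ~ N(0, 1),
# using the probabilists' (hermite_e) rule with weight exp(-x^2/2).
def gauss_hermite_expectation(f, n):
    x, w = np.polynomial.hermite_e.hermegauss(n)
    return np.dot(w, f(x)) / np.sqrt(2 * np.pi)  # weights sum to sqrt(2*pi)

# Smooth case: E[X^2] = 1 is reproduced exactly for n >= 2.
print(gauss_hermite_expectation(lambda x: x**2, 5))  # ~1.0
```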

Note, in case it matters for the answer: I would actually need to apply this in more dimensions, so I am interested in approximating $$ \mathbb{E}[f(g(X))], $$ where $g:\mathbb{R}^3\rightarrow \mathbb{R}$ is a smooth function, $f:\mathbb{R}\rightarrow \mathbb{R}$ is as above, and $X$ is a three-dimensional vector of independent, normally distributed random variables. Also note that I cannot hope for an analytical solution, as both $f_1$ and $f_2$ are interpolated.
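The multidimensional smooth case is commonly handled with a tensor product of one-dimensional rules; a minimal sketch (assuming independent standard normal components, so that for other means or variances the nodes would be rescaled):

```python
import numpy as np
from numpy.polynomial.hermite_e import hermegauss

# Sketch: tensor-product Gauss-Hermite approximation of E[f(g(X))]
# for a function of X in R^3 with independent N(0,1) components.
def expectation_3d(fg, n):
    x, w = hermegauss(n)
    w = w / np.sqrt(2 * np.pi)  # normalize to a probability rule
    # Grid of all n^3 node combinations, shape (n, n, n, 3).
    X = np.stack(np.meshgrid(x, x, x, indexing="ij"), axis=-1)
    W = w[:, None, None] * w[None, :, None] * w[None, None, :]
    return np.sum(W * fg(X))

# Example: f(g(x)) = (x1 + x2 + x3)^2, so the expectation is Var = 3.
print(expectation_3d(lambda X: X.sum(axis=-1)**2, 5))  # ~3.0
```

The tensor product costs $n^3$ evaluations, so for larger dimensions a sparse-grid construction would be the usual alternative.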

BEST ANSWER

You can always solve the "tower equations" (so called by Stoer and Bulirsch) $$ \sum_{i=0}^{N-1}w_iH_j(x_i) = \langle H_0(x), H_0(x)\rangle \delta_{j,0}, \qquad j = 0, \dots, N-1, $$ for any set of abscissae $x_i$. This gives exact integration for any polynomial of degree at most $N-1$. However, if the abscissae are no longer the roots of $H_N(x)$, the rule is not exact for any polynomials of higher degree: recall that by using the roots, Gauss-Hermite quadrature is exact for all polynomials of degree up to $2N-1$, and this property is lost when choosing another set of abscissae.
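A minimal sketch of solving the tower equations at arbitrary abscissae (assuming the probabilists' Hermite polynomials with weight $e^{-x^2/2}/\sqrt{2\pi}$, as used later in this answer; the rows for $j>0$ have zero right-hand side, so the normalization of $H_j$ drops out):

```python
import numpy as np
from numpy.polynomial import hermite_e as He

# Sketch: solve  sum_i w_i He_j(x_i) = delta_{j,0},  j = 0..N-1,
# for the weights at an ARBITRARY set of abscissae x.
def tower_weights(x):
    N = len(x)
    # Row j evaluates the j-th probabilists' Hermite polynomial at all nodes.
    A = np.array([He.hermeval(x, np.eye(N)[j]) for j in range(N)])
    rhs = np.zeros(N)
    rhs[0] = 1.0
    return np.linalg.solve(A, rhs)

# At the Gauss nodes this recovers the usual (normalized) Gauss-Hermite weights.
x, w = He.hermegauss(3)
print(np.allclose(tower_weights(x), w / np.sqrt(2 * np.pi)))  # True
```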

If you move all of the abscissae a distance $\epsilon$ away from the roots, then the error for polynomials of degree $>N-1$ can be shown to be proportional to $\epsilon^N$ (https://doi.org/10.3905/jod.2021.1.130).

By moving all of the abscissae a distance $\epsilon$ away from the roots, the equations for the modified weights become $$ \sum_{i=0}^{N-1} (w_i + \delta w_i) H_j(x_i + \epsilon) = \delta_{j0}, $$ where the $w_i$ solve the original tower equations, and I have chosen the statisticians' Hermite polynomials, normalized so that $$ \int_{-\infty}^\infty H_0(x)H_0(x)e^{-\frac{x^2}{2}}\frac{dx}{\sqrt{2\pi}} = 1, $$ which is appropriate here since you are looking at expectations of normally distributed variables.
The equations that determine the weight shifts $\delta w_i$ are $$ \sum_{i=0}^{N-1} \delta w_i H_j(x_i) = (-1)^j\epsilon^j - \delta_{j0}. $$ For $N=2$ this produces $$ (\delta w_0, \delta w_1) = \left( \frac{1}{2}\epsilon,\ -\frac{1}{2}\epsilon\right) $$ resulting in the quadrature rule $$ \mathbb{E}[f(X)] \approx \left(\frac{1}{2}+\frac{1}{2}\epsilon\right)f(-1+\epsilon) + \left(\frac{1}{2}-\frac{1}{2}\epsilon\right)f(1+\epsilon), $$ and for $N=3$ this produces $$ (\delta w_0, \delta w_1, \delta w_2) = \left( \frac{1}{2\sqrt{3}}\epsilon+\frac{1}{6}\epsilon^2,\ -\frac{1}{3}\epsilon^2,\ - \frac{1}{2\sqrt{3}}\epsilon+\frac{1}{6}\epsilon^2 \right) $$ resulting in the quadrature rule $$ \mathbb{E}[f(X)] \approx\left(\frac{1}{6}+\frac{1}{2\sqrt{3}}\epsilon+\frac{1}{6}\epsilon^2\right)f(-\sqrt{3}+\epsilon) + \left(\frac{2}{3} - \frac{1}{3}\epsilon^2\right)f(\epsilon) \\+ \left(\frac{1}{6}-\frac{1}{2\sqrt{3}}\epsilon+\frac{1}{6}\epsilon^2\right)f(\sqrt{3}+\epsilon). $$ For higher order you'd need to solve this numerically.
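As a sanity check (a sketch, not from the original answer), the $N=3$ weights above can be verified by solving the tower equations directly at the shifted nodes $-\sqrt{3}+\epsilon$, $\epsilon$, $\sqrt{3}+\epsilon$ and comparing with the closed form:

```python
import numpy as np

# Sketch: solve sum_i w_i He_j(x_i) = delta_{j0} at the shifted N = 3 nodes
# (probabilists' normalization) and compare with the closed-form weights.
eps = 0.1
x = np.array([-np.sqrt(3) + eps, eps, np.sqrt(3) + eps])
# Rows: He_0 = 1, He_1 = x, He_2 = x^2 - 1.
A = np.vstack([np.ones(3), x, x**2 - 1])
w = np.linalg.solve(A, np.array([1.0, 0.0, 0.0]))

w_closed = np.array([1/6 + eps/(2*np.sqrt(3)) + eps**2/6,
                     2/3 - eps**2/3,
                     1/6 - eps/(2*np.sqrt(3)) + eps**2/6])
print(np.allclose(w, w_closed))  # True
```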

This analysis is done in the appendix of Evergreen Trees: The Likelihood Ratio Method for Binomial and Trinomial Trees, a paper forthcoming in the Journal of Derivatives.