I have a specific problem I'm working on. Let $X$ be an exponential random variable, and let $Y$ be a random variable defined by:
$$ Y = \begin{cases} 0 & \text{ if } X \le d \\ X - d & \text{ if } X > d \end{cases} $$
So that $Y$ is $X$ "with a deductible $d$". We are told that $E(Y) = 0.9\, E(X)$. Applying the law of total expectation and memorylessness gives $E(Y) = P(X>d)\,E(X-d \mid X>d) = P(X>d)\,E(X)$, so we can conclude that $P(X>d) = 0.9$.
I've run into a few problems in computing $E(Y^2)$, and I've noticed a possible theorem that would make it all work very neatly. In general, is it true that for a memoryless random variable $X$ and integrable function $g$,
$$ E(g(X - d) \mid X > d) = E(g(X)) ? $$
$\newcommand{\E}{\mathbb E}$ $$ \E(g(X - d) \mid X > d) = \E(g(X))\text{ ?} $$
For any measurable set $A\subseteq\mathbb R^+$, $$ \Pr(X-d\in A\mid X>d) = \Pr(X\in A). $$ In other words, the conditional probability distribution of $X-d$ given that $X>d$ is the same as the marginal (or "unconditional") probability distribution of $X$.
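For the exponential case this can be checked directly from the survival function: with rate $\lambda$, take $A = (a, \infty)$, so that
$$ \Pr(X - d > a \mid X > d) = \frac{\Pr(X > d + a)}{\Pr(X > d)} = \frac{e^{-\lambda(d+a)}}{e^{-\lambda d}} = e^{-\lambda a} = \Pr(X > a), $$
and since sets of the form $(a,\infty)$ determine the distribution, the full claim follows.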
If two random variables $X$ and $W$ on two different probability spaces have the same distribution, then for any function $g$ for which the expectations are defined, $\E(g(X))=\E(g(W))$.
If $X$ is defined on the space $\Omega$ with measure $P$, then $\{\omega\in\Omega\mid X(\omega)>d\}$ equipped with the measure $A\mapsto \dfrac{P(A)}{P(X>d)}$ is a probability space in its own right.
So the answer is affirmative.
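Applying this to the original question with $g(x) = x^2$: by the law of total expectation (and using $E(X^2) = 2/\lambda^2$ for an exponential with rate $\lambda$),
$$ \E(Y^2) = \Pr(X>d)\,\E\big((X-d)^2 \mid X>d\big) = \Pr(X>d)\,\E(X^2) = 0.9 \cdot \frac{2}{\lambda^2}. $$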
Notice that things like $\Pr(X>d)$ and $\E(X)$ depend only on the probability distribution of $X$ and not on the underlying space $\Omega$. So if two probability distributions of real random variables are the same, then they have the same expectations, quantiles, etc.
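A quick Monte Carlo sanity check of the identity is a sketch only; the rate $\lambda = 1$, deductible $d = 0.5$, and test function $g(x) = x^2$ (the case needed for $E(Y^2)$) are arbitrary choices:

```python
import random

random.seed(0)
lam = 1.0             # rate of the exponential (hypothetical choice)
d = 0.5               # deductible (hypothetical choice)
g = lambda x: x * x   # g(x) = x^2, the case needed for E(Y^2)

n = 200_000
xs = [random.expovariate(lam) for _ in range(n)]

# Unconditional E(g(X))
lhs = sum(g(x) for x in xs) / n

# Conditional E(g(X - d) | X > d): keep samples with X > d, then shift by d
tail = [x - d for x in xs if x > d]
rhs = sum(g(t) for t in tail) / len(tail)

print(lhs, rhs)   # both should be close to E(X^2) = 2/lam^2 = 2
```

Both estimates should agree up to sampling noise, consistent with $\E(g(X-d)\mid X>d) = \E(g(X))$.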