Let $X$ be a random variable and let $A$ be an event. Is there a conditional law of large numbers to estimate $E[X|A]$?
My idea was the following: write $E[X\mid A] = \dfrac{E[X \cdot 1_{A}]}{\mathbb{P}(A)}$ and use a random sample of $X \cdot 1_{A}$ to approximate $E[X \cdot 1_{A}]$ via the law of large numbers. However, I do not know how to formally construct such a sample. Even if I take random samples $\{X_i\}_{i \in \mathbb{N}}$ and $\{1^{i}_A\}_{i \in \mathbb{N}}$, the sequence $\{X_{i} \cdot 1^{i}_{A}\}_{i \in \mathbb{N}}$ need not consist of independent copies of $X \cdot 1_{A}$.
Intuitively, a good approximation should be the sum of the random variables $X_{i} \cdot 1^{i}_{A}$ divided by the number of times that the event $A$ has occurred, but I do not know how to prove this formally.
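As a sanity check, here is a quick Monte Carlo sketch (my own illustration, not part of the question) of this conjectured estimator in a case where the answer is known in closed form: $X \sim \mathrm{Uniform}(0,1)$ and $A = \{X > 1/2\}$, for which $E[X\mid A] = 3/4$.

```python
import random

random.seed(0)

n = 200_000
total = 0.0   # running sum of X_i * 1_A^i
count = 0     # number of times the event A occurred

for _ in range(n):
    x = random.random()   # X ~ Uniform(0, 1)
    if x > 0.5:           # event A = {X > 1/2}
        total += x
        count += 1

# conjectured estimator: sum of X_i * 1_A^i over the number of occurrences of A
estimate = total / count
# true value: E[X | X > 1/2] = 0.75
```

The estimate lands close to $0.75$, which is at least consistent with the conjecture.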
All you need to do is pass from the initial sample space $S$ to the new sample space $A$ and use the law of large numbers there. What I mean by this is that $E[X]$ is basically $E[X\mid S]$, so if we change $S$ to $A$ things shouldn't be very different. So let's try to calculate $E[X\mid A]$.
To do this we can introduce a new r.v. $Y \sim \mathrm{Bernoulli}(p)$ such that $Y = 1$ exactly when the event $A$ occurs (so $p = \mathbb{P}(A)$), and let $f(x,y)$ be the p.f. of the mixed bivariate distribution of $X$ and $Y$ (assuming $X$ is continuous): $$ P(X\in B,\ Y\in C) = \int\limits_B\sum\limits_{y\in C}f(x,y)\,dx. $$
Basically, from here we can calculate the conditional density $g(x\mid y)$ as
$$ g(x\mid y) = \frac{d}{dx}P(X\le x\mid Y=y) = \frac{d}{dx}\frac{P(X\le x,\ Y=y)}{P(Y=y)} = \frac{d}{dx} \frac{\int\limits_{-\infty}^{x}f(t,y)\,dt}{P(Y=y)} = \frac{f(x,y)}{P(Y=y)}. $$
And now we are left only with the calculation of $E[X\mid A]$, i.e. $E[X\mid Y=1]$:
$$ E[X|Y=1] = \int\limits_{-\infty}^{\infty} xg(x|1)dx. $$
As you can see, this is the same formula (with the same properties) for the expectation as the one for $E[X]$; the only thing that differs is the p.d.f. You can think of $g(x\mid 1)$ as a new p.d.f. on your sample space $A$.
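For a concrete illustration (my own example): take $X \sim N(0,1)$ with standard normal density $\varphi$, and $A = \{X > 0\}$, so $P(Y=1) = 1/2$. Then
$$ g(x\mid 1) = \frac{f(x,1)}{P(Y=1)} = \frac{\varphi(x)\,1_{\{x>0\}}}{1/2} = 2\varphi(x)\,1_{\{x>0\}}, \qquad E[X\mid A] = \int\limits_0^\infty 2x\varphi(x)\,dx = \sqrt{2/\pi} \approx 0.798, $$
which is exactly the density of a half-normal distribution, as expected.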
So if you have a sample $\{ (x_1, y_1), (x_2, y_2), \dots, (x_n, y_n)\}$ and you want an approximation of $E[X\mid Y=1]$ using the LLN, you need
$$ E[X\mid Y=1] \approx \frac{\sum_{i=1}^n x_i y_i}{\sum_{i=1}^n y_i}, $$
that is, the average of the $x_i$ over the trials in which $A$ occurred, which is exactly the estimator you guessed. Indeed, by the strong law of large numbers $\frac{1}{n}\sum_{i=1}^n x_i y_i \to E[X \cdot 1_A]$ and $\frac{1}{n}\sum_{i=1}^n y_i \to \mathbb{P}(A)$ almost surely, so the ratio converges almost surely to $E[X \cdot 1_A]/\mathbb{P}(A) = E[X\mid A]$.
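A minimal sketch of this ratio estimator (my own illustration, with hypothetical names), using $X \sim \mathrm{Exponential}(1)$ and $A = \{X > 1\}$, for which memorylessness gives the exact value $E[X\mid A] = 1 + E[X] = 2$:

```python
import math
import random

random.seed(1)

def conditional_mean_estimate(pairs):
    """LLN estimate of E[X | Y = 1]: the sum of x_i * y_i
    divided by the number of trials in which A occurred."""
    num = sum(x * y for x, y in pairs)
    den = sum(y for _, y in pairs)
    return num / den  # undefined if A never occurred in the sample

n = 200_000
sample = []
for _ in range(n):
    x = -math.log(1.0 - random.random())  # X ~ Exponential(1) via inverse CDF
    y = 1 if x > 1.0 else 0               # y = 1_A with A = {X > 1}
    sample.append((x, y))

estimate = conditional_mean_estimate(sample)
# memorylessness gives the true value E[X | X > 1] = 2
```

The estimator only uses the trials where $A$ occurred, which matches the intuition in the question: condition by throwing away the sample points outside $A$.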