Suppose that $X\sim \mathrm{Bin}(n,p)$. Then $E[X]=np$.
I know that, by the Law of Large Numbers, if I have a sample $X_1, \dots, X_n$ of $\mathrm{Bernoulli}(p)$ variables, then $\displaystyle\lim_{n\to\infty} P(|\overline{X} - p| < \epsilon) = 1, \forall \epsilon > 0$.
So in a sense, we can say that for large enough $n$, $X\approx np$ (right?)
My question is: suppose that I have a function of $X$, say $f(X, \theta)$, in which $\theta$ is not stochastic. Suppose that it is too complicated for me to compute $E[f(X, \theta)]$. Can I approximate this expectation by computing $f(np, \theta)$? That is, can I say that $\displaystyle\lim_{n\to\infty} P(|E[f(X, \theta)] - f(np, \theta)| < \epsilon) = 1, \forall \epsilon > 0$? My intuition tells me yes, but I'm not sure this is correct.
If you mean $X \approx np$ where $X \sim \text{Bin}(n,p)$, then that is not true as a limit statement: $X$ does not converge to $np$, since its standard deviation $\sqrt{np(1-p)}$ grows with $n$. I think you meant $\overline{X} = X/n$, so that $\overline{X} \to p$ as $n \to \infty$ (by the SLLN). So if $f(\cdot, \theta)$ is continuous in its first argument, then for large $n$, $f(\overline{X}, \theta)$ converges to $f(p, \theta)$ by the continuous mapping theorem (this theorem also works for convergence in probability).
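A quick simulation illustrates this. Here $f$ is a hypothetical continuous function chosen just for illustration (the poster's actual $f$ is unspecified); as $n$ grows, $f(\overline{X}, \theta)$ gets close to $f(p, \theta)$:

```python
import random

def f(x, theta):
    # A hypothetical continuous function of the sample mean,
    # standing in for the poster's f; theta is non-stochastic.
    return x ** 2 + theta * x

def binomial(n, p, rng):
    # Simulate X ~ Bin(n, p) as a sum of n Bernoulli(p) trials.
    return sum(rng.random() < p for _ in range(n))

rng = random.Random(0)
p, theta = 0.3, 2.0
target = f(p, theta)  # the limit given by the continuous mapping theorem

for n in (100, 10_000, 1_000_000):
    x = binomial(n, p, rng)        # one draw of X ~ Bin(n, p)
    xbar = x / n                   # sample mean of the Bernoulli trials
    print(n, abs(f(xbar, theta) - target))
```

The printed gaps shrink roughly like $1/\sqrt{n}$, since $\overline{X} - p$ has standard deviation $\sqrt{p(1-p)/n}$. Note this shows convergence of $f(\overline{X}, \theta)$, not that $E[f(X,\theta)]$ equals $f(np,\theta)$.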