In the context of a numerical simulation I have normally distributed random variables $A$ and $C$. These variables are based on real data. I know that there is a random variable $B$ with some distribution such that $A + B = C$.
Let $A \sim N(\mu_a, \sigma_a)$
Let $C \sim N(\mu_c, \sigma_c)$
$\sigma_c > \sigma_a$
If I only know $C$ and $A$, how could I go about calculating $B$ using some numerical method?
If all you know are the distributions of $A$ and $C$, you can't determine the distribution of $B$ without some additional assumptions. For example, if $A$ and $B$ are assumed independent, then $B \sim N(\mu_c - \mu_a, \sqrt{\sigma_c^2 - \sigma_a^2})$, since the variances of independent summands add: $\sigma_c^2 = \sigma_a^2 + \sigma_b^2$. But if instead $A$ and $C$ are assumed independent, then $B = C - A$ gives $B \sim N(\mu_c - \mu_a, \sqrt{\sigma_c^2 + \sigma_a^2})$. And there are also examples where $B$ does not have a normal distribution at all (see e.g. here).
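The first case (independent $A$ and $B$) can be checked numerically by sampling. This is a minimal sketch with made-up parameter values; the means and standard deviations below are assumptions for illustration, not values from the question:

```python
import numpy as np

# Hypothetical parameters (assumed for illustration); note sigma_c > sigma_a.
mu_a, sigma_a = 1.0, 0.5
mu_c, sigma_c = 3.0, 1.3

# Under the assumption that A and B are independent:
# B ~ N(mu_c - mu_a, sqrt(sigma_c^2 - sigma_a^2))
mu_b = mu_c - mu_a
sigma_b = np.sqrt(sigma_c**2 - sigma_a**2)

rng = np.random.default_rng(0)
n = 1_000_000

a = rng.normal(mu_a, sigma_a, n)
b = rng.normal(mu_b, sigma_b, n)
c = a + b  # sampled A + B should match the distribution of C

print(c.mean(), c.std())  # should be close to mu_c and sigma_c
```

If you instead assume $A$ and $C$ are independent, sample $C$ and $A$ separately and form `b = c - a`; its standard deviation will come out near $\sqrt{\sigma_c^2 + \sigma_a^2}$ rather than $\sqrt{\sigma_c^2 - \sigma_a^2}$.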