Suppose I have two 1D Gaussian distributions, N(U1, S1) and N(U2, S2), where U is the mean and S is the standard deviation. If we plot these two Gaussians, they overlap on an interval.
My question is: how can I fit a single Gaussian to the overlapping region?
Is it possible to write a simple function to do that? I'd like to implement it to see the outcome, so any help would be appreciated.
More information:
I want the final Gaussian to have values only in the overlap region and to be zero elsewhere.
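As a starting point, here is a minimal sketch of one possible interpretation: take the pointwise minimum of the two PDFs as the "overlap" density, normalize it, and compute the mean and standard deviation of the result as the moment-matched Gaussian fit. The function name `fit_gaussian_to_overlap` and the min-of-PDFs definition of the overlap are my own assumptions, not an established method:

```python
import numpy as np
from scipy.stats import norm

def fit_gaussian_to_overlap(u1, s1, u2, s2, n=10001):
    # Grid wide enough to cover both distributions (assumed +/- 5 sigma).
    lo = min(u1 - 5 * s1, u2 - 5 * s2)
    hi = max(u1 + 5 * s1, u2 + 5 * s2)
    x = np.linspace(lo, hi, n)
    dx = x[1] - x[0]

    # Assumed definition of the overlap: pointwise minimum of the two PDFs.
    overlap = np.minimum(norm.pdf(x, u1, s1), norm.pdf(x, u2, s2))

    # Normalize so the overlap integrates to 1, then moment-match a Gaussian.
    area = overlap.sum() * dx          # overlap coefficient, in (0, 1]
    p = overlap / area
    mu = (x * p).sum() * dx
    sigma = np.sqrt(((x - mu) ** 2 * p).sum() * dx)
    return mu, sigma, area

mu, sigma, area = fit_gaussian_to_overlap(-1.0, 1.0, 1.0, 1.0)
print(mu, sigma, area)
```

For two identical Gaussians the overlap is the full PDF, so the fit recovers the original mean and standard deviation with area 1; for the symmetric example above the fitted mean sits at 0. Note that the fitted Gaussian only matches the first two moments of the overlap; it is not itself zero outside the interval, so the "zero elsewhere" requirement would need a truncated distribution instead.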