I have some binned data (in this case, particle decays) to which I need to apply a roughly Gaussian fit. The binned data show how many events were registered in each part of the detector, which is why you'd expect a Gaussian, centred roughly where the decays are happening.
I have applied a least-squares fit to this data (there are many runs of data), and almost all of the fits come out with very high reduced chi^2 values. I am struggling to fix this; I've received several suggestions and would like some input on them:
1. Combine every 8 histogram bins into a single larger bin, increasing the statistics per bin at the cost of precision and information.
2. Add whole histograms together. Much the same effect, I guess, but this also sacrifices time resolution, since successive histograms are taken roughly 15 seconds apart.
3. Keep only the histograms that the model fits well, and adjust the outlier conditions to reflect those histograms. That seems like cherry-picking the data to me, though.
4. Apparently a least-squares fit cannot validly be used here at all? I didn't fully understand why; the argument was something like: chi^2 least squares breaks down when a histogram has many bins with zero or very few events, because the errors are no longer approximately Gaussian, and a Poisson-based fit would work better. I was unclear on this one.
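For what it's worth, option 1 (merging bins) is straightforward to sketch. This assumes the counts for one histogram sit in a NumPy array; the function name and the choice of a factor of 8 are just illustrative:

```python
import numpy as np

def rebin(counts, factor=8):
    """Merge every `factor` adjacent bins by summing their counts."""
    counts = np.asarray(counts)
    n = (len(counts) // factor) * factor  # drop any leftover bins at the end
    return counts[:n].reshape(-1, factor).sum(axis=1)

# Toy example: 16 bins of low counts -> 2 merged bins
counts = np.array([3, 1, 0, 2, 4, 5, 2, 1, 0, 0, 1, 3, 2, 2, 1, 0])
print(rebin(counts, 8))  # -> [18  9]
```

Note that if the errors per bin are taken as sqrt(N), they should be recomputed from the merged counts, not propagated from the original bins.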
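On option 4: the idea, as I understand it, is that the count in each bin follows a Poisson distribution, so instead of minimising chi^2 you maximise the Poisson likelihood of the observed counts given a Gaussian-shaped expectation. A minimal sketch with scipy (the toy data, parameter names, and starting values here are my own, not from the real detector data):

```python
import numpy as np
from scipy.optimize import minimize
from scipy.stats import poisson

def neg_log_likelihood(params, centres, counts):
    """Poisson negative log-likelihood for a Gaussian-shaped expected count."""
    A, mu, sigma = params
    expected = A * np.exp(-0.5 * ((centres - mu) / sigma) ** 2) + 1e-12  # floor avoids log(0)
    return -np.sum(poisson.logpmf(counts, expected))

# Toy histogram: Poisson counts drawn around a Gaussian shape
rng = np.random.default_rng(0)
centres = np.linspace(-5, 5, 40)
true_shape = 20 * np.exp(-0.5 * ((centres - 0.5) / 1.2) ** 2)
counts = rng.poisson(true_shape)

res = minimize(neg_log_likelihood, x0=[counts.max(), 0.0, 1.0],
               args=(centres, counts), method="Nelder-Mead")
print(res.x)  # fitted amplitude, mean, width
```

Unlike least squares, this handles empty and low-count bins correctly, because the Poisson probability of observing 0 events is perfectly well defined even when the expected count is small.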
Example fitting: [image]

From this data set: [image]

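For reference, the least-squares fit I'm describing is roughly the following; the data here are made up stand-ins for one of the real histograms, and I'm using scipy.optimize.curve_fit with sqrt(N) bin errors:

```python
import numpy as np
from scipy.optimize import curve_fit

def gaussian(x, A, mu, sigma):
    return A * np.exp(-0.5 * ((x - mu) / sigma) ** 2)

# Toy histogram: bin centres and noisy counts around a Gaussian shape
rng = np.random.default_rng(1)
centres = np.linspace(-5, 5, 40)
counts = np.maximum(gaussian(centres, 20, 0.5, 1.2) + rng.normal(0, 1, 40), 0)

popt, pcov = curve_fit(gaussian, centres, counts, p0=[counts.max(), 0.0, 1.0])

# Reduced chi^2 with sqrt(N) errors; floor empty bins at 1 to avoid dividing by zero
errs = np.sqrt(np.maximum(counts, 1))
residuals = counts - gaussian(centres, *popt)
chi2_red = np.sum((residuals / errs) ** 2) / (len(counts) - len(popt))
print(popt, chi2_red)
```

That floor on the errors in empty bins is itself a fudge, which I gather is part of why least squares is suspect here.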
How do I best go about applying a model to analyse this data so that it gives a decent fit (a reasonable reduced chi^2)? Thanks for any help.