I am struggling to wrap my head around the math of a paper I need to understand for my work. In essence, it has to do with super-resolution microscopy, and determining the expected value of a pixel in a detector given a sub-pixel source of light that is scattered according to a (simplified) 2D Gaussian distribution around the "true" source emitter.
Specifically, the part I am struggling with is understanding the steps between the expression as an integral over the unit-square pixel area centered on (x,y) (Eq.2) and the equivalent error function (erf) expression it is converted to (Eq.3/4.1/4.2). Attached is the relevant area of the paper supplemental.
Any and all help, or pointers on where to look or what to try would be greatly appreciated!
Partial page containing relevant information
Equations:
Eq.1: PSF(x,y) = (1/(2 π σ^2)) e^(-((x - θ_x)^2 + (y - θ_y)^2)/(2 σ^2))
... where (θ_x, θ_y) is the position of the emitter
Eq.2: μ_k(x,y) = θ_I0 * INTEGRAL over A_k [PSF(u,v) du dv] + θ_bg
...where μ_k(x,y) denotes expected value in the kth pixel, θ_I0 is expected photon count, θ_bg is expected background count, and the integral is over the finite area A_k of the kth pixel centered on (x,y)
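For context, my current understanding (which may be wrong, so please correct me) is that because the exponential in Eq.1 factors into separate x and y Gaussians, the double integral in Eq.2 should split into a product of two 1D integrals, and each 1D integral should reduce to a difference of erf terms via the substitution t = (u - θ_x)/(σ√2):

```latex
\int_{A_k} \mathrm{PSF}(u,v)\,du\,dv
  = \left[\int_{x-\frac12}^{x+\frac12}
      \frac{1}{\sqrt{2\pi\sigma^2}}\,
      e^{-\frac{(u-\theta_x)^2}{2\sigma^2}}\,du\right]
    \left[\int_{y-\frac12}^{y+\frac12}
      \frac{1}{\sqrt{2\pi\sigma^2}}\,
      e^{-\frac{(v-\theta_y)^2}{2\sigma^2}}\,dv\right],

\int_{x-\frac12}^{x+\frac12}
  \frac{1}{\sqrt{2\pi\sigma^2}}\,
  e^{-\frac{(u-\theta_x)^2}{2\sigma^2}}\,du
  = \frac12\operatorname{erf}\!\left(\frac{x-\theta_x+\frac12}{\sigma\sqrt{2}}\right)
  - \frac12\operatorname{erf}\!\left(\frac{x-\theta_x-\frac12}{\sigma\sqrt{2}}\right),
```

using the definition erf(z) = (2/√π) ∫_0^z e^(-t^2) dt. If that is the intended derivation, I would appreciate confirmation.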
Eq.3: μ_k(x,y) = θ_I0 * ΔE_x(x,y) * ΔE_y(x,y) + θ_bg
Eq.4.1: ΔE_x(x,y) ≡ 1/2 erf((x - θ_x + 1/2)/(σ √2)) - 1/2 erf((x - θ_x - 1/2)/(σ √2))
Eq.4.2: ΔE_y(x,y) ≡ 1/2 erf((y - θ_y + 1/2)/(σ √2)) - 1/2 erf((y - θ y - 1/2)/(σ √2))
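To check that I have at least transcribed the equations correctly, I wrote a small numerical sanity check: it integrates the PSF of Eq.1 over a unit pixel with a midpoint rule and compares the result against the ΔE_x·ΔE_y product of Eq.3/4. The parameter values here (θ_x, θ_y, σ, pixel center) are arbitrary choices of mine, not from the paper:

```python
import math

# Arbitrary test parameters (not from the paper)
theta_x, theta_y, sigma = 0.3, -0.1, 1.2
x, y = 1.0, 0.0  # center of the pixel under test

def psf(u, v):
    """2D Gaussian PSF of Eq.1."""
    return (1.0 / (2.0 * math.pi * sigma**2)) * math.exp(
        -((u - theta_x) ** 2 + (v - theta_y) ** 2) / (2.0 * sigma**2)
    )

def pixel_integral(x, y, n=200):
    """Midpoint-rule approximation of Eq.2's integral over the
    unit pixel centered on (x, y)."""
    h = 1.0 / n
    total = 0.0
    for i in range(n):
        u = x - 0.5 + (i + 0.5) * h
        for j in range(n):
            v = y - 0.5 + (j + 0.5) * h
            total += psf(u, v)
    return total * h * h

def delta_E(c, theta):
    """ΔE of Eq.4, with pixel edges at c ± 1/2 and scale σ√2."""
    s = sigma * math.sqrt(2.0)
    return 0.5 * math.erf((c - theta + 0.5) / s) \
         - 0.5 * math.erf((c - theta - 0.5) / s)

numeric = pixel_integral(x, y)
analytic = delta_E(x, theta_x) * delta_E(y, theta_y)
print(numeric, analytic)  # the two agree to within the quadrature error
```

The two values match for every parameter set I tried, so I believe the erf form is just the closed-form evaluation of the pixel integral; it is the intermediate algebra I cannot follow.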
The Gaussian standard deviation, σ, is experimentally determined.