Turning a summation into an integral


I have a summation of the form:

$$y(x) = \sum\limits_{h=-L}^L\frac{A(h)\cdot R(h)^2}{((x-h)^2+R(h)^2)^{3/2}}$$

where I wish to solve/optimise for $R(h)$ (with $A(h) = \text{const}/h$ held fixed), or for both $R(h)$ and $A(h)$, such that $y(x) = mx + c$ over a range of $x$.

Is there any way to analytically solve this? Perhaps by using orthogonal functions and decomposing the equation and then summing the coefficients?
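For reference, the continuum limit suggested by the title (assuming the 101 values of $h$ are uniformly spaced with step $\Delta h$) would replace the sum by

$$y(x) \approx \frac{1}{\Delta h}\int_{-L}^{L} \frac{A(h)\,R(h)^2}{\left((x-h)^2+R(h)^2\right)^{3/2}}\,dh,$$

which might be easier to attack analytically than the discrete sum.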

For numeric solving the numbers are:

$L = 0.05$

$h$: 101 values

$A(h)$ would ideally have the fixed form $\text{const}/h$

The range of $x$ over which $y \approx mx + c$ should hold would ideally be $0.01 \le x \le 0.04$

What would be the best way to code this for a numeric solution?
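One possible numeric approach is nonlinear least squares: treat the 101 values of $R(h)$ as free parameters and minimise the misfit between $y(x)$ and $mx + c$ on a grid of $x$. A minimal sketch with NumPy/SciPy, assuming evenly spaced $h$ on $[-L, L]$ and placeholder values for $\text{const}$, $m$, and $c$ (none of these are fixed by the question):

```python
import numpy as np
from scipy.optimize import least_squares

# Constants from the question; const, m, c are hypothetical placeholder choices.
L = 0.05
h = np.linspace(-L, L, 101)          # 101 sample points for h (assumed uniform)
h = h[h != 0]                        # A(h) = const/h is singular at h = 0
const = 1.0                          # placeholder value of the constant
A = const / h
m, c = 1.0, 0.0                      # placeholder target slope and intercept
x = np.linspace(0.01, 0.04, 200)     # range where y ≈ m*x + c is required
target = m * x + c

def residuals(R):
    # y(x) = sum_h A(h) R(h)^2 / ((x - h)^2 + R(h)^2)^(3/2),
    # evaluated for all x at once by broadcasting x (column) against h (row).
    d2 = (x[:, None] - h[None, :]) ** 2 + R[None, :] ** 2
    y = (A[None, :] * R[None, :] ** 2 / d2 ** 1.5).sum(axis=1)
    return y - target

R0 = np.full(h.size, 0.01)           # initial guess for R(h)
sol = least_squares(residuals, R0, bounds=(1e-6, np.inf))
print(sol.cost)                      # remaining sum-of-squares misfit
```

The bounds keep every $R(h)$ positive; if you also want to optimise $A(h)$, concatenate it into the parameter vector and split it apart inside `residuals`.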

Thanks in advance for any help!