I have to use the Composite Simpson's Rule to approximate the integral $\int_0^1 t^2\sin(\frac{1}{t})\,dt$. I've worked through the algorithm, but one step is throwing me off.
When I try to compute $XI0 = f(a) + f(b)$, $f(a)$ is $f(0)$, which is undefined ($0\cdot \sin(\frac{1}{0})$ involves division by zero). How am I supposed to handle this? All the other steps work fine.
For reference, here is the algorithm as listed in my textbook, *Numerical Analysis*, Ninth Edition, by Richard L. Burden and J. Douglas Faires:
To approximate the integral $I = \int_a^b f(x) dx$
- INPUT endpoints $a$, $b$; even positive integer $n$
- OUTPUT approximation $XI$ to $I$
- Step 1 Set $h = \frac{b-a}{n}$
- Step 2 Set $XI0 = f(a) + f(b)$; $XI1 = 0$; $XI2 = 0$
- Step 3 For $i = 1, \ldots, n-1$ do Steps 4 and 5
- Step 4 Set $X = a + ih$
- Step 5 If $i$ is even, set $XI2 = XI2 + f(X)$; else set $XI1 = XI1 + f(X)$
- Step 6 Set $XI = h(XI0 + 2\cdot XI2 + 4\cdot XI1)/3$
- Step 7 OUTPUT(XI); STOP
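In case it helps to see the steps above as code, here is a direct Python transcription of the algorithm (a sketch; the `print` at the end just sanity-checks it on $\int_0^1 x^2\,dx = \frac{1}{3}$, for which Simpson's rule is exact):

```python
def composite_simpson(f, a, b, n):
    """Composite Simpson's rule; n must be an even positive integer."""
    if n <= 0 or n % 2 != 0:
        raise ValueError("n must be an even positive integer")
    h = (b - a) / n                     # Step 1
    XI0 = f(a) + f(b)                   # Step 2
    XI1 = 0.0                           # sum of f at odd-index nodes
    XI2 = 0.0                           # sum of f at even-index nodes
    for i in range(1, n):               # Step 3
        X = a + i * h                   # Step 4
        if i % 2 == 0:                  # Step 5
            XI2 += f(X)
        else:
            XI1 += f(X)
    return h * (XI0 + 2 * XI2 + 4 * XI1) / 3  # Step 6

# Sanity check: Simpson's rule is exact for polynomials of degree <= 3,
# so this should print 1/3 (up to rounding).
print(composite_simpson(lambda x: x**2, 0.0, 1.0, 10))
```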
You can figure out what value $f(0)$ "should" be for the algorithm to converge optimally. Simpson's rule works best when the integrand is continuous, so ask: what is the limit of $t^2 \sin \frac{1}{t}$ as $t \to 0$? Defining $f(0)$ to be that limit makes the extended function continuous on $[0,1]$ (the singularity is removable), and the integral is unchanged by redefining $f$ at a single point.
I'll leave this as an exercise, but if you want a hint...
Hint: $|\sin u| \le 1$ for all $u$.
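And once you have worked out the limit, here is a minimal sketch of how the extended integrand might look in code, so the algorithm's Step 2 is well defined (spoiler: the `f(0)` value below is the answer to the exercise; the `simpson` helper is just a compact composite Simpson's rule, and $n = 100$ is an arbitrary choice):

```python
import math

def f(t):
    # Removable singularity: |t**2 * sin(1/t)| <= t**2 -> 0 as t -> 0,
    # so defining f(0) = 0 extends f continuously to [0, 1].
    return 0.0 if t == 0.0 else t * t * math.sin(1.0 / t)

def simpson(f, a, b, n):
    # compact composite Simpson's rule (n must be even)
    h = (b - a) / n
    s = f(a) + f(b)
    for i in range(1, n):
        s += (2 if i % 2 == 0 else 4) * f(a + i * h)
    return h * s / 3

print(simpson(f, 0.0, 1.0, 100))  # f(a) = f(0) no longer blows up
```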