Using Bayes' theorem I have derived $P(D^c\mid -) = \frac{0.98-0.98p}{0.98-0.93p}$ on the interval $[0, 0.1]$.
Show (e.g., by means of a Taylor series) that on this interval $P(D^c\mid -)$ is approximated by $1 - 0.056p$. I've tried this, and for some reason I keep ending up with $1-0.3p$. I know it's probably just a careless mistake in applying the series, so any help would be appreciated.
Because $p$ ranges over $[0, 0.1]$, it makes the most sense to center the series at $0.05$:
$f(p)\approx f(.05)+f'(.05)(p-.05)+\cdots$ (higher-order terms)
$f(p)=\frac{(0.98-0.98p)}{(0.98-0.93p)}$
$f'(p)=\frac{-.98(.98-.93p)+.93(.98-.98p)}{(0.98-0.93p)^2}=\frac{-.049}{(0.98-0.93p)^2}$

(the $p$ terms in the numerator cancel, leaving the constant $-.049$), so

$f(.05)=.9973$ and $f'(.05)=-.0562$
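As a quick sanity check on these two values, here is a short Python sketch that evaluates $f$ and the simplified closed-form derivative at $p=.05$ (the function names `f` and `fprime` are just illustrative):

```python
def f(p):
    """P(D^c | -) as a function of the prevalence p."""
    return (0.98 - 0.98 * p) / (0.98 - 0.93 * p)

def fprime(p):
    """Derivative of f; the quotient-rule numerator collapses to -0.049."""
    return -0.049 / (0.98 - 0.93 * p) ** 2

print(round(f(0.05), 4), round(fprime(0.05), 5))  # 0.9973 -0.05623
```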
$f(p)\approx .9973-.0562(p-.05)=1.0001-.0562p\approx 1-.056p$
Side note: if you center it at $0$, you get $f(0)=1$ and $f'(0)=-.049/.98^2=-.051$, i.e. $1-.051p$.