Fixed-point iteration convergence proof


I need to check whether the equation $g(x)=x$ for $g(x)=\frac{e^x}{1+e^x}$ can be solved by fixed-point iteration in $[0,1]$.
Equivalently, the original equation is $f(x)=\frac{e^x}{1+e^x}-x$, and we look for its roots in $[0,1]$.
I take the following steps: $g'(x)=\frac{e^x}{(e^x+1)^2}$ and $g''(x)=\frac{-e^x(e^x-1)}{(e^x+1)^3}$.
Since $g''(x)\leq 0$ on $[0,1]$, $g'$ is monotonically decreasing there, so $g'$ has no interior extreme points in $[0,1]$ and we only need to check the endpoints.
$g'(0)=1/4$ and $g'(1)\approx 0.197$, so $|g'(x)|<1$ at both endpoints, and hence on all of $[0,1]$; therefore the iteration converges to a solution in $[0,1]$.
Indeed, $f(0)=0.5$ and $f(1)\approx -0.269$, so by the intermediate value theorem the root must lie in $[0,1]$.
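As a quick numerical sanity check (a sketch in Python, not part of the original argument; the helper name `g` just mirrors the notation above), one can simply run the iteration $x_{k+1}=g(x_k)$ and confirm it settles at a point of $[0,1]$ where the residual $g(x)-x$ vanishes:

```python
import math

def g(x):
    # g(x) = e^x / (1 + e^x), the iteration function from the question
    return math.exp(x) / (1.0 + math.exp(x))

# Fixed-point iteration x_{k+1} = g(x_k) starting from x0 = 0
x = 0.0
for _ in range(50):
    x = g(x)

print(x)         # limit of the iteration, a point in [0, 1]
print(g(x) - x)  # residual f(x) = g(x) - x, numerically zero
```

After 50 steps the residual is at machine-precision level, consistent with the contraction estimate derived below from $|g'|$.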


2 Answers


Well, yes. But it is actually easier. For all $x \in [0, 1]$: $$ \lvert g'(x) \rvert = \frac{e^x}{(1+e^x)^2} \leq \frac{e^x+1}{(1+e^x)^2} = \frac{1}{1+e^x} \leq \frac{1}{2}. $$ Then we observe that for $x \in [0, 1]$ we have $$ 0 \leq g(x) = \frac{e^x}{1+e^x} \leq \frac{1+e^x}{1+e^x} = 1, $$ which means that $g$ maps $[0, 1]$ into itself. Consequently, the fixed-point iteration converges for every initial value in $[0, 1]$.
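Both bounds, and the resulting convergence from any starting value, are easy to spot-check numerically. A small Python sketch (the grid check is illustrative, not a proof; names `g`, `g_prime` are mine):

```python
import math

def g(x):
    return math.exp(x) / (1.0 + math.exp(x))

def g_prime(x):
    # g'(x) = e^x / (1 + e^x)^2
    return math.exp(x) / (1.0 + math.exp(x)) ** 2

# Check |g'(x)| <= 1/2 and 0 <= g(x) <= 1 on a grid over [0, 1]
xs = [i / 1000 for i in range(1001)]
assert all(abs(g_prime(x)) <= 0.5 for x in xs)
assert all(0.0 <= g(x) <= 1.0 for x in xs)

# Iterate from several initial values: all runs reach the same fixed point
results = {}
for x0 in (0.0, 0.25, 0.5, 1.0):
    x = x0
    for _ in range(60):
        x = g(x)
    results[x0] = x

print(results)
```

Since the contraction factor is at most $1/2$, sixty iterations reduce the initial error by a factor of about $2^{-60}$, so all four runs agree to machine precision.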


Let $(X,d)$ be a complete metric space, and let $f : X \to X$ be a function with the following property:

There exists $\alpha \in [0,1)$ such that $d(f(u),f(v)) \leq \alpha d(u,v)$ for any $u,v \in X$.

Then the Banach fixed-point theorem says that there exists a unique $x^* \in X$ such that $f(x^*) = x^*$. Moreover, if $x_0$ is any point in $X$, then for any integer $n \geq 1$ we have that $$d \Big(x^*, \underbrace{f(\cdots(f}_{n \textrm{ times}}(x_0)) \cdots) \Big) \leq \frac{\alpha^n}{1-\alpha}d(f(x_0),x_0). \tag{$*$}$$

Now, we know that $([0,1],d)$ (where $d(a,b) := |a-b|$ for $a,b \in [0,1]$) is a complete metric space, and that $g : [0,1] \to [0,1]$ (note that $\frac{e^x}{1+e^x} \in [0,1]$ for any $x \in [0,1]$) is a function with the above quoted property: if $$\alpha := \max_{x \in [0,1]} |g'(x)|$$ then $\alpha \in [0,1)$ (as @Meowdog noted) and $|g(u)-g(v)| \leq \alpha |u-v|$ for any $u,v \in [0,1]$ (for example, if $u<v$, by the mean value theorem there is a $c \in (u,v)$ such that $g(u)-g(v) = g'(c)(u-v)$, and then $|g(u)-g(v)| = |g'(c)||u-v| \leq \alpha |u-v|$).
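The Lipschitz estimate obtained from the mean value theorem can be verified empirically. A sketch (assuming $\alpha = g'(0) = \frac14$, the maximum of $|g'|$ on $[0,1]$, since $g'$ is decreasing there; the random sampling is only an illustration, not a proof):

```python
import math
import random

def g(x):
    return math.exp(x) / (1.0 + math.exp(x))

alpha = 0.25  # alpha = max |g'| on [0,1] = g'(0) = 1/4

# Test |g(u) - g(v)| <= alpha * |u - v| on random pairs in [0, 1]
random.seed(0)
violations = 0
for _ in range(10000):
    u, v = random.random(), random.random()
    if abs(g(u) - g(v)) > alpha * abs(u - v) + 1e-15:
        violations += 1

print(violations)  # expected: 0
```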

Thus, by the Banach fixed-point theorem, we know that the equation $g(x^*)=x^*$ has a unique solution in $[0,1]$, and the inequality $(*)$ helps us to approximate it. For example, if we take $x_0=0$, then the distance between $$g(g(x_0)) = g(\tfrac12) = \frac{\sqrt e}{1+\sqrt e}$$ and the true solution $x^*$ is at most $$\frac{\alpha^2}{1-\alpha} |g(x_0)-x_0| = \frac{(\tfrac14)^2}{1-\tfrac14} \cdot \frac12 = \frac1{24} \approx 0.04167$$ (it is easy to see that $\alpha = |g'(0)| = \frac14$).
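The a priori bound $(*)$ with $n=2$ can be checked against a high-accuracy approximation of $x^*$. A Python sketch (the reference value for $x^*$ is itself obtained by iterating many times, which is justified since $g$ is a contraction):

```python
import math

def g(x):
    return math.exp(x) / (1.0 + math.exp(x))

# High-accuracy fixed point: iterate far past machine precision
x_star = 0.0
for _ in range(200):
    x_star = g(x_star)

# Two iteration steps from x0 = 0: g(g(0)) = sqrt(e) / (1 + sqrt(e))
x2 = g(g(0.0))
err = abs(x_star - x2)

# A priori bound (*): alpha^n/(1-alpha) * |g(x0) - x0| with alpha = 1/4, n = 2
bound = (0.25 ** 2) / (1 - 0.25) * abs(g(0.0) - 0.0)

print(err, bound)  # actual error vs. the 1/24 bound
```

The actual error after two steps is about $0.0366$, comfortably below the guaranteed bound $\frac1{24} \approx 0.0417$.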