Why does Newton's method work?

Asked by Bumbble Comm (https://math.techqa.club/user/bumbble-comm/detail)

I find many sites explaining how to use Newton's method, but none explaining why it works. Could someone give me the intuition behind it? Thanks.

The method is easiest to justify in one dimension. Say that I have some complicated function $f(x)$ whose root I want to find. "I don't know how to find its root; it's complicated!" Thus, we use a general idea that has always been used in the design of numerical methods: when a function is too complicated to handle directly, replace it with a simpler function that approximates it.
One of the simplest functions one can deal with is a linear function:
$$f(x)=mx+b$$
In particular, if you want the root of a linear function, it is easily found (provided $m\neq 0$):
$$x=-\frac{b}{m}$$
Now, it is well-known (or at least, it ought to be) that the tangent line of a function is the "best" linear approximation of that function in the vicinity of its point of tangency.
The first idea of the Newton-Raphson method is this: since it is easy to find the root of a line, we pretend that our complicated function is its tangent line, find the root of that line, and hope that the line's crossing of the axis is an excellent approximation to the root we actually need.
Mathematically, we replace $f(x)$ near a "starting point" $x=a$ with its tangent line there:
$$f(x)\approx f(a)+f^\prime(a)(x-a)$$
Setting the right-hand side equal to zero and solving for $x$ gives
$$x=a-\frac{f(a)}{f^\prime(a)}$$
Let's call this $x_1$.
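For concreteness, take $f(x)=x^2-2$, whose positive root is $\sqrt{2}\approx 1.41421$. Starting from $a=1$, we have $f(1)=-1$ and $f^\prime(1)=2$, so $x_1=1-\frac{-1}{2}=1.5$.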
This first approximation is typically still a bit far off ($1.5$ versus $1.41421\dots$ in the example above), which brings us to the second idea of Newton-Raphson: if at first you don't succeed, try again.
Repeating the construction from this new point lands much nearer the actual root. Mathematically, this corresponds to finding the root of the new tangent line at $x=x_1$:
$$x_2=x_1-\frac{f(x_1)}{f^\prime(x_1)}$$
We can keep playing this game (producing $x_2, x_3, \dots, x_n$) until we reach a point where the quantity $\dfrac{f(x_n)}{f^\prime(x_n)}$ is "tiny". We then say that we have converged to an approximation of the root. That is the essence of Newton-Raphson.
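To make the iteration explicit, here is a minimal sketch in Python; the names `newton`, `f`, `fprime`, and the stopping tolerance are my own illustrative choices, not part of any particular library:

```python
def newton(f, fprime, x0, tol=1e-12, max_iter=50):
    """Find a root of f by the Newton-Raphson iteration, starting from x0."""
    x = x0
    for _ in range(max_iter):
        step = f(x) / fprime(x)   # the tangent line at x crosses zero at x - step
        x -= step
        if abs(step) < tol:       # the update is "tiny": declare convergence
            return x
    raise RuntimeError("Newton-Raphson did not converge")

# Example: approximate sqrt(2) as the positive root of x^2 - 2, starting at 1
print(newton(lambda x: x**2 - 2, lambda x: 2 * x, 1.0))  # ~1.4142135623730951
```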
As an aside, the previous discussion should tip you off about what might happen if the tangent line is nearly horizontal, which is one of the disastrous things that can occur while applying the method.
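For instance, with $f(x)=x^2-2$ and starting point $a=0$, the tangent line is horizontal ($f^\prime(0)=0$): it never crosses the axis, and the update $a-\frac{f(a)}{f^\prime(a)}$ is undefined. Starting very close to $0$ is almost as bad, since dividing by the tiny $f^\prime(a)$ flings $x_1$ far away from the root.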