The Significance of Linear Approximation


I want to know what makes linear approximation so important (or useful). What I am aware of in my current state of limited understanding is that linear approximation is one of the applications of a derivative and that it is used to approximate the value of a function at a point.

Please forgive my naivete. Here I go.

Linear approximation for a function f(x) is given by

$$f(x) \approx f(x_0) + f'(x_0)(x-x_0)$$

For example, near $x = 0$,

$$\ln(1+x) \approx x$$

Using the definition of linear approximation above, the value of the function at x = 0 is equal to 0.

I hope I don't sound really stupid, but I can just plug the value x = 0 into the original function ln(1 + x) and get the same answer without even having to know what the linear approximation is (well, that's just what ln(1 + x) ≈ x means).

But if one can just evaluate a function at a point and get an answer that's more or less the same as the answer found by linear approximation, is it even necessary to know what the linear approximation is?

I can see that linear approximation can be used to simplify a complicated function into a tremendously simple one. For example, the function g(x) is given by the equation

$$ g(x) = \frac{e^{-3x}}{\sqrt{1+x}} $$

and its linear approximation near x = 0 is

$$ g(x) = \frac{e^{-3x}}{\sqrt{1+x}} \approx 1 - \frac{7}{2}x $$

The linear approximation looks tremendously simple as compared to the ugly-looking g(x).
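To see how close the two actually are, here is a quick numerical check of $g(x)$ against its linear approximation near $x = 0$ (a minimal sketch; the function names are my own):

```python
import math

def g(x):
    # The original function: e^(-3x) / sqrt(1 + x)
    return math.exp(-3 * x) / math.sqrt(1 + x)

def g_lin(x):
    # Its linear approximation near x = 0: 1 - (7/2) x
    return 1 - 3.5 * x

for x in (0.01, 0.05, 0.1):
    print(x, g(x), g_lin(x), abs(g(x) - g_lin(x)))
```

For small $x$ the error is tiny (about $6 \times 10^{-4}$ at $x = 0.01$) and grows as you move away from $0$, which is exactly what one expects from a first-order approximation.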

  • Besides simplification, are there other applications of linear approximation? I've read about some applications on Wikipedia, but I would like to hear from the users.
  • Does using linear approximation in a computer program make computations more efficient?
  • Can the same thing be said for quadratic approximations, too?

Thank you so much for answering!

4 Answers

Best Answer

It very often happens in applications that a model produces equations that are extremely difficult or impossible to solve. However, some of the factors are more important than others. There is often a parameter, let's say $p$, whose values are typically small, corresponding to one of these less important factors. If you set $p$ to $0$, thus ignoring that factor completely, it simplifies the situation so much that the solution becomes easy. Thus if you're looking for $F(p)$, you compute $F(0)$ instead.

But you don't want to ignore the factor completely, so the next thing to try is a linear approximation. Even though you can't compute $F(p)$ when $p \ne 0$, you may be able to find $F'(0)$, and thus you can use the linear approximation $F(0) + p F'(0)$, which should produce a good approximation to $F(p)$ when $p$ is small.

If you want even better approximations, you can try quadratic and higher-order approximations.

Answer

In many applications, you can treat an error on the order of 0.0001 as zero.

For example, in my field, communication, the acceptable error varies by application from about $10^{-4}$ to $10^{-12}$. So when we can simplify a complex equation and keep the same effective accuracy, why not? On the other hand, think of a machine with limited resources that must compute the answers of interest: by approximation we can reduce the resource consumption significantly while maintaining accuracy.

Answer

Most "continuous"* problems to be solved by a computer are solved by some sort of approximation procedure. Some such problems include evaluating a function like $e^x$, solving an algebraic equation like $x^5-x-1=0$, calculating an integral, and solving differential equations. Linear approximation is basically the simplest kind of approximation other than constant approximation.

Let me expand upon this example of solving algebraic equations. When we do it with linear approximation, the result is called Newton's method. Here you have an algebraic equation in the form $f(x)=0$ and you have a guess for the solution $x_0$. Newton's method then replaces your guess with the solution to the equation $g(x)=0$, where $g(x)$ is the linear approximation of $f$ at $x_0$, i.e. $g(x)=f(x_0)+f'(x_0)(x-x_0)$. The equation $g(x)=0$ is trivial to solve. Now you have a guess $x_1$, and you repeat the procedure. This works very well provided the initial guess is good. Try it out with the problem of finding $\sqrt{2}$, considered as the root of $x^2-2$, with the initial guess $x_0=1$.
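The iteration described above can be sketched in a few lines (a minimal sketch; the function names are my own):

```python
def newton(f, fprime, x0, steps=6):
    # Each step solves the linearization f(x) + f'(x)(x_new - x) = 0 for x_new
    x = x0
    for _ in range(steps):
        x = x - f(x) / fprime(x)
    return x

# Find sqrt(2) as the root of x^2 - 2, starting from the guess x0 = 1
root = newton(lambda x: x * x - 2, lambda x: 2 * x, 1.0)
print(root)  # converges to sqrt(2) ≈ 1.41421356...
```

Starting from $x_0 = 1$, the iterates are $1.5$, $1.41\overline{6}$, $1.4142157\ldots$, and the method reaches machine precision within a handful of steps, illustrating its rapid (quadratic) convergence from a good initial guess.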

There are situations where we have a problem with a small parameter, and we want to approximate the solution to the problem in terms of the solution to the problem where the parameter is zero, plus correction terms. Sometimes this can be done exactly as stated; such problems are called regular perturbation problems. (For example, the problem of solving $x^2-\varepsilon=0$ for small $\varepsilon>0$ is a regular perturbation problem.) Other times there is no solution to the problem with the parameter being zero; these problems are called singular perturbation problems. (For example, the problem of solving $\varepsilon x^2-1=0$ for small $\varepsilon>0$ is a singular perturbation problem.) Both of these come up quite frequently in applications.

There are also truly "analytic" applications, where iterated approximation allows us to demonstrate something exact, such as the existence of a solution to a problem. Real examples of this are probably above your current level, so I won't comment further unless requested.

*As opposed to "discrete"; I won't try to fully clarify this distinction, because it's not productive here. I just didn't want to say something blatantly false.

Answer

I'll give a classic example in physics. A pendulum. A simple swinging pendulum.

The differential equation that describes the angle $\theta(t)$ of a pendulum from the vertical is:

$$\ddot{\theta}(t)+k\sin(\theta(t))=0$$

This differential equation has no solution in terms of elementary functions. However, you may make the small-angle approximation $\sin(\theta)\approx \theta$ to get: $$\ddot{\theta}(t)+k\theta(t)=0$$

This is very solvable.
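To see how well the linearized solution $\theta(t) = \theta_0 \cos(\sqrt{k}\,t)$ tracks the true pendulum, one can integrate the full equation numerically and compare (a sketch in pure Python; the integrator and names are my own):

```python
import math

def simulate(theta0, k, t_end, n=10000):
    """Integrate theta'' + k*sin(theta) = 0 by RK4, starting at rest at angle theta0."""
    h = t_end / n
    theta, omega = theta0, 0.0

    def f(th, om):
        # First-order system: theta' = omega, omega' = -k*sin(theta)
        return om, -k * math.sin(th)

    for _ in range(n):
        a1 = f(theta, omega)
        a2 = f(theta + h / 2 * a1[0], omega + h / 2 * a1[1])
        a3 = f(theta + h / 2 * a2[0], omega + h / 2 * a2[1])
        a4 = f(theta + h * a3[0], omega + h * a3[1])
        theta += h / 6 * (a1[0] + 2 * a2[0] + 2 * a3[0] + a4[0])
        omega += h / 6 * (a1[1] + 2 * a2[1] + 2 * a3[1] + a4[1])
    return theta

k, theta0, t = 1.0, 0.1, 2.0
small_angle = theta0 * math.cos(math.sqrt(k) * t)  # solution of the linearized equation
print(simulate(theta0, k, t), small_angle)
```

For a small release angle like $\theta_0 = 0.1$ rad the two agree to better than $10^{-3}$; for large angles the linearized solution drifts out of phase, since the true period depends on the amplitude.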

If the equation for something as simple as a pendulum can't be solved in elementary terms, imagine anything that is remotely complicated.