Interpreting the meaning of change in a variable


If $z$ is a function depending on some other variables, let's say $x$ and $y$, I have learnt that we can write

$$\delta z=\frac{\partial{z}}{\partial{x}}\delta{x}+\frac{\partial{z}}{\partial{y}}\delta{y}$$

Or if a function simply depends on a single variable, for example if $y$ depends on $x$, then we can write $$\delta y=\frac{\mathrm{d}y}{\mathrm{d}x}\delta x$$
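For a concrete feel, the first formula can be checked numerically. The sketch below uses the arbitrary example $z = xy$ (my own choice, just for illustration), whose partial derivatives are $\frac{\partial z}{\partial x} = y$ and $\frac{\partial z}{\partial y} = x$:

```python
# Sketch: numerically checking delta_z ≈ (∂z/∂x) delta_x + (∂z/∂y) delta_y
# for the arbitrary example z = x * y.

def z(x, y):
    return x * y

x0, y0 = 2.0, 3.0
dx, dy = 0.01, 0.02          # small finite changes in x and y

dz_dx = y0                   # ∂z/∂x evaluated at (x0, y0)
dz_dy = x0                   # ∂z/∂y evaluated at (x0, y0)

delta_z = z(x0 + dx, y0 + dy) - z(x0, y0)
approx = dz_dx * dx + dz_dy * dy
print(delta_z, approx)       # close but not equal: the leftover term is dx * dy
```

The two printed values agree to first order; the small discrepancy is the cross term $\delta x\,\delta y$, which vanishes faster than either change alone.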

I want to know why it can be written like this. I know that $\delta y$ represents some finite change in $y$ and $\mathrm{d}y$ also represents infinitesimal change in $y$, but how are they related is confusing me.

Any help would be appreciated!

Best Answer:

This is because a change in one variable produces a change in any other variable that depends on it.

Consider a simple example, $y = 2x$. Here,

$$\frac{dy}{dx} = 2\space(\text{the slope of the line}\space y = 2x) \longrightarrow(1)$$

The slope is the same everywhere, in particular at $x = a$ (some chosen value). Now suppose $a$ changes to $a'$. How much do $x$ and $y$ change?

So, here, $$\delta{x} = a' - a \longrightarrow(2)$$ $$\delta{y} = 2a' - 2a = 2(a' - a) \longrightarrow(3)$$ ($\delta$ here is 'change')
From $(1)$, we know, $\frac{dy}{dx} = 2$ and from $(2)$, we know $a' - a = \delta{x}$.

Substituting these into $(3)$: $$\delta{y} = \frac{dy}{dx}\delta{x}$$
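A quick numerical sketch of this (with arbitrary sample values $a = 3$, $a' = 4.5$, my own choices) confirms that for a linear function the relation holds exactly:

```python
# Sketch: for the linear function y = 2x, the change delta_y
# equals (dy/dx) * delta_x exactly, with no approximation error.

def y(x):
    return 2 * x

dy_dx = 2                    # slope of y = 2x

a, a_prime = 3.0, 4.5        # arbitrary sample values of x
delta_x = a_prime - a
delta_y = y(a_prime) - y(a)

print(delta_y)               # 3.0
print(dy_dx * delta_x)       # 3.0 -- identical, since the function is linear
```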

Thus we see what change occurs in $y$ due to a change in $x$. This relation is what is used in the theory of approximations and errors: there, $\delta{y} \approx dy$, which is why the change can be approximated this way. Note also that $$\frac{\delta{y}}{\delta{x}} \approx \frac{dy}{dx}, \quad \text{not always}\quad \frac{\delta{y}}{\delta{x}} = \frac{dy}{dx}.$$
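To see the "approximately, not always equal" point concretely, here is a sketch using the arbitrary nonlinear example $y = x^2$ (my own choice); the mismatch between $\delta{y}$ and $\frac{dy}{dx}\delta{x}$ shrinks as $\delta{x}$ shrinks:

```python
# Sketch: for a nonlinear function (here y = x^2), delta_y is only
# approximately (dy/dx) * delta_x; the error shrinks as delta_x shrinks.

def y(x):
    return x ** 2

def dy_dx(x):
    return 2 * x             # derivative of x^2

a = 1.0
for delta_x in (0.1, 0.01, 0.001):
    delta_y = y(a + delta_x) - y(a)
    approx = dy_dx(a) * delta_x
    print(delta_x, delta_y, approx, delta_y - approx)
```

For $y = x^2$ the leftover error is exactly $(\delta{x})^2$, which is why each tenfold shrink in $\delta{x}$ shrinks the error a hundredfold.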

In another scenario, suppose $y = 0 \cdot x$. Here there is no change in $y$ no matter how $x$ goes up or down, and this too is consistent with the relation above, since $\frac{dy}{dx} = 0$.

Now, let's think of this geometrically. Consider the graph of a straight line through the origin and pick a pair of points on it (it is convenient to take the first point at the origin). Taking the $x$ and $y$ coordinates of each point, form their differences: $$\delta{y} = y' - y$$ $$\delta{x} = x' - x$$ The line's slope is $$\tan\theta = \frac{y' - y}{x' - x},$$ and also $$\tan\theta = \frac{dy}{dx}$$ $$\implies \frac{dy}{dx} = \frac{y' - y}{x' - x} = \frac{\delta{y}}{\delta{x}}$$

This is not a rigorous approach, but it is how I found it easiest to explain.

In the case of functions of many variables, we take the derivative with respect to one variable while keeping the others constant (even if they would otherwise vary). This is an approach scientists commonly use in experiments: hold every condition but one fixed, and change that one. The same idea appears in the backpropagation algorithm used to train multi-layer perceptrons, where the required change in a synaptic weight is a multiple of the derivative of the error with respect to that weight (since that weight contributed to the model's output).
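As an illustrative sketch (my own toy example, not any particular library's code), a one-weight gradient-descent update shows the same pattern: the change applied to the weight is a multiple (the learning rate) of the derivative of the error with respect to that weight.

```python
# Sketch: a single-weight gradient-descent update. The weight change is
# delta_w = -lr * dE/dw, i.e. a multiple of the derivative of the error.

def error(w, x, target):
    pred = w * x
    return 0.5 * (pred - target) ** 2

def dE_dw(w, x, target):
    return (w * x - target) * x      # derivative of the squared error w.r.t. w

w, x, target, lr = 0.0, 1.0, 2.0, 0.1
for _ in range(200):
    w -= lr * dE_dw(w, x, target)    # delta_w = -lr * dE/dw

print(w)                             # converges toward the target value 2.0
```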

If you find this answer less than helpful, see: Quora - How can we say that $\delta{y} = \delta{x} \times \frac{dy}{dx}$

I took a linear polynomial (a simple function) to explain this. Hope this helps.