Partial derivative of y = wx + b


I am reading a book about neural networks that first mentions a simple function of the following form.

$$ y = wx + b $$

$y$, $w$, $x$ and $b$ are all scalar real numbers.

Then it explains how a small change in $w$ and $b$ is going to affect the output $y$ with the following relationship.

$$ \Delta y = \frac{\partial y}{\partial w} \Delta w + \frac{\partial y}{\partial b} \Delta b $$

Could you please provide a step-by-step proof of how this relationship follows from the first equation, $y = wx + b$?


There is 1 answer below

On BEST ANSWER

For a function $y$ that depends on $x$, $w$, and $b$, i.e. $y = y(x, w, b)$, the change in $y$ at fixed $x$ due to variations in $w$ and $b$ is given infinitesimally by the total differential: $$dy = \frac{\partial y}{\partial w}dw+\frac{\partial y}{\partial b}db$$
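
For the specific function in the question, the two partial derivatives can be computed directly, which makes the relationship concrete:

$$\frac{\partial y}{\partial w} = \frac{\partial}{\partial w}(wx + b) = x, \qquad \frac{\partial y}{\partial b} = \frac{\partial}{\partial b}(wx + b) = 1$$

Substituting these into the differential (and replacing the infinitesimals $dw, db$ with small finite changes $\Delta w, \Delta b$) gives

$$\Delta y = x\,\Delta w + 1\cdot\Delta b = x\,\Delta w + \Delta b$$

Because $y$ is linear in $w$ and $b$, this relation is exact for finite changes, not merely a first-order approximation: $(w+\Delta w)x + (b+\Delta b) - (wx + b) = x\,\Delta w + \Delta b$.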
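As a sanity check, here is a small numeric sketch (the values of $w$, $x$, $b$, $\Delta w$, $\Delta b$ below are arbitrary, chosen only for illustration) confirming that $\Delta y = \frac{\partial y}{\partial w}\Delta w + \frac{\partial y}{\partial b}\Delta b$ holds exactly for this linear function:

```python
# For y = w*x + b, we have ∂y/∂w = x and ∂y/∂b = 1,
# so Δy = x*Δw + Δb should hold exactly (the function is linear in w and b).

def y(w, x, b):
    return w * x + b

# Arbitrary example values (not from the book).
w, x, b = 2.0, 3.0, 1.0
dw, db = 0.05, -0.02

# Actual change in y when w and b are perturbed, x held fixed.
delta_y = y(w + dw, x, b + db) - y(w, x, b)

# Change predicted by the partial-derivative relation.
predicted = x * dw + 1.0 * db

print(delta_y, predicted)  # the two values agree exactly
```

For a nonlinear function the same comparison would show only approximate agreement, with the error shrinking as $\Delta w$ and $\Delta b$ shrink.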