Gradient descent algorithm explanation


How do I get from the derivative in the second-to-last line to the $x_j$ in the last line?

Gradient Descent Algorithm

1 Answer

$$ \frac{\partial}{\partial \theta_j} h_\theta(x) = \frac{\partial}{\partial \theta_j} [\theta_0 x_0 + \dots + \theta_j x_j + \dots + \theta_n x_n] = 0 + \dots + x_j + \dots + 0 = x_j $$ The only term in the sum that depends on $\theta_j$ is $\theta_j x_j$, whose derivative with respect to $\theta_j$ is $x_j$; every other term is constant in $\theta_j$ and differentiates to zero. The underlying assumption is that the parameters are independent of one another: $\frac{\partial \theta_i}{\partial \theta_j} = 0 \ \forall i \neq j$.
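You can sanity-check this numerically: for a linear hypothesis $h_\theta(x) = \sum_i \theta_i x_i$, a finite-difference estimate of $\partial h / \partial \theta_j$ should match $x_j$. A minimal sketch in pure Python, with made-up example values for $\theta$ and $x$:

```python
# Check numerically that d/d theta_j of h_theta(x) = sum_i theta_i * x_i
# equals x_j. The theta and x values below are arbitrary examples.
theta = [0.5, -1.2, 3.0]
x = [1.0, 2.0, -0.5]   # x[0] plays the role of the bias input x_0

def h(theta, x):
    return sum(t * xi for t, xi in zip(theta, x))

eps = 1e-6
for j in range(len(theta)):
    tp = theta.copy(); tp[j] += eps   # theta with theta_j nudged up
    tm = theta.copy(); tm[j] -= eps   # theta with theta_j nudged down
    numeric = (h(tp, x) - h(tm, x)) / (2 * eps)  # central difference
    print(f"j={j}: numeric derivative = {numeric:.6f}, x_j = {x[j]}")
```

Because $h_\theta$ is linear in each $\theta_j$, the central difference is exact up to floating-point rounding, so the two columns agree.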