I'm taking a course in Machine Learning where the Gradient Descent algorithm is used for optimization. I'm in high school and have a decent knowledge of differentiation, partial differentiation, and integration, but I have never computed the partial derivative of a summation. Is there any resource for this? What is the intuition behind computing the derivative as shown?

How do I analyze the partial derivative of the following summation?
Asked by Bumbble Comm https://math.techqa.club/user/bumbble-comm/detail
The operators $\frac{\partial}{\partial\theta_j}$ are linear, so you can distribute them over the terms of a sum. Writing $\partial_j$ for the sake of simplifying notation, you have that for a finite sum, $\partial_j\sum_i c_if_i =\sum_i c_i\partial_j f_i$, just like with ordinary derivatives.
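You can check this linearity numerically. The sketch below (data values and helper names are made up for illustration) estimates $\partial_1$ of the whole sum of squared residuals by central differences and compares it with the sum of the per-term derivatives:

```python
# Sanity check that the partial derivative of a sum equals the
# sum of the partial derivatives of its terms.

def partial_theta1(f, theta0, theta1, h=1e-6):
    """Central-difference estimate of df/d(theta1), holding theta0 fixed."""
    return (f(theta0, theta1 + h) - f(theta0, theta1 - h)) / (2 * h)

# Made-up sample data x^(i), y^(i)
xs = [1.0, 2.0, 3.0]
ys = [2.0, 2.5, 4.0]

def term(i):
    # The i-th summand: (theta0 + theta1 * x^(i) - y^(i))^2
    return lambda t0, t1: (t0 + t1 * xs[i] - ys[i]) ** 2

def total(t0, t1):
    # The full finite sum over all samples
    return sum((t0 + t1 * x - y) ** 2 for x, y in zip(xs, ys))

t0, t1 = 0.5, 0.8
lhs = partial_theta1(total, t0, t1)                           # d/d(theta1) of the sum
rhs = sum(partial_theta1(term(i), t0, t1) for i in range(3))  # sum of per-term derivatives
print(abs(lhs - rhs) < 1e-6)  # → True: the derivative distributes over the sum
```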
Then you're just using the chain rule; remember, for the operator $\partial_j$, only $\theta_j$ is regarded as a variable, and all other variables are temporarily regarded as constants. So the chain rule gives us $$\partial_0\left((\boxed{\theta_0} +\theta_1 x^{(i)}-y^{(i)})^2\right) = 2(\boxed{\theta_0} +\theta_1 x^{(i)}-y^{(i)})^1\cdot\partial_0 (\boxed{\theta_0} +\theta_1 x^{(i)}-y^{(i)}) $$ and $$\partial_1\left((\theta_0 +\boxed{\theta_1} x^{(i)}-y^{(i)})^2\right) = 2(\theta_0 +\boxed{\theta_1} x^{(i)}-y^{(i)})^1\cdot\partial_1 (\theta_0 +\boxed{\theta_1} x^{(i)}-y^{(i)})$$ The factors on the right end of these expressions simplify as $$ \partial_0 (\boxed{\theta_0} +\theta_1 x^{(i)}-y^{(i)}) =1 $$ and $$ \partial_1 (\theta_0 +\boxed{\theta_1} x^{(i)}-y^{(i)}) = x^{(i)} $$ giving you $$ \partial_0\left((\theta_0 +\theta_1 x^{(i)}-y^{(i)})^2\right) = 2(\theta_0 +\theta_1 x^{(i)}-y^{(i)}) $$ and $$ \partial_1\left((\theta_0 +\theta_1 x^{(i)}-y^{(i)})^2\right) = 2x^{(i)}(\theta_0 +\theta_1 x^{(i)}-y^{(i)}) $$ Now you should be able to put the sum back together to get the desired result.