Before implementing regularized polynomial linear regression, I wanted to verify that the vectorized formulas below are valid (the regularization term in particular).
$$J(\theta)=\frac{1}{2m}(X\theta-y)^T(X\theta-y)+\frac{\lambda}{2m}\sum_{j=1}^{n}\theta_j^2$$ $$\frac{\partial J(\theta)}{\partial \theta}=\frac{1}{m}X^T(X\theta-y) + \frac{\lambda}{m}\theta$$
Where obviously $J(\theta)\in \mathbb{R}$ and $\frac{\partial J(\theta)}{\partial \theta}\in \mathbb{R}^n$.
Yes, both formulas are correct. Since $\sum_{j=1}^{n}\theta_j^2 = \theta^T\theta$ when $\theta\in\mathbb{R}^n$, the cost and gradient can be written as
$$J(\theta) = \frac1{2m}(X\theta-y)^T(X\theta-y) + \frac{\lambda}{2m}\theta^T\theta$$
$$\nabla_\theta J(\theta) = \frac{1}{m}X^T(X\theta - y) + \frac{\lambda}{m}\theta$$
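A quick way to confirm this yourself is a finite-difference gradient check: implement the vectorized cost and gradient, then compare the analytic gradient against a numerical approximation. A minimal NumPy sketch (the data here is random, just to exercise the formulas):

```python
import numpy as np

def cost(theta, X, y, lam):
    """J(theta) = (1/2m)(X theta - y)^T (X theta - y) + (lam/2m) theta^T theta"""
    m = len(y)
    r = X @ theta - y
    return (r @ r) / (2 * m) + lam * (theta @ theta) / (2 * m)

def grad(theta, X, y, lam):
    """Analytic gradient: (1/m) X^T (X theta - y) + (lam/m) theta"""
    m = len(y)
    return X.T @ (X @ theta - y) / m + lam * theta / m

# Synthetic data purely to test the formulas
rng = np.random.default_rng(0)
m, n = 20, 5
X = rng.standard_normal((m, n))
y = rng.standard_normal(m)
theta = rng.standard_normal(n)
lam = 0.7

# Central-difference numerical gradient, one coordinate at a time
eps = 1e-6
E = np.eye(n)
numerical = np.array([
    (cost(theta + eps * E[j], X, y, lam) - cost(theta - eps * E[j], X, y, lam)) / (2 * eps)
    for j in range(n)
])

assert np.allclose(numerical, grad(theta, X, y, lam), atol=1e-6)
print("gradient check passed")
```

Note that this penalizes every component of $\theta$, matching the formulas above; if your design matrix has a leading column of ones for the intercept, the common convention is to exclude $\theta_0$ from the penalty.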