In Stanford's Machine Learning course, Andrew Ng writes the hypothesis for a line as h(x) = θ₀ + θ₁x instead of the more common slope-intercept form y = mx + c.
Thanks to a previous question, I now understand how both forms represent a line (via the y-intercept and slope), but I am unclear why Andrew Ng chose the theta form of the equation rather than the more common slope-intercept form.
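For context, here is a small Python check (my own sketch, not from the course) confirming what I mean by the two forms being equivalent: with θ₀ = c and θ₁ = m, both compute the same values.

```python
def h(theta0, theta1, x):
    # Ng's hypothesis form: h(x) = theta0 + theta1 * x
    return theta0 + theta1 * x

def line(m, c, x):
    # Slope-intercept form: y = m * x + c
    return m * x + c

# With theta0 = c (intercept) and theta1 = m (slope),
# the two forms agree at every x.
m, c = 2.0, 1.0
for x in [-1.0, 0.0, 3.5]:
    assert h(c, m, x) == line(m, c, x)
```

So the notation is clearly interchangeable; my question is only about why the theta convention is preferred.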
Can anyone help me understand this, please?