Currently I am studying logistic regression.
I read online that Gradient Descent is a 1st-order optimisation algorithm, and that Newton's Method is a 2nd-order optimisation algorithm.
Does that mean that Gradient Descent cannot be used for multivariate optimisation and that Newton's Method cannot be used for univariate optimisation? Or can Newton's Method be done using a 1st order Taylor polynomial and still be different from Gradient Descent?
These sites are causing me to question:
- Univariate Newton's Method http://fourier.eng.hmc.edu/e176/lectures/NM/node20.html
- Multivariate Gradient Descent http://fourier.eng.hmc.edu/e176/lectures/NM/node27.html




As stated in the comments, gradient descent and Newton's method are both optimization methods, regardless of whether the problem is univariate or multivariate. Gradient descent uses only the first derivative (the gradient), which often makes it converge more slowly. Newton's method additionally uses the curvature of the function (the second derivative, or the Hessian in the multivariate case), which generally leads to a solution faster when the second derivative is easy to compute; a known drawback is that in multidimensional problems Newton's method can be attracted to saddle points. As for your second question: the optimization form of Newton's method comes from a second-order Taylor expansion of the objective, so dropping the curvature term and keeping only the first-order part would essentially reduce it to a gradient step. So both methods can be used for univariate and multivariate optimization, but their performance will generally not be similar.
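To make the difference concrete, here is a minimal sketch (the objective, step size, and iteration counts are illustrative choices, not from any particular source) minimizing the same bivariate quadratic with both methods:

```python
import numpy as np

# Illustrative objective: f(x) = x1^2 + 10*x2^2, minimum at the origin.
def f(x):
    return x[0]**2 + 10 * x[1]**2

def grad(x):
    # First derivative (gradient) -- all that gradient descent needs.
    return np.array([2 * x[0], 20 * x[1]])

def hessian(x):
    # Second derivative (Hessian) -- the extra curvature information
    # that Newton's method uses.
    return np.array([[2.0, 0.0], [0.0, 20.0]])

def gradient_descent(x0, lr=0.04, steps=100):
    x = np.asarray(x0, dtype=float)
    for _ in range(steps):
        x = x - lr * grad(x)          # step along the negative gradient
    return x

def newton(x0, steps=1):
    x = np.asarray(x0, dtype=float)
    for _ in range(steps):
        # Newton step: solve H d = grad f, then move by -d.
        x = x - np.linalg.solve(hessian(x), grad(x))
    return x

x0 = [3.0, 1.5]
print(gradient_descent(x0))   # slowly approaches the minimum at [0, 0]
print(newton(x0))             # exact for a quadratic after a single step
```

On a quadratic the Hessian is constant, so Newton's method jumps to the minimum in one iteration, while gradient descent needs many small steps whose size must be tuned; on non-quadratic functions Newton is only locally quadratic-convergent and each step costs a linear solve.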