This question comes from Question 4 of the following problem set: http://cs229.stanford.edu/ps/ps1/ps1.pdf
It is a variant of this question: Prove that Newton's Method is invariant under invertible linear transformations
Newton's method can be used either to find a zero of $f$:
$$x := x- \frac{f(x)}{f'(x)},$$
or to optimise (maximise or minimise) $f$, by applying the same update to $f'$:
$$x := x- \frac{f'(x)}{f''(x)}.$$
I am trying to show that Newton's method is invariant under linear reparameterisations. I have managed to show this for the optimisation form, but I am struggling to show it explicitly for the root-finding form. (I'm sure it's acceptable to infer one from the other, but I don't want to do that.)
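For concreteness, the two scalar updates above can be sketched as follows (a minimal Python sketch; the test functions $x^2-2$ and $(x-3)^2$ are just made-up examples, not from the problem set):

```python
def newton_root(f, fp, x, iters=20):
    # Root-finding update: x <- x - f(x) / f'(x)
    for _ in range(iters):
        x = x - f(x) / fp(x)
    return x

def newton_opt(fp, fpp, x, iters=20):
    # Optimisation update: x <- x - f'(x) / f''(x)
    # (the root-finding update applied to f')
    for _ in range(iters):
        x = x - fp(x) / fpp(x)
    return x

# Zero of f(x) = x^2 - 2, starting from x = 1: converges to sqrt(2)
root = newton_root(lambda x: x**2 - 2, lambda x: 2 * x, 1.0)

# Minimiser of f(x) = (x - 3)^2, starting from x = 0: converges to 3
minimiser = newton_opt(lambda x: 2 * (x - 3), lambda x: 2.0, 0.0)
```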
Let $x^{(0)} \in \Re^n$, let $A \in \Re^{n\times n}$ be invertible, and define $g(z) = f(Az)$. We want to show that, starting from $z^{(0)} = A^{-1}x^{(0)}$, each iterate satisfies $z^{(i)} = A^{-1}x^{(i)}$.
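As a numerical sanity check of the claim $z^{(i)} = A^{-1}x^{(i)}$, here is a NumPy sketch using the second-derivative (Hessian) form of the update, which is the case I could show; the quadratic $f$ and the matrix $A$ are arbitrary made-up test data:

```python
import numpy as np

rng = np.random.default_rng(0)
n = 3

# Made-up strictly convex quadratic f(x) = 0.5 x^T Q x - b^T x
M = rng.standard_normal((n, n))
Q = M @ M.T + n * np.eye(n)        # symmetric positive definite
b = rng.standard_normal(n)
grad_f = lambda x: Q @ x - b
hess_f = lambda x: Q

# Invertible reparameterisation g(z) = f(Az)
A = rng.standard_normal((n, n)) + n * np.eye(n)
grad_g = lambda z: A.T @ grad_f(A @ z)        # chain rule
hess_g = lambda z: A.T @ hess_f(A @ z) @ A    # chain rule

x = rng.standard_normal(n)
z = np.linalg.solve(A, x)                     # z^(0) = A^{-1} x^(0)

for _ in range(5):
    x = x - np.linalg.solve(hess_f(x), grad_f(x))
    z = z - np.linalg.solve(hess_g(z), grad_g(z))
    # Each iterate satisfies z^(i) = A^{-1} x^(i)
    assert np.allclose(z, np.linalg.solve(A, x))
```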
My attempt
\begin{equation} \begin{split} x^{(i+1)} & = x^{(i)} - \bigl(\nabla g(x^{(i)})\bigr)^{-1}g(x^{(i)}) \\ & = x^{(i)} - \bigl(A^T\nabla f(Ax^{(i)})\bigr)^{-1}f(Ax^{(i)}) \\ & = x^{(i)} - \bigl(\nabla f(Ax^{(i)})\bigr)^{-1}(A^T)^{-1}f(Ax^{(i)}) \\ A^Tx^{(i+1)} & = A^Tx^{(i)} - \bigl(\nabla f(Ax^{(i)})\bigr)^{-1}f(Ax^{(i)}) \end{split} \end{equation}
This is the point where I would hope to conclude that Newton's method is invariant under linear reparameterisation, but I end up with $A^T$ where I want $A^{-1}$.
What am I doing wrong here?
I'm new to linear algebra, so I'm probably making some illegal operation.