How do I minimise a function $f:\mathbb{R}^n\!\times\mathbb{R}^n\!\rightarrow\!\mathbb{R}$ iteratively?


I have a non-linear function $f(\mathbf{x}, \mathbf{y})$ where $\mathbf{x},\mathbf{y}\!\in\!\mathbb{R}^n$. How do I minimise $f$ using an iterative approach like steepest descent, conjugate gradient, Newton's method, etc.? Do I 'stack' $\mathbf{y}$ under $\mathbf{x}$, treat the result as a single unknown vector, and then compute gradients/Hessians with respect to this single vector of $2n$ unknowns? Additionally, what is the most efficient/appropriate non-linear optimisation procedure to use, given that I know $f$ and all its derivatives analytically?
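To make the stacking idea concrete, here is a minimal sketch in Python. The function $f$ below is a hypothetical example (not from the question): $f(\mathbf{x},\mathbf{y}) = \|\mathbf{x}-\mathbf{1}\|^2 + \|\mathbf{y}-\mathbf{x}\|^2$, whose minimiser is $\mathbf{x}=\mathbf{y}=\mathbf{1}$. The two vectors are stacked into a single $2n$-vector $\mathbf{z}=[\mathbf{x};\mathbf{y}]$, the analytic gradient is assembled blockwise, and plain steepest descent with a fixed step size is run on $\mathbf{z}$:

```python
# Sketch: minimise f(x, y) by stacking z = [x; y] in R^{2n} and running
# steepest descent with an analytic gradient. The objective here is a
# hypothetical example: f(x, y) = ||x - 1||^2 + ||y - x||^2.

def f(z, n):
    x, y = z[:n], z[n:]
    return (sum((xi - 1.0) ** 2 for xi in x)
            + sum((yi - xi) ** 2 for xi, yi in zip(x, y)))

def grad_f(z, n):
    # Gradient of the stacked 2n-vector: the x-block and the y-block
    # are computed separately, then concatenated.
    x, y = z[:n], z[n:]
    gx = [2.0 * (xi - 1.0) + 2.0 * (xi - yi) for xi, yi in zip(x, y)]
    gy = [2.0 * (yi - xi) for xi, yi in zip(x, y)]
    return gx + gy

def steepest_descent(z0, n, step=0.1, iters=1000):
    z = list(z0)
    for _ in range(iters):
        g = grad_f(z, n)
        z = [zi - step * gi for zi, gi in zip(z, g)]
    return z

n = 3
z = steepest_descent([0.0] * (2 * n), n)
print(all(abs(zi - 1.0) < 1e-6 for zi in z))  # converges to x = y = 1
```

The same stacking works unchanged for Newton's method or conjugate gradient: the Hessian is then the $2n\times 2n$ block matrix of all second partials of $f$ with respect to the components of $\mathbf{z}$. Since all derivatives are known analytically here, a Newton-type method (or, in a library setting, something like SciPy's `minimize` with `method="Newton-CG"` and explicit `jac`/`hess` arguments) would typically converge in far fewer iterations than the fixed-step descent above.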