I would like to minimize a function subject to one constraint.
Suppose I have two variables, x and y, whose product converges; their product gives the first cost function, J1 = xy. Two other variables, z and t, also have a convergent product, which gives the second cost function, J2 = zt.
I have already used gradient descent to minimize the squared error between the two functions, so the combined cost function was Jt = |J1 - J2|^2.
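To make this concrete, here is a small sketch of the unconstrained update I am using: plain gradient descent on Jt = (xy - zt)^2 with hand-derived gradients (the starting point and learning rate are arbitrary choices for illustration):

```python
# Plain gradient descent on Jt = (x*y - z*t)**2.
def grad_step(x, y, z, t, lr=0.01):
    r = x * y - z * t              # residual J1 - J2
    # d(r**2)/dx = 2*r*y, d/dy = 2*r*x, d/dz = -2*r*t, d/dt = -2*r*z
    return (x - lr * 2 * r * y,
            y - lr * 2 * r * x,
            z + lr * 2 * r * t,
            t + lr * 2 * r * z)

x, y, z, t = 1.0, 2.0, 3.0, 4.0    # arbitrary starting point
for _ in range(2000):
    x, y, z, t = grad_step(x, y, z, t)
# after these iterations, x*y and z*t should be nearly equal
```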
My question is: in my cost functions I am always interested in the minimal value of the products, but I also need the variables x and z to be equal, i.e. x - z = 0. So I need to optimize my cost function Jt subject to the constraint x - z = 0.
Would barrier methods or the Lagrangian make sense here? What are some good (simple) algorithms for this kind of constrained optimization problem? I'm hoping for just a simple fix to my algorithm.
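For instance, one simple fix I was considering is a quadratic-penalty version of my update, minimizing Jp = (xy - zt)^2 + mu*(x - z)^2. This is only a sketch: the penalty weight mu and the learning rate are values I would have to tune, and a fixed mu only enforces the constraint approximately:

```python
# Quadratic-penalty gradient descent:
#   Jp = (x*y - z*t)**2 + mu * (x - z)**2
# mu = 10.0 is an arbitrary penalty weight chosen for illustration.
def penalized_step(x, y, z, t, mu=10.0, lr=0.005):
    r = x * y - z * t              # residual J1 - J2
    c = x - z                      # constraint violation
    gx = 2 * r * y + 2 * mu * c    # dJp/dx
    gy = 2 * r * x                 # dJp/dy
    gz = -2 * r * t - 2 * mu * c   # dJp/dz
    gt = -2 * r * z                # dJp/dt
    return x - lr * gx, y - lr * gy, z - lr * gz, t - lr * gt

x, y, z, t = 1.0, 2.0, 3.0, 4.0    # arbitrary starting point
for _ in range(5000):
    x, y, z, t = penalized_step(x, y, z, t)
# both |x*y - z*t| and |x - z| should now be small
```

Would something like this be reasonable, or is a proper Lagrangian/barrier formulation worth the extra machinery?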
Thank you so much.