How to implement a 2D matrix version of wolfe line search?


In a Wolfe line search, the product of the gradient $g$ and the descent direction $d$ must be calculated. In the vector case, both the gradient and the direction are vectors, so the product $g^Td$ is a scalar. The objective function value is also a scalar, so the condition $f(x+\alpha d)<f(x)+\alpha c\,g^Td$ makes sense. But in the matrix case, for example, an objective function using the Frobenius norm $$ F(X) = \frac{1}{2}\|AXB^T-C\|^2_F $$ has a matrix gradient $G(X)=A^T(AXB^T-C)B$ and a matrix descent direction $D(X)=-G(X)/\|G(X)\|_F$. When doing the line search, $G(X)^TD(X)$ is a matrix, while $F(X+\alpha D)$ and $F(X)$ are scalars. How can the Wolfe conditions be used in this situation? Are there any alternatives?
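To make the mismatch concrete, here is a minimal NumPy sketch of the quantities involved (the shapes and random data are chosen purely for illustration). The usual way to get a scalar directional derivative in the matrix case is the Frobenius inner product $\langle G, D\rangle = \operatorname{trace}(G^TD) = \sum_{ij} G_{ij}D_{ij}$, which is exactly $g^Td$ after vectorizing both matrices:

```python
import numpy as np

# Hypothetical small shapes for illustration: A (4x3), X (3x5), B (6x5), C (4x6).
rng = np.random.default_rng(0)
A = rng.standard_normal((4, 3))
B = rng.standard_normal((6, 5))
C = rng.standard_normal((4, 6))
X = rng.standard_normal((3, 5))

def F(X):
    """Objective F(X) = 0.5 * ||A X B^T - C||_F^2 (a scalar)."""
    R = A @ X @ B.T - C
    return 0.5 * np.sum(R * R)

def G(X):
    """Gradient of F with respect to X: A^T (A X B^T - C) B (a matrix)."""
    return A.T @ (A @ X @ B.T - C) @ B

g = G(X)
D = -g / np.linalg.norm(g)   # normalized steepest-descent direction (a matrix)

# Frobenius inner product <G, D> = trace(G^T D) = sum of elementwise products;
# this scalar plays the role of g^T d in the Wolfe/Armijo conditions.
gTd = np.sum(g * D)          # here this equals -||G||_F, so it is negative

# Sufficient-decrease (Armijo) test with the scalar directional derivative:
c1, alpha = 1e-4, 1.0
armijo_ok = F(X + alpha * D) <= F(X) + c1 * alpha * gTd
```

The finite-difference quotient $(F(X+tD)-F(X))/t$ approaches `gTd` as $t\to 0$, which is a quick way to check that the trace inner product really is the right scalar replacement.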