My previous question on SVD gained very little traction, so I thought I'd try a different version that hopefully has an explicit solution. As noted in the linked question, I take a function of two variables $f(x,y)$ and sample the family of curves $f(x,y_0), f(x, y_0 + \delta), f(x,y_0 + 2\delta), \ldots$ over the domain $x \in [x_0,x_1]$, $y \in [y_0,y_1]$. From these curves, stacked as the columns of a matrix, I compute the singular value decomposition. In the previous question, these singular values were exponentially separated from each other. When $f$ is a sum like
$$ f(x,y) = h(x) + g(y) $$
there seem to be (to within machine precision) only two non-zero singular values, no matter what the functions $h$ and $g$ are (I've used random 4th-order polynomials below):
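A minimal numpy sketch of this observation (the grid endpoints and the particular random polynomials are my own choices, not from the question): sample $f(x,y)=h(x)+g(y)$ on a grid, with each column holding one curve $f(\cdot, y_k)$, and inspect the singular values.

```python
import numpy as np

rng = np.random.default_rng(0)

# Random 4th-order polynomials h and g (5 coefficients each).
h = np.polynomial.Polynomial(rng.standard_normal(5))
g = np.polynomial.Polynomial(rng.standard_normal(5))

x = np.linspace(0.0, 1.0, 200)  # x in [x0, x1]
y = np.linspace(0.0, 1.0, 150)  # y in [y0, y1], spacing delta

# Column k is the curve f(x, y_k) = h(x) + g(y_k).
F = h(x)[:, None] + g(y)[None, :]

# F is a sum of two rank-1 matrices, h(x) 1^T and 1 g(y)^T,
# so at most two singular values are non-zero.
s = np.linalg.svd(F, compute_uv=False)
print(s[:4] / s[0])  # third and later entries are at machine-precision level
```

The rank-2 structure is exact here: every column differs from every other column by a constant shift, so the columns span at most the two-dimensional space spanned by $h(x)$ and the constant function.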

This means that there are four functions ($u_1, u_2, v_1,v_2$) such that
$$ f(x,y) = \lambda_1 u_1(x) v_1(y) + \lambda_2 u_2(x) v_2(y) $$
Clearly $u_1(x) = h(x)$, $v_2(y)=g(y)$, $u_2=v_1=1$ (up to normalization, with $\lambda_1 = \lambda_2 = 1$) is a solution, but it is not the one obtained by the SVD. Intuitively, the SVD seeks projections that "explain" as much of the data as possible. This sounds like a variational problem to me.
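For reference, the standard variational characterization of the leading singular pair, written in the continuous notation used here (this is the well-known Schmidt decomposition of a kernel, not something specific to this question), is

$$ \lambda_1 = \max_{\|u\| = \|v\| = 1} \int_{x_0}^{x_1} \int_{y_0}^{y_1} f(x,y)\, u(x)\, v(y)\, dx\, dy, $$

with the second pair $(u_2, v_2)$ maximizing the same functional subject to the orthogonality constraints $\langle u_2, u_1\rangle = \langle v_2, v_1\rangle = 0$. This is why the SVD solution generally differs from the "obvious" decomposition above: $h$ and the constant function are not orthogonal in general.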
**Question**
Can this specific problem be formulated as a calculus-of-variations problem? Something like
$$ \int_{x_0}^{x_1} \int_{y_0}^{y_1} F[x,y,u_1, u_2, v_1, v_2]\, dx\, dy, $$
such that the functions $u_i$, $v_i$ at the stationary points of this functional are the same as those obtained from the SVD?