I am looking to solve the following equation, where $A$ is a diagonal matrix:
$$\min_x\ (Lx - f)^T A (Lx - f)$$
which I recognize to be similar to least squares, but then with respect to a scaling $A$.
My derivation so far goes as follows:
$$ (Lx - f)^TA(Lx - f) = (Lx)^TALx - (Lx)^TAf - f^TALx + f^TAf $$ $$ = (Lx)^TALx - 2(Lx)^TAf + f^TAf $$
This I recognize as the standard quadratic form $\frac{1}{2}z^TQz - z^Tb + c$ with $z = Lx$, $Q = 2A$, and $b = 2Af$, which with respect to $Lx$ has a minimum where $ALx = Af$.
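To check this explicitly, writing $z = Lx$ and differentiating (using that $A$, being diagonal, is symmetric):

```latex
\begin{aligned}
g(z) &= z^T A z - 2 z^T A f + f^T A f, \\
\nabla g(z) &= 2Az - 2Af = 0
\quad\Longrightarrow\quad Az = Af .
\end{aligned}
```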
I am not sure, however, whether this minimum also holds for $x$. I also know that $L$ is a symmetric positive definite matrix, if that helps.
Does the solution for $Lx$ also hold for $x$?
I assume you are minimizing this expression over $x \in \mathbb{R}^n$. Since $L$ is symmetric positive definite, it is invertible, and hence this minimization problem may be reformulated as
$\displaystyle\min_y~~y^T A y$,
where $y = Lx - f$; since $L$ is invertible, minimizing over $y$ is equivalent to minimizing over $x$. There are a few cases to consider.

If $A$ has at least one negative diagonal element, then $y^T A y$ can be made arbitrarily negative, and hence a minimum value does not exist.

Now suppose $A$ has nonnegative diagonal elements, so that $y^T A y = \|\sqrt{A}\,y\|_2^2 \geq 0$ for all $y$. If $A$ is singular (i.e., it has at least one zero diagonal element), then every $y$ in the null space of $A$ attains the minimum value of zero, so there are infinitely many minimizers $x = L^{-1}(f + y)$. If, on the other hand, $A$ is nonsingular, then the unique minimizer is $y = 0$, i.e., $x = L^{-1}f$.
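A quick numerical sanity check of the nonsingular case, with a randomly generated SPD $L$ and positive diagonal $A$ (all data here is made up for illustration):

```python
import numpy as np

rng = np.random.default_rng(0)
n = 4

# Hypothetical example data: L symmetric positive definite, A diagonal positive.
M = rng.standard_normal((n, n))
L = M @ M.T + n * np.eye(n)             # SPD by construction
A = np.diag(rng.uniform(1.0, 2.0, n))   # positive diagonal weights
f = rng.standard_normal(n)

def objective(x):
    r = L @ x - f
    return r @ A @ r

# Candidate minimizer x = L^{-1} f, which makes the residual y = Lx - f zero.
x_star = np.linalg.solve(L, f)
base = objective(x_star)

# Random perturbations should never decrease the objective.
for _ in range(100):
    d = rng.standard_normal(n)
    assert objective(x_star + 0.1 * d) >= base - 1e-12

print(base)  # essentially 0, since L x* - f = 0 up to rounding
```

The same check with a zero on the diagonal of $A$ would show a flat direction: perturbing $x$ by $L^{-1}y$ for $y$ in the null space of $A$ leaves the objective unchanged.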