In Image Restoration, a true image $f$ (in vector form) can be related to degraded data $y$ through a linear model of the form
$$y = Hf + n$$
where $H$ is a 2D blurring matrix, $n$ is a noise vector, and the goal is to recover $f$ given $y$.
Because the inversion of this ill-posed problem must be stabilized, it is necessary to rely on regularization. The problem is then replaced by that of seeking an estimate $f$ that minimizes the Lagrangian:
$$\min_f ||y-Hf||^2_2 + \alpha||Cf||^2_2$$
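For completeness, this objective is quadratic in $f$, so setting its gradient to zero gives the standard closed-form minimizer (the regularized normal equations):

$$\hat{f} = (H^T H + \alpha C^T C)^{-1} H^T y$$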
where $C$ is a matrix representing a high-pass filter. I have read that there are ways to automatically determine the optimum value of the Lagrange multiplier $\alpha$, but I didn't understand any of them; I'm not a mathematics geek.
Could you explain how to choose the optimum $\alpha$? Are there any simple tutorials? What are the most powerful algorithms for choosing $\alpha$?
thanks,
If one assumes the matrix $ C $ is a derivative matrix (finite differences), then the model above is the MAP estimator in which the prior on the image derivatives is a normal distribution.
In that case one can directly connect the parameter $ \alpha $ to the ratio between the variance of the noise in the image and the variance of the derivative distribution.
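The variance-ratio rule above can be sketched in a few lines of NumPy. This is a hypothetical 1D toy (identity blur, first-difference $C$, known noise variance, derivative variance estimated from the data itself), not a full image-restoration pipeline:

```python
import numpy as np

rng = np.random.default_rng(0)

n = 64
f = np.cumsum(rng.normal(size=n))        # smooth-ish "true" signal (toy data)
H = np.eye(n)                            # identity blur, for simplicity
sigma_n = 0.5                            # assumed known noise std
y = H @ f + rng.normal(scale=sigma_n, size=n)

# First-difference matrix C, acting as a high-pass filter.
C = (np.eye(n) - np.eye(n, k=1))[:-1]    # shape (n-1, n)

# MAP view: alpha = sigma_n^2 / sigma_d^2, where sigma_d^2 is the variance
# of the derivative prior; here it is crudely estimated from y itself.
sigma_d2 = np.var(C @ y)
alpha = sigma_n**2 / sigma_d2

# Closed-form Tikhonov solution: (H^T H + alpha C^T C) f = H^T y
f_hat = np.linalg.solve(H.T @ H + alpha * C.T @ C, H.T @ y)

print("alpha =", alpha)
```

In practice $\sigma_n^2$ would itself be estimated (e.g. from a flat image region), and for large images one would solve the linear system in the Fourier domain rather than forming dense matrices.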