How do these definitions of regularization match?


I know regularization from the following point of view:

You add a penalty term $R(f)$ to a loss function, e.g.:

$\min_{f} \sum_i V(f(x_i), y_i) + \lambda R(f)$, where $\lambda$ is a parameter.

I recently read about another way of thinking of regularization: to solve $Tx = y$, you replace the inverse of the operator $T$ by a family $C_h$ of approximate inverses of $T$ such that $C_h T \rightarrow I$ as $h \rightarrow 0$, in a suitable sense, where $I$ is the identity.

Both definitions make sense to me, but I can't see how one could prove that they are equivalent. I think they must mean the same thing. Why?
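For concreteness, here is a minimal numerical sketch (my own illustration, not a proof) of how the two views line up in the special case of Tikhonov regularization: taking $V$ to be squared error and $R(x) = \|x\|^2$ for a linear problem $Tx = y$, the penalized problem $\min_x \|Tx - y\|^2 + \lambda \|x\|^2$ has the closed-form solution $x = (T^\top T + \lambda I)^{-1} T^\top y$, so $C_\lambda = (T^\top T + \lambda I)^{-1} T^\top$ is exactly a family of approximate inverses with $C_\lambda T \to I$ as $\lambda \to 0$ (for invertible $T$). The matrix `T` below is an arbitrary well-conditioned example of my choosing:

```python
import numpy as np

# Penalized view: min_x ||T x - y||^2 + lam * ||x||^2
# has solution x = (T^T T + lam I)^{-1} T^T y.
# Operator view: C_lam = (T^T T + lam I)^{-1} T^T is a family of
# approximate inverses; check numerically that C_lam T -> I as lam -> 0.

rng = np.random.default_rng(0)
n = 5
T = rng.standard_normal((n, n)) + 5 * np.eye(n)  # a well-conditioned example
I = np.eye(n)

for lam in [1.0, 1e-2, 1e-4, 1e-6]:
    # C_lam computed via a linear solve instead of an explicit inverse
    C = np.linalg.solve(T.T @ T + lam * I, T.T)
    err = np.linalg.norm(C @ T - I)
    print(f"lam={lam:g}  ||C_lam T - I|| = {err:.2e}")
```

The printed error shrinks with $\lambda$, matching $C_\lambda T - I = -\lambda (T^\top T + \lambda I)^{-1} \to 0$; for an ill-conditioned or non-invertible $T$, one instead keeps $\lambda > 0$ and the convergence holds on the range of $T^\top$.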