Say we've got $A\in\mathbb{R}^{N\times N}$, and $A=U\textrm{diag}(\sigma_1,\sigma_2,\ldots,\sigma_N) V^\top$ is its singular value decomposition. For a function $f\colon\mathbb{R}_{+}\to\mathbb{R}_{+}$, I am curious if there is a method for efficiently evaluating
$$\hat{A}=U\textrm{diag}(f(\sigma_1),\sigma_2,\ldots,\sigma_N)V^\top,$$ without having to compute the full SVD of $A$ (which can be quite expensive).
My motivation for thinking this might be possible is that, for eigenvalues, one can compute the largest eigenvalue-eigenvector pair much faster (numerically speaking) than performing a full eigendecomposition. Despite this, I suspect the answer is still no?
I have not tried implementing this, but it is at least in principle possible to find the top singular vectors (left and right) by the power method (search for "SVD power method" and the like), and the next few as well -- see here to start and follow the links.
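To make the power-method idea concrete, here is a minimal NumPy sketch (the function name, iteration count, and random initialization are my own choices, not from any particular reference). It runs power iteration on $A^\top A$, which converges to the top right singular vector when $\sigma_1 > \sigma_2$:

```python
import numpy as np

def top_singular_triple(A, iters=1000, seed=0):
    """Estimate the top singular triple (u1, sigma1, v1) of A by
    power iteration on A^T A (assumes a gap sigma1 > sigma2)."""
    rng = np.random.default_rng(seed)
    v = rng.standard_normal(A.shape[1])
    v /= np.linalg.norm(v)
    for _ in range(iters):
        w = A.T @ (A @ v)        # one power-iteration step on A^T A
        v = w / np.linalg.norm(w)
    Av = A @ v
    sigma1 = np.linalg.norm(Av)  # ||A v1|| = sigma1
    u1 = Av / sigma1             # corresponding left singular vector
    return u1, sigma1, v
```

Each iteration costs two matrix-vector products, so for large $A$ this is far cheaper than a full SVD; convergence slows as the gap $\sigma_2/\sigma_1$ approaches 1.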
Of course, once you know the top singular value and the corresponding pair of singular vectors, this solves the problem in your question.
Indeed, $A=UDV^\top$ means precisely that $A=\sum_i \sigma_i u_iv_i^\top$ ($A$ takes the unit vector $v_i$ to $\sigma_i u_i$). To replace $\sigma_1$ by $f(\sigma_1)$, just add $(f(\sigma_1)-\sigma_1)u_1v_1^\top$ to $A$.
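Here is a short NumPy check of that rank-one update. I use a full SVD only to obtain and verify the singular values; in practice you would get $u_1,\sigma_1,v_1$ from a power method. Note also that if $f(\sigma_1)<\sigma_2$, the modified value is no longer the largest singular value of $\hat A$, though the decomposition is still correct:

```python
import numpy as np

rng = np.random.default_rng(0)
A = rng.standard_normal((50, 50))
f = lambda s: 2.0 * s  # an example f : R+ -> R+

# Top singular triple (from a full SVD purely for verification here;
# a power method would give the same u1, sigma1, v1 more cheaply).
U, S, Vt = np.linalg.svd(A)
u1, s1, v1 = U[:, 0], S[0], Vt[0]

# Rank-one update: replaces sigma1 with f(sigma1), leaves the rest untouched.
A_hat = A + (f(s1) - s1) * np.outer(u1, v1)

# The singular values of A_hat are (f(sigma1), sigma2, ..., sigmaN).
S_hat = np.linalg.svd(A_hat, compute_uv=False)
assert np.allclose(S_hat[0], f(s1))
assert np.allclose(S_hat[1:], S[1:])
```

The update costs only the $O(N^2)$ outer product, so the dominant cost is estimating the top singular triple itself.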