If one were to attempt to factorize a number like RSA-2048, or in general any number with $n$ decimal digits, using the best available algorithm on a modern desktop PC, approximately how long would it take, as a function of $n$? I'd like a general formula (possibly parameterized by CPU speed and/or the number of CPUs) so I can apply it to other numbers and PCs.
Thanks
The general number field sieve (GNFS) is the most efficient known factoring algorithm for large integers. Its heuristic running time is $$\exp\left((c + o(1)) \sqrt[3]{(\ln N)(\ln \ln N)^2}\right)$$ where $c = (64/9)^{1/3} \approx 1.923$. However, because of the $o(1)$ term and large implementation-dependent constants, I don't think it is feasible or meaningful to derive a formula for the absolute computation time.
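What you *can* do is use the asymptotic formula to extrapolate relative cost from a known data point. A minimal sketch, assuming the $o(1)$ term can be ignored and calibrating against the published RSA-768 factorization (232 decimal digits, roughly 2000 single-core 2.2 GHz CPU-years, an approximation); the resulting absolute numbers should be treated as order-of-magnitude guesses at best:

```python
import math

C = (64 / 9) ** (1 / 3)  # GNFS constant, ~1.923

def gnfs_work(n_digits: float) -> float:
    """Heuristic GNFS cost exp(c * (ln N)^(1/3) * (ln ln N)^(2/3))
    for an n-digit number N. The o(1) term is ignored, so only
    *ratios* of this value are meaningful, not absolute values."""
    ln_n = n_digits * math.log(10)  # ln N for an n-decimal-digit N
    return math.exp(C * ln_n ** (1 / 3) * math.log(ln_n) ** (2 / 3))

def estimate_cpu_years(n_digits: float,
                       ref_digits: float = 232.0,
                       ref_cpu_years: float = 2000.0) -> float:
    """Scale from a calibration point (default: approximate RSA-768
    effort, ~2000 CPU-years on a single 2.2 GHz core) to n digits.
    Divide by your core count for wall-clock years, ignoring
    parallelization overhead and memory limits."""
    return ref_cpu_years * gnfs_work(n_digits) / gnfs_work(ref_digits)

# Example: RSA-2048 has 617 decimal digits.
print(f"RSA-2048 estimate: {estimate_cpu_years(617):.3e} CPU-years")
```

Even this extrapolation understates the difficulty: the linear-algebra step of GNFS needs enormous memory, so a desktop PC could not run it at these sizes regardless of how long you waited.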