...where a "distinct algorithm" is loosely defined as an algorithm that returns a value distinct from all the others so far.
I would think not, because you can always construct some pathological language that requires ever-longer arbitrary input strings to encode each successive useful command, and treats everything else as a NOOP/ignores it.
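To make the construction concrete, here's a toy sketch of the kind of pathological language I have in mind (hypothetical, just for illustration): the k-th built-in command is encoded as k repetitions of `x` followed by `!`, and any other input is ignored.

```python
def run(program: str):
    """Toy pathological interpreter: the k-th command is 'x' * k + '!'."""
    commands = ["emit 0", "emit 1", "emit 2"]  # imagine infinitely many of these
    if program.endswith("!") and set(program[:-1]) <= {"x"}:
        k = len(program) - 1  # number of leading 'x' padding characters
        if k < len(commands):
            return commands[k]
    return None  # NOOP: everything else is ignored

# Encoding the k-th command requires a program of length k + 1, so
# program length grows without bound as we reach for "later" commands.
```

So the shortest program producing the k-th distinct output has length k + 1, which grows without bound, which is what makes the language feel like a counterexample.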
On the other hand, I know there's that well-known property of Kolmogorov complexity which gives a fixed constant, depending only on the two languages involved, as an upper bound on how much longer the shortest program for the same output can be in one language than in the other.
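For reference, the property I'm thinking of is (if I've got it right) the invariance theorem:

```latex
% Invariance theorem, as I understand it: for any two universal
% description languages A and B there is a constant c_{A,B},
% independent of the string x, such that
K_A(x) \le K_B(x) + c_{A,B} \quad \text{for all } x
% where c_{A,B} is roughly the length of an A-program that
% interprets language B.
```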
These two observations seem to be in conflict, so I know I'm overlooking something here. What is it?