"Encode" a single digit in a multi-digit number in the smallest way possible


This may have more to do with computing than with mathematics, but it has been giving me a headache for some time and I see no other recourse than to ask...

Given a positive integer T > 9 (in practice between 1,000 and 1,000,000,000) and a choice N in {0,1,2,3,4}, provide two algorithms (encoding and decoding):

  • One which takes T and N as inputs and outputs R, a number with no more digits than T. [encoding]

  • The other which takes an encoded number R and outputs the original T and N. [decoding]

The algorithms should be efficient enough to run on modern computers. Apart from that, everything and anything is allowed :).

As a side note, do advise me if this is impossible.

Best answer:

This is impossible, for the same reason a compression algorithm that reduces the size of every file is impossible. Suppose $T$ has $k$ decimal digits. There are $9\times10^{k-1}$ possible values of $T$ and $5$ possible values of $N$, making $45\times10^{k-1}$ possible $(T,N)$ pairs. But since $R$ is allowed at most $k$ digits, there are only $10^k = 10\times10^{k-1}$ possible values of $R$ to encode them in. By the pigeonhole principle, some value of $R$ would have to stand for more than one $(T,N)$ pair, so no matter what you do, you won't be able to decode $R$ back into $T$ and $N$.
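The counting argument above can be checked numerically for a small digit count (here $k=3$; the variable names are just for illustration):

```python
# Counting argument for k = 3: there are more (T, N) pairs than
# available codes R with at most k digits, so no injective
# encoding (and hence no decoder) can exist.

k = 3
num_T = 9 * 10 ** (k - 1)     # k-digit values of T: 100..999 -> 900
num_pairs = 5 * num_T         # 5 choices of N -> 4500 (T, N) pairs
num_codes = 10 ** k           # R with at most k digits: 0..999 -> 1000

print(num_pairs, num_codes)   # 4500 1000
print(num_pairs > num_codes)  # True: pigeonhole forces collisions
```

The ratio is the same for every $k$: the pairs outnumber the codes $4.5$ to $1$, so the gap never closes as $T$ grows.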