Consider a general-purpose lossless data compression algorithm. It compresses a randomly generated 100 MB binary file (by "random" I mean I wrote a small script that writes random binary data) with entropy $E$, measured in bits per symbol (BPS), into another file in which each symbol is represented with $X$ bits, i.e. BPS $= X$. The measured gap is $X - E \approx 0.4$.
Now the question: is this a good algorithm, a bad one, or somewhere in between?
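To make the measurement concrete, here is a minimal sketch of how $E$ and $X$ could be computed, assuming the file is read as 8-bit byte symbols; the file name is hypothetical, and `zlib` merely stands in for the compressor under test:

```python
import math
import zlib
from collections import Counter

def entropy_bps(data: bytes) -> float:
    """Empirical (zeroth-order) entropy E, in bits per symbol."""
    n = len(data)
    return -sum(c / n * math.log2(c / n) for c in Counter(data).values())

def achieved_bps(original: bytes, compressed: bytes) -> float:
    """Achieved code length X, in bits per original symbol."""
    return 8 * len(compressed) / len(original)

data = open("random.bin", "rb").read()  # hypothetical path to the 100 MB file
E = entropy_bps(data)
X = achieved_bps(data, zlib.compress(data, 9))  # zlib as a stand-in compressor
print(f"E = {E:.4f} BPS, X = {X:.4f} BPS, X - E = {X - E:.4f}")
```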
I'm not sure the question is meaningful without more details about the source of the data. There are algorithms, such as LZ77, that approach the entropy in the limit as the input length goes to infinity. That means that, for a very large file, LZ77 can approach $X - E = 0$.
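As a rough illustration (not the asker's exact setup), here is a sketch using zlib's deflate, an LZ77-family coder, on samples from an arbitrary four-symbol source; the fixed per-stream overhead shrinks relative to $n$ as the input grows:

```python
import math
import random
import zlib

probs = [0.5, 0.25, 0.125, 0.125]          # arbitrary source, E = 1.75 BPS
E = -sum(p * math.log2(p) for p in probs)

random.seed(0)
for n in (10_000, 100_000, 1_000_000):
    data = bytes(random.choices(range(4), weights=probs, k=n))
    X = 8 * len(zlib.compress(data, 9)) / n  # achieved bits per symbol
    print(f"n = {n:>9}: X - E = {X - E:+.4f}")
```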
For smaller file sizes it really depends on the probability distribution. For example, if all symbol probabilities are powers of 2, then a Huffman code also achieves $X - E = 0$.
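A quick check of that dyadic case, using a toy implementation of the standard Huffman merge procedure (mine, not from any particular library):

```python
import heapq
import math

def huffman_lengths(probs):
    """Code lengths from the standard Huffman merge procedure:
    repeatedly merge the two lowest-probability groups, adding one
    bit to the length of every symbol inside them."""
    heap = [(p, [i]) for i, p in enumerate(probs)]
    lengths = [0] * len(probs)
    heapq.heapify(heap)
    while len(heap) > 1:
        p1, s1 = heapq.heappop(heap)
        p2, s2 = heapq.heappop(heap)
        for sym in s1 + s2:
            lengths[sym] += 1
        heapq.heappush(heap, (p1 + p2, s1 + s2))
    return lengths

probs = [0.5, 0.25, 0.125, 0.125]  # all powers of 2
L = sum(p * l for p, l in zip(probs, huffman_lengths(probs)))
H = -sum(p * math.log2(p) for p in probs)
print(L, H)  # both 1.75, so X - E = 0 for this distribution
```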