All you data scientists will probably know the entropy equation:
$$H(p)=-\sum_{i=1}^{n}{{p}_{i}}\cdot\log_{2}{{p}_{i}}$$
Using this, I was messing around with some compression and calculated the entropy for the set of probabilities $\{0.3, 0.2, 0.2, 0.1\}$, which came out as about $2.246$.
This doesn't make sense to me, because if entropy $\propto$ 1/compressibility, then I've done the impossible: four symbols can always be stored in $2$ bits each, yet the entropy comes out above $2$ bits per symbol.
I'm not sure how else to interpret this value. Is it bits per some arbitrary unit? Am I simply wrong?
Your calculation is wrong. For these four probabilities the entropy is approximately
$$1.7820.$$
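As a quick check, here is a minimal Python sketch of the same computation, using the probabilities from the question:

```python
from math import log2

p = [0.3, 0.2, 0.2, 0.1]

# H(p) = -sum_i p_i * log2(p_i)
H = -sum(q * log2(q) for q in p)
print(H)  # ≈ 1.78205
```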
The correct interpretation of the entropy is as follows.
Assume that you have to code an infinite sequence of independent, identically distributed random variables, each taking the different values (symbols) with the given probabilities. Group the symbols emitted by the source into disjoint blocks of length $N$ and encode them with some faithful (lossless) coding method. Let the random variable $X_{N,n}$ be the code length of the $n$-th block of length $N$. Then, for any such method, the average code length per symbol is at least the entropy:
$$\frac{1}{N}E[X_{N,n}]=\frac{1}{N}E[X_{N,1}]\ge H.$$
(As $N\rightarrow\infty$, the best achievable average code length per symbol tends to the entropy; see the sketch below.)
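To see this block-coding bound in action, here is a minimal Python sketch (not part of the answer above) that Huffman-codes blocks of length $N$ from an i.i.d. source; the four-symbol distribution $\{0.4, 0.3, 0.2, 0.1\}$ is chosen purely for illustration. The average code length per symbol never drops below $H$ and approaches it as $N$ grows.

```python
import heapq
import itertools
from math import log2, prod

def huffman_lengths(weights):
    """Return Huffman codeword lengths for the given block probabilities."""
    # Heap entries: (weight, tie-breaker, indices of blocks in this subtree).
    heap = [(w, i, [i]) for i, w in enumerate(weights)]
    heapq.heapify(heap)
    lengths = [0] * len(weights)
    tie = len(weights)
    while len(heap) > 1:
        w1, _, s1 = heapq.heappop(heap)
        w2, _, s2 = heapq.heappop(heap)
        for i in s1 + s2:  # every block under the merge gets one bit deeper
            lengths[i] += 1
        heapq.heappush(heap, (w1 + w2, tie, s1 + s2))
        tie += 1
    return lengths

p = [0.4, 0.3, 0.2, 0.1]          # illustrative source distribution
H = -sum(q * log2(q) for q in p)  # its entropy in bits per symbol

for N in (1, 2, 3):
    blocks = list(itertools.product(p, repeat=N))  # all length-N blocks
    block_probs = [prod(b) for b in blocks]        # i.i.d. block probabilities
    lens = huffman_lengths(block_probs)
    avg = sum(q * l for q, l in zip(block_probs, lens)) / N
    print(f"N={N}: {avg:.4f} bits/symbol  (entropy = {H:.4f})")
```

For this source, $N=1$ gives $1.9$ bits per symbol against an entropy of about $1.846$; larger blocks close the gap, which is exactly the sense in which the entropy is a per-symbol lower bound on lossless compression.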