Can there be an algorithm such that, given plaintext data $P$ and $Q$ and a compression function $e$, if we treat $P$ and $Q$ as numbers (series of bits), then $$\begin{eqnarray*}e(P + Q)& =& e(P) + e(Q)\\ e(P*Q) &=& e(P)*e(Q)\end{eqnarray*}\hspace{10pt}?$$
The idea is closely related to homomorphic encryption, but instead of information security, the context is compressing data while preserving malleability. Is there a theoretical limit to how much data can be compressed while maintaining the homomorphic property?
To clarify, let $P$ and $Q$ be binary strings (i.e., numbers expressed in binary), and require $$\log_2(e(u)) \le \log_2(u)$$ for any number $u$, so that $e$ never expands its input.
This works with lossy compression. Let $e_n\colon\mathbb{Z} \to \mathbb{Z}_{2^n}$ be reduction modulo $2^n$ (i.e., keep only the low $n$ bits). Then $$e_n(P +_\mathbb{Z}Q) = e_n(P) +_{\mathbb{Z}_{2^n}} e_n(Q)$$ $$e_n(P \cdot_\mathbb{Z}Q) = e_n(P) \cdot_{\mathbb{Z}_{2^n}} e_n(Q)$$
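As a quick sanity check, here is a small Python sketch of this lossy scheme (the function name `e` and the test values are my own choices): reduction mod $2^n$ is a ring homomorphism, so addition and multiplication in $\mathbb{Z}_{2^n}$ agree with compressing the exact sum and product.

```python
def e(x, n):
    """Lossy 'compression': keep only the low n bits of x (reduce mod 2**n)."""
    return x % (2 ** n)

n = 8                      # compress everything down to 8 bits
P, Q = 1000003, 424242     # arbitrary sample inputs

# Homomorphic property: compressing the true sum/product gives the same
# result as combining the compressed values inside Z_{2^n}.
assert e(P + Q, n) == e(e(P, n) + e(Q, n), n)
assert e(P * Q, n) == e(e(P, n) * e(Q, n), n)
print("both identities hold")
```

Note that the addition on the compressed side is itself taken mod $2^n$, since the operations live in $\mathbb{Z}_{2^n}$, not $\mathbb{Z}$.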