Suppose that $P$ is a random variable that takes values in the space of computer programs. So, basically, $P$ is some code (e.g. C, Python, or a theoretical one such as the program written on the tape of a Turing machine).
Then, suppose that I reveal to you that the entropy of that code is $\mathrm{H}(P)$.
My question is: is it possible for the output of program $P$ to have an entropy greater than $\mathrm{H}(P)$?
Letting $O=g(P,I)$, where $O$ is the output, $P$ is the program, $I$ is the input, and $g$ is a deterministic function, we have (since applying a deterministic function cannot increase entropy)
$$H(O) \le H(P,I) = H(P) +H(I|P)$$
Hence the entropy of the output can exceed $H(P)$ only if the input itself carries some entropy, i.e., only if $H(I|P) > 0$.
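The inequality $H(O) \le H(P,I)$ can be checked numerically on a toy example. The two programs below (`square` and `mod2`) and the uniform distributions over programs and inputs are my own hypothetical choices, just to make the computation concrete:

```python
import math
from collections import Counter
from itertools import product

def entropy(dist):
    """Shannon entropy in bits of a dict mapping outcome -> probability."""
    return -sum(p * math.log2(p) for p in dist.values() if p > 0)

# Hypothetical setup: P uniform over two programs, I uniform over {0,1,2,3},
# P and I independent, and O = g(P, I) computed deterministically.
programs = {"square": lambda x: x * x, "mod2": lambda x: x % 2}
inputs = [0, 1, 2, 3]

joint = Counter()   # distribution of (P, I)
out = Counter()     # distribution of O
for name, i in product(programs, inputs):
    p = 1 / (len(programs) * len(inputs))
    joint[(name, i)] += p
    out[programs[name](i)] += p

H_PI = entropy(joint)  # H(P, I) = 3 bits here
H_O = entropy(out)     # H(O) is strictly smaller, since g merges outcomes
```

Because `square` and `mod2` map several $(P,I)$ pairs to the same output, $H(O) \approx 1.81$ bits, comfortably below $H(P,I) = 3$ bits.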
If the input to each program is deterministic, then clearly not.
If the inputs are random (i.e., they have their own entropy), then trivially yes: take a single fixed program (so $H(P)=0$) that computes the identity function; then $H(O)=H(I)>0=H(P)$.