I am a novice in information theory, so this is mostly a request for pointers to ideas/references that would help me think further about the following thought. I want to make concrete the idea that a function of two variables typically contains more information than a function of one variable.
I'll try to turn that into a statement: let $f$ be a function of one variable and let $\Sigma$ be the minimal sigma-algebra with respect to which $f$ is measurable. Likewise, let $g$ be a function of two variables that is measurable with respect to the product sigma-algebra $\Sigma \otimes \Sigma$, and consider the minimal sigma-algebra generated by $g$. Can we say that, in some sense, a "typical" $g$ has higher entropy than a "typical" $f$?
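To make the question concrete for myself, here is a toy finite sketch (my own illustration, with all names and parameters chosen arbitrarily): on a finite set with the uniform measure, the entropy of the sigma-algebra generated by a function is just the Shannon entropy of the partition it induces. Sampling random functions of one variable versus two variables and averaging their partition entropies gives one crude reading of "typical":

```python
# Toy sketch: compare the average Shannon entropy of the partition
# induced by a random function of one variable, f: S -> V, against a
# random function of two variables, g: S x S -> V, with S uniform.
# All parameters (n, k, trials) are arbitrary choices for illustration.
import math
import random
from collections import Counter

def partition_entropy(values):
    """Entropy (in bits) of the partition a function induces on its
    uniformly weighted domain, i.e. the entropy of the pushforward."""
    counts = Counter(values)
    total = len(values)
    return -sum((c / total) * math.log2(c / total) for c in counts.values())

random.seed(0)
n, k, trials = 8, 4, 2000

# Average entropy over random functions f: {0..n-1} -> {0..k-1}
avg_f = sum(
    partition_entropy([random.randrange(k) for _ in range(n)])
    for _ in range(trials)
) / trials

# Average entropy over random functions g: {0..n-1}^2 -> {0..k-1},
# i.e. functions on the n*n-point product domain
avg_g = sum(
    partition_entropy([random.randrange(k) for _ in range(n * n)])
    for _ in range(trials)
) / trials

print(f"avg H(f) = {avg_f:.3f} bits, avg H(g) = {avg_g:.3f} bits")
```

In this crude model the average entropy for $g$ comes out higher, simply because the pushforward of the uniform measure on the larger product domain concentrates closer to the uniform distribution on the value set. I don't know whether this is the "right" notion of typical, which is part of what I'm asking.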