Let $(X,\mathscr{B},\mu,T)$ be a measure-preserving system and let $\xi$ be a partition of $X$ with finite entropy. Then the entropy of $T$ with respect to $\xi$ is $$h_\mu(T,\xi)=\lim_{n\to \infty}\frac{1}{n}H_\mu\Big(\bigvee_{i=0}^{n-1}T^{-i}\xi\Big)=\inf_{n\geq 1}\frac{1}{n}H_\mu\Big(\bigvee_{i=0}^{n-1}T^{-i}\xi\Big),$$ where the limit equals the infimum because the sequence $H_\mu(\bigvee_{i=0}^{n-1}T^{-i}\xi)$ is subadditive. The entropy of $T$ is $$h_\mu(T)=\sup_{\xi:H_\mu(\xi)<\infty}h_\mu(T,\xi).$$
Let $h$ be a nonnegative number. I wonder if there always exists a measure-preserving transformation with $h$ as its entropy.
For each $h>0$ there is a Bernoulli shift $B(h)$ with entropy $h$; for $h=0$ the identity transformation works. (Details and further information are in the Wikipedia article. The only real mathematical work is finding a $k$ such that $h<\log k$ and then finding probabilities $p_1,\ldots,p_k$ such that $h=-\sum p_i\log p_i$.)
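That such probabilities exist follows from the intermediate value theorem: in the one-parameter family $(\theta,\frac{1-\theta}{k-1},\ldots,\frac{1-\theta}{k-1})$, the entropy decreases continuously from $\log k$ (at $\theta=1/k$) to $0$ (at $\theta=1$), so bisection finds the right $\theta$. Here is a minimal Python sketch of that search (function names are my own, for illustration only):

```python
import math

def shannon_entropy(ps):
    # Shannon entropy with natural logarithm, ignoring zero probabilities
    return -sum(p * math.log(p) for p in ps if p > 0)

def bernoulli_probabilities(h, tol=1e-12):
    """Find probabilities p_1,...,p_k with -sum p_i log p_i == h (natural log)."""
    if h == 0:
        return [1.0]               # point mass: entropy 0
    k = 2
    while math.log(k) < h:         # ensure h <= log k
        k += 1
    if math.log(k) == h:           # uniform distribution hits h exactly
        return [1.0 / k] * k
    # Family: p_1 = theta, the remaining k-1 symbols share 1 - theta equally.
    # Entropy falls continuously from log k (theta = 1/k) to 0 (theta = 1).
    def H(theta):
        rest = (1 - theta) / (k - 1)
        return shannon_entropy([theta] + [rest] * (k - 1))
    lo, hi = 1.0 / k, 1.0
    while hi - lo > tol:
        mid = (lo + hi) / 2
        if H(mid) > h:             # entropy still too big: move theta up
            lo = mid
        else:
            hi = mid
    theta = (lo + hi) / 2
    return [theta] + [(1 - theta) / (k - 1)] * (k - 1)
```

For example, `bernoulli_probabilities(1.0)` returns three probabilities whose entropy is $1$ nat, since $\log 2 < 1 < \log 3$.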
(The construction is basically to let the coordinates be independent, identically distributed copies of a random variable which takes the value $i$ with probability $p_i$, with $T$ the shift. When $\xi$ is the partition induced by a single coordinate, $H_\mu(\bigvee_{i=0}^{n-1} T^{-i}\xi)$ is just the entropy of an $n$-tuple with independent coordinates. Because the joint distribution of independent variables is the product of the marginals, and logarithms turn products into sums, each term $H_\mu(\bigvee_{i=0}^{n-1} T^{-i}\xi)$ evaluates to $nh$, so for that $\xi$, $h_\mu(T,\xi)=h$. The coordinate partition generates, so by the Kolmogorov–Sinai theorem $h_\mu(T)=h$. Billingsley's book Ergodic Theory and Information has the details.)
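The additivity step is easy to check numerically: for independent coordinates, a length-$n$ word has probability equal to the product of its coordinate probabilities, so the joint entropy is exactly $n$ times the one-coordinate entropy. A small sketch (the distribution `p` is just an illustrative choice):

```python
import math
from itertools import product

def shannon_entropy(ps):
    # Shannon entropy with natural logarithm, ignoring zero probabilities
    return -sum(p * math.log(p) for p in ps if p > 0)

p = [0.5, 0.3, 0.2]   # single-coordinate distribution (arbitrary example)
h = shannon_entropy(p)

n = 4
# Atoms of the join \bigvee_{i=0}^{n-1} T^{-i} xi are length-n words;
# by independence, each word's probability is the product of its letters'.
joint = [math.prod(word) for word in product(p, repeat=n)]
joint_H = shannon_entropy(joint)
# Additivity under independence: joint_H equals n * h
```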
You can fix a single measure space, say $S=([0,1],\mathcal B, \lambda)$ with Lebesgue measure, and for any given $h>0$ find a measure-theoretic isomorphism between $B(h)$ and $S$; conjugating the shift on $B(h)$ by this isomorphism yields a $T$ such that $([0,1],\mathcal B, \lambda,T)$ does what you want.
I don't quite understand what your comment "I hope this proposition is true for a general measure space" means, but maybe this is good enough for you.
I am also puzzled by how you came across this definition of entropy without also coming across examples of ergodic processes such as the Bernoulli shift, which are to information theory and ergodic theory what triangles are to Euclidean geometry. Most textbooks discuss the Kolmogorov–Sinai definition and then the Ornstein isomorphism theorem, by which point the answer to your question should be obvious.