For the following experiment:
A random number $X$ is chosen uniformly from $[0, 1]$. Then random numbers $Y_1, Y_2, \dots$ are chosen independently and uniformly from $[0, 1]$, one per turn. The game ends the first time that $Y_i > X$.
If I define $Z$ as a discrete random variable taking values in $\{1, 2, 3, \dots\}$ that represents the number of turns until the game ends (including the final turn), then $P(Z=z\mid X=x)=(1-x)x^{z-1}$, since the first $z-1$ draws must be at most $x$ and the $z$-th must exceed it.
Is the marginal probability mass function for $Z$ then $p_Z(z)=\int_{0}^{1}P(Z=z\mid X=x)\,f_X(x)\,dx=\int_{0}^{1}(1-x)x^{z-1}\,dx$?
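If that setup is right, the integral evaluates in closed form (just to have it on record):
$$p_Z(z)=\int_{0}^{1}\left(x^{z-1}-x^{z}\right)dx=\frac{1}{z}-\frac{1}{z+1}=\frac{1}{z(z+1)},\qquad z=1,2,3,\dots$$
which telescopes to $1$ when summed over $z$, as a pmf should.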
Is $E(Z\mid X=x) = \frac{1}{1-x}$, and hence $E(Z) = \int_{0}^{1}E(Z\mid X=x)\,f_X(x)\,dx=\int_{0}^{1}\frac{1}{1-x}\,dx$, which diverges?
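My reasoning for the conditional expectation: given $X=x$, each turn independently ends the game with probability $1-x$, so $Z$ conditioned on $X=x$ is geometric and
$$E(Z\mid X=x)=\sum_{z=1}^{\infty}z\,(1-x)\,x^{z-1}=\frac{1}{1-x}\qquad\text{for }0\le x<1.$$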
How is $f_{X,Z}(x,z)$ defined, given that the joint distribution of $(X, Z)$ is neither purely continuous nor purely discrete?
The joint distribution/mass function of $X$ and $Z$ is given by
$$ P\left(X\le x,\ Z=z\right)=\begin{cases} 0 & \text{for } x<0,\\[4pt] \displaystyle\int_{0}^{x}\left(1-y\right)y^{z-1}\,dy=\dfrac{x^z}{z}-\dfrac{x^{z+1}}{z+1} & \text{for } 0\le x<1,\\[4pt] \dfrac{1}{z(z+1)} & \text{for } 1\le x, \end{cases} $$
and the joint density/mass function $f_{X,Z}$ (assuming that's what you meant by that expression) is obtained by differentiating this with respect to $x$:
$$ f_{X,Z}\left(x,z\right)=\begin{cases} 0 & \text{for } x<0,\\[4pt] x^{z-1}-x^{z} & \text{for } 0\le x<1,\\[4pt] 0 & \text{for } 1\le x. \end{cases} $$
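As a quick consistency check, integrating this function over $x$ recovers the marginal pmf of $Z$, and summing it over $z$ recovers the uniform density of $X$:
$$\int_{0}^{1}\left(x^{z-1}-x^{z}\right)dx=\frac{1}{z(z+1)}=p_Z(z), \qquad \sum_{z=1}^{\infty}\left(x^{z-1}-x^{z}\right)=(1-x)\sum_{z=1}^{\infty}x^{z-1}=1=f_X(x)\quad\text{for }0\le x<1.$$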
Calculating $E\left(Z\right)$ directly: $$\sum_{z=1}^{\infty} z\,p_Z\left(z\right)=\sum_{z=1}^{\infty} z\left(\frac{1}{z(z+1)}\right)=\sum_{z=1}^{\infty}\frac{1}{z+1}$$ confirms your conclusion that it diverges (to $+\infty$).
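If you want a numerical sanity check, here is a minimal Monte Carlo sketch (Python; the function name `play_game`, the trial count, and the seed are just illustrative choices) that plays the game repeatedly and compares the empirical distribution of $Z$ with $\frac{1}{z(z+1)}$:

```python
import random

def play_game(rng):
    """Play one round: draw X, then draw Y's until one exceeds X.

    Returns Z, the number of turns including the final one."""
    x = rng.random()          # X ~ Uniform[0, 1)
    z = 1
    while rng.random() <= x:  # the game continues while Y_i <= X
        z += 1
    return z

def estimate_pmf(trials=100_000, max_z=6, seed=0):
    rng = random.Random(seed)
    counts = {}
    for _ in range(trials):
        z = play_game(rng)
        counts[z] = counts.get(z, 0) + 1

    # Compare the empirical pmf with the exact 1 / (z (z + 1)).
    for z in range(1, max_z + 1):
        empirical = counts.get(z, 0) / trials
        exact = 1.0 / (z * (z + 1))
        print(f"z = {z}: empirical {empirical:.4f}, exact {exact:.4f}")

    # Reported only to illustrate the heavy tail; it does not converge,
    # because E(Z) is infinite.
    sample_mean = sum(z * c for z, c in counts.items()) / trials
    print(f"sample mean of Z: {sample_mean:.2f}")

if __name__ == "__main__":
    estimate_pmf()
```

The printed sample mean is not a check of anything: since $E(Z)=+\infty$, the running average keeps drifting upward as the number of trials grows; only the pmf comparison stabilizes.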