I have been working through Bishop's book, Pattern Recognition and Machine Learning. Page 73 says that in the limit of an infinitely large data set, $m, l \rightarrow \infty$, the result $\frac{m+a}{m+a+l+b}$ reduces to $\frac{m}{N}$, where $m$ is the number of $x = 1$ observations, $l$ is the number of $x = 0$ observations, and $N = m + l$ is the size of the data set ($a$ and $b$ are the hyperparameters of the Beta prior).
But I did not get the limit part: how does this limit become $\frac{m}{N}$?
Thanks for any answers.
Dividing the numerator and denominator by $N = m + l$, we get $$\frac{m+a}{m+a+l+b}=\frac{m/N+a/N}{1+(a+b)/N}.$$ Since $a$ and $b$ are fixed while $N \to \infty$, the terms $a/N$ and $(a+b)/N$ vanish, so the difference between this expression and $m/N$ tends to zero. Strictly speaking, though, the result is not a limit *equal to* $m/N$, since $m/N$ itself varies with the data; Bishop's statement means that the posterior mean converges to the maximum-likelihood estimate $m/N$. So I don't think the result is just $m/N$ in the literal sense you mention.
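To see this convergence numerically, here is a minimal Python sketch (not from the original post): it simulates Bernoulli observations and compares the posterior mean $(m+a)/(m+a+l+b)$ against the maximum-likelihood estimate $m/N$ as $N$ grows. The hyperparameter values $a = b = 2$ and the true probability $0.3$ are arbitrary illustrative choices.

```python
import random

random.seed(0)

a, b = 2.0, 2.0   # Beta prior hyperparameters (arbitrary illustrative values)
mu_true = 0.3     # true probability of x = 1 (arbitrary)

m = l = 0         # counts of x = 1 and x = 0 observations
for N in [10, 100, 1000, 10000, 100000]:
    # draw additional observations until the data set has size N
    while m + l < N:
        if random.random() < mu_true:
            m += 1
        else:
            l += 1
    posterior_mean = (m + a) / (m + a + l + b)  # posterior mean from Bishop
    ml_estimate = m / N                         # maximum-likelihood estimate
    print(f"N={N:6d}  posterior mean={posterior_mean:.5f}  "
          f"m/N={ml_estimate:.5f}  diff={posterior_mean - ml_estimate:+.2e}")
```

On a typical run the difference shrinks roughly like $1/N$, matching the algebra above: the fixed $a$ and $b$ become negligible relative to the growing counts.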