Probability theory problem, inequality


Let $n$ be a positive integer and let $a, b$ be two positive integers chosen independently and uniformly at random from $\{1, \dots, n\}$. Let $P(n)$ denote the probability that $a^2 + b^2 \leq n^2$. How can I find the limit of $P(n)$ as $n$ tends to infinity? I know that finding the limit is easier than finding the exact probability.


By virtue of the Gauss circle problem, the limiting probability is $\frac{\pi}{4}$: the number of pairs $(a, b)$ with $1 \leq a, b \leq n$ and $a^2 + b^2 \leq n^2$ is the number of lattice points in a quarter disk of radius $n$ (excluding the axes), which is $\frac{\pi}{4}n^2 + O(n)$. Dividing by the $n^2$ equally likely pairs gives $P(n) = \frac{\pi}{4} + O\!\left(\frac{1}{n}\right) \to \frac{\pi}{4}$.
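
As a quick numerical sanity check (not part of the original answer), here is a short Python sketch: for each $a$, the admissible $b$ are exactly $1, \dots, \lfloor\sqrt{n^2 - a^2}\rfloor$, so summing that floor over $a$ counts the favorable pairs exactly. The function name `P` is just for illustration.

```python
from math import isqrt, pi

def P(n):
    # For each a in 1..n, the valid b satisfy b <= sqrt(n^2 - a^2),
    # so the exact number of pairs with a^2 + b^2 <= n^2 is this sum.
    count = sum(isqrt(n * n - a * a) for a in range(1, n + 1))
    return count / (n * n)  # n^2 equally likely pairs (a, b)

for n in (10, 100, 1000, 10_000):
    print(f"n={n:>6}  P(n)={P(n):.6f}  pi/4={pi / 4:.6f}")
```

The printed values converge toward $\frac{\pi}{4} \approx 0.785398$ at roughly the $O(1/n)$ rate expected from the boundary term.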