Today, I suddenly thought of a question:
Find the probability that an $n$-degree polynomial has $r$ real roots. ($0\le r\le n$)
So I defined a function $P\left(r;n\right)$, the probability that an $n$-degree polynomial has exactly $r$ real roots, and tried some small values.
For $n=1$, it is easy to see that $P\left(0;1\right)=0$ and $P\left(1;1\right)=1$.
For $n=2$, we need to consider the discriminant $\Delta=a^2-4b$ of $x^2+ax+b$. (Dividing $ax^2+bx+c$ by its leading coefficient does not change the real roots, so the monic case suffices.)
When $\Delta \le 0$, which means $b \ge \dfrac{a^2}{4}$, the probability of choosing $a,b$ satisfying this condition is $0$, so $P\left(0;2\right)=P\left(1;2\right)=0$, and hence $P\left(2;2\right)=1$.
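As a quick numerical sanity check (a sketch only; the sample points are arbitrary), one can compare the sign of $\Delta=a^2-4b$ with the number of real roots a root finder reports for $x^2+ax+b$:

```python
import numpy as np

# x^2 + a x + b has 2 real roots iff Delta = a^2 - 4b > 0,
# and 0 real roots iff Delta < 0.
results = []
for a, b in [(3.0, 2.0), (0.0, 1.0)]:
    delta = a**2 - 4 * b
    # Coefficients are listed lowest degree first: b + a x + x^2.
    roots = np.polynomial.polynomial.polyroots([b, a, 1.0])
    n_real = int(np.sum(np.abs(roots.imag) < 1e-9))
    results.append((delta > 0, n_real))

print(results)  # [(True, 2), (False, 0)]
```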
This leads me to a guess: $P\left(0;n\right)=P\left(1;n\right)=\cdots=P\left(n-1;n\right)=0$ and $P\left(n;n\right)=1$. However, the general case involves high-degree polynomials, so I can't prove it. Can anyone help me? Hints are welcome, thank you!
As pointed out in the comments, you need to specify a probability distribution for this question to be meaningful. For example, here is one way of generating a random polynomial of degree $n$: let $(Z_1,\ldots,Z_n)$ be any random vector of real numbers and consider the random polynomial $$ p(x)=(x-Z_1)\cdots (x-Z_n). $$ You can write out the coefficients in terms of $Z_1,\ldots,Z_n$ to see that they are truly random (except for the $x^n$ term, which has coefficient $1$), so this construction meets your criteria, and all the probabilities match your guess: probability $0$ of having $r<n$ real roots, probability $1$ of having $n$ real roots.
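As a quick numerical sketch of this construction (the degree $n=5$ and the normal distribution for the $Z_i$ are arbitrary choices), one can expand the product into its coefficients and feed them back to a root finder; all $n$ recovered roots are real:

```python
import numpy as np

rng = np.random.default_rng(0)
n = 5
Z = rng.normal(size=n)  # random roots Z_1, ..., Z_n

# Coefficients of p(x) = (x - Z_1) ... (x - Z_n), lowest degree first;
# the leading (x^n) coefficient is 1, all others depend on the Z_i.
coeffs = np.polynomial.polynomial.polyfromroots(Z)

roots = np.polynomial.polynomial.polyroots(coeffs)
n_real = int(np.sum(np.abs(roots.imag) < 1e-8))
print(n_real)  # equals n: every root is real
```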
On the other hand, we can also consider the random polynomial $$ q(x)=(x^2+Z_1^2+1)\cdots (x^2+Z_n^2+1) $$ which has no real roots, and again you can multiply out the coefficients and see that they are random.
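The same kind of sketch works here (again with arbitrary choices of $n$ and distribution): multiplying out the quadratic factors gives random coefficients, yet every root is purely imaginary, $\pm i\sqrt{Z_j^2+1}$, so none is real:

```python
import numpy as np

rng = np.random.default_rng(1)
n = 4
Z = rng.normal(size=n)

# Coefficients of q(x) = prod_j (x^2 + Z_j^2 + 1), lowest degree first.
coeffs = np.array([1.0])
for z in Z:
    # Each factor x^2 + (z^2 + 1) has coefficient list [z^2 + 1, 0, 1].
    coeffs = np.polynomial.polynomial.polymul(coeffs, [z**2 + 1.0, 0.0, 1.0])

roots = np.polynomial.polynomial.polyroots(coeffs)
n_real = int(np.sum(np.abs(roots.imag) < 1e-8))
print(n_real)  # 0: all roots are purely imaginary
```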
Now I suspect that neither of these two examples matches what you are actually interested in, which is the case where all the coefficients are drawn independently from the same distribution. That question is more interesting and has a lot of research literature, some easily accessible (for example, https://arxiv.org/abs/1409.4128) and some less accessible, going back over 50 years. While the papers may be difficult to read if you are not familiar with graduate-level probability theory, the results are easy to summarize: in most cases people have studied, the expected number of real roots is far smaller than $n$, roughly of size $\log n$, so the probability of having $n$ real roots is very low, especially as $n$ grows. The precise asymptotic is that the expected number of real roots grows as $$ \bigl(\tfrac{2}{\pi}+o(1)\bigr)\log n\text{ as }n\to\infty, $$ where the notation $o(1)$ denotes a quantity that converges to $0$ as $n\to\infty$.
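To get a feel for this, here is a rough Monte Carlo sketch (the standard normal coefficients, trial counts, and tolerance are arbitrary choices, and counting real roots via tiny imaginary parts is only approximate at large degree) comparing the empirical mean number of real roots with $\tfrac{2}{\pi}\log n$:

```python
import numpy as np

rng = np.random.default_rng(2)

def mean_real_roots(n, trials=200, tol=1e-6):
    """Monte Carlo estimate of the expected number of real roots of a
    degree-n polynomial with i.i.d. standard normal coefficients."""
    total = 0
    for _ in range(trials):
        c = rng.normal(size=n + 1)  # coefficients c_0, ..., c_n
        roots = np.polynomial.polynomial.polyroots(c)
        # Count a root as real if its imaginary part is negligible
        # relative to its magnitude.
        total += int(np.sum(np.abs(roots.imag) <= tol * (1.0 + np.abs(roots))))
    return total / trials

for n in (10, 50, 100):
    print(n, mean_real_roots(n), (2 / np.pi) * np.log(n))
```

The estimates stay in the single digits even as $n$ reaches $100$, consistent with the logarithmic growth (the asymptotic formula also has lower-order terms, so the two columns need not match exactly at moderate $n$).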