I came across these two introductory examples on the topic of computable functions. $$f_1(n) = \begin{cases} 1 & \text{if } n \text{ appears in the decimal representation of } \pi \\ 0 & \text{otherwise} \end{cases}$$
$$f_2(n) = \begin{cases} 1 & \text{if there are } n\text{-many } 1\text{'s that appear in the decimal representation of } \pi \\ 0 & \text{otherwise} \end{cases}$$
According to my lecture, $f_1$ is not computable but $f_2$ is computable. $f_1$ is not computable because no matter how many decimal places of $\pi$ we compute, we can never be sure whether $n$ appears somewhere in the expansion. Since $\pi$ is irrational, its decimal expansion is infinite and non-repeating, so even if $n$ does not appear in the first $10^{15}$ decimal places, there is no guarantee that it won't appear later; hence a search can confirm the answer $1$ but can never justify outputting $0$.
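To make the worry concrete, here is a sketch (my own illustration, not from the lecture) of the naive procedure one would try for $f_1$: stream the digits of $\pi$ and halt with $1$ once the decimal string of $n$ shows up. This only semi-decides the "$1$" case; when the answer should be $0$, the search runs forever, which is exactly why it fails to compute $f_1$.

```python
def pi_digits():
    """Stream the decimal digits of pi one at a time
    (a streaming spigot algorithm, after Gibbons 2006)."""
    q, r, t, j = 1, 180, 60, 2
    while True:
        u, y = 3 * (3 * j + 1) * (3 * j + 2), (q * (27 * j - 12) + 5 * r) // (5 * t)
        yield y  # next digit: 3, 1, 4, 1, 5, ...
        q, r, t, j = (10 * q * j * (2 * j - 1),
                      10 * u * (q * (5 * j - 2) + r - y * t),
                      t * u, j + 1)

def f1_search(n, give_up_after=None):
    """Halt with 1 as soon as the digit string of n appears in pi.
    If it never appears, this loops forever -- it can never answer 0.
    give_up_after only aborts the demo; aborting is NOT answering 0."""
    target = str(n)
    window = ""
    for count, d in enumerate(pi_digits(), start=1):
        window = (window + str(d))[-len(target):]  # sliding window over the digit stream
        if window == target:
            return 1
        if give_up_after is not None and count >= give_up_after:
            return None  # inconclusive, not an answer
```

For example, `f1_search(592)` halts almost immediately, since those digits occur early in $3.14159\,2\ldots$; but for an $n$ that never occurs in the expansion, the loop would run forever.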
However, by the same argument, shouldn't $f_2$ also be non-computable? If we can't find $n$-many $1$'s in the first $10^{15}$ decimal places, there is no guarantee that we won't find them later. But apparently this reasoning is wrong: $f_2$ is indeed computable.
How can that be?