This is my first post here. I am a musician, and not a mathematician, but I enjoy doing things to prime numbers and seeing what comes out.
I have defined a sequence whose value at each positive integer $n$ is:
- $-1$ if $n$ is prime (but not practical)
- $+1$ if $n$ is a practical number (but not prime)
- $0$ if $n$ is neither or both
I have then taken the sequence of its partial sums. The first 50 terms are 1, 1, 0, 1, 0, 1, 0, 1, 1, 1, 0, 1, 0, 0, 0, 1, 0, 1, 0, 1, 1, 1, 0, 1, 1, 1, 1, 2, 1, 2, 1, 2, 2, 2, 2, 3, 2, 2, 2, 3, 2, 3, 2, 2, 2, 2, 1, 2, 2, 2.
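For reference, here is a minimal Python sketch that reproduces these first terms straight from the definitions; it is a brute-force check (so only suitable for small $n$), and the helper names are just illustrative:

```python
def divisors(n):
    return [d for d in range(1, n + 1) if n % d == 0]

def is_practical(n):
    # n is practical if every integer 1 .. n-1 is a sum of distinct divisors of n
    if n % 2 and n > 1:
        return False  # an odd n > 1 can never represent 2
    reachable = {0}
    for d in divisors(n):
        reachable |= {r + d for r in reachable}
    return all(m in reachable for m in range(1, n))

def is_prime(n):
    return n > 1 and all(n % k for k in range(2, int(n ** 0.5) + 1))

def term(n):
    prime, practical = is_prime(n), is_practical(n)
    if practical and not prime:
        return 1
    if prime and not practical:
        return -1
    return 0  # neither or both

s, partial_sums = 0, []
for n in range(1, 51):
    s += term(n)
    partial_sums.append(s)
print(partial_sums)  # 1, 1, 0, 1, 0, 1, 0, 1, 1, 1, ...
```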
The plot for $n<100000$ looks quite linear:
To see just how linear it is, I divided each term of the sequence by $n$ and got this plot:
It seems to me like it wants to converge to some value. The arithmetic mean of the last 100 terms is 46.3225.
I vaguely understand that there are some analogies between practical numbers and primes. I am wondering how difficult it would be to establish whether the above sequence does in fact converge, and if so, to what value. I have tried the same thing with other prime-like sequences, such as ludic numbers and lucky numbers, but those didn't seem as neat...
Thanks!
The plot isn't actually linear, but it certainly looks that way. Here's why:
Let's introduce two functions, $\pi(x)$ and $p(x)$, that count, respectively, the primes and the practical numbers not exceeding $x$. A famous result in number theory, the prime number theorem, tells us that $$ \pi(x) \sim \frac{x}{\log(x)} \quad \text{for } x \to +\infty $$ which essentially means that for $x$ large enough $\pi(x)$ behaves almost exactly like $x/\log(x)$.$^1$ Remarkably, only in 2015 did Weingartner show that $$ p(x) \sim \frac{cx}{\log(x)} \quad \text{for } x \to +\infty $$ for some constant $c > 0$.
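If you want to see the prime number theorem numerically, here is a tiny Python sketch (it relies on SymPy's `primepi`; any prime-counting routine would do) showing how slowly the ratio $\pi(x)/(x/\log x)$ approaches $1$:

```python
import math
from sympy import primepi  # prime-counting function pi(x)

for k in range(3, 8):
    x = 10 ** k
    ratio = primepi(x) / (x / math.log(x))
    print(f"x = 10^{k}:  pi(x) / (x/log x) = {float(ratio):.4f}")
```

The ratio improves only logarithmically, and the same slowness will show up for $s(n)/n$ below.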
The sequence of partial sums you defined is then simply the sequence $\{s(n)\}_{n=1}^\infty$ where $s(x) := p(x) - \pi(x)$ (an $n$ that is both prime and practical adds $1$ to each count, so it contributes $0$ to this difference, exactly as it does to your sum). Since $c \neq 1$ (numerically, Weingartner's constant is $c \approx 1.336$), it follows that $$ s(x) \sim (c-1) \frac{x}{\log(x)} \quad \text{for } x \to +\infty. $$ To conclude, here's what $\frac{x}{\log(x)}$ looks like on the interval $[0,10^4]$:
1. The almost is important: $\pi(x)$ may never be exactly equal to $x/\log(x)$, but eventually the error grows much more slowly than $x/\log(x)$.
Update: I should probably explicitly say something about your second graph, too, but first we need to give a precise meaning to "$\sim$". Given two functions $f,g \colon D \subseteq \Bbb{R} \to \Bbb{R}$ and $\rho \in \Bbb{R} \cup \{\pm\infty\}$, we say that $f \sim g$ for $x \to \rho$ if $$ \lim_{x \to \rho} \frac{f(x)}{g(x)} = 1. $$

Now, what can we say about the limit of the sequence depicted in your second graph? From the previous discussion we have $$ \begin{align} \lim_{n \to \infty} \frac{s(n)}{n} &= \lim_{n \to \infty} \frac{p(n)-\pi(n)}{n} \\ &= \lim_{n \to \infty} \frac{p(n)-\pi(n)}{n}\; \frac{\log(n)}{\log(n)} \\ &= \lim_{n \to \infty} \frac{p(n)-\pi(n)}{n/\log(n)}\; \frac{1}{\log(n)} \\ &= \left(\lim_{n \to \infty} \frac{p(n)-\pi(n)}{n/\log(n)}\right) \left(\lim_{n \to \infty}\frac{1}{\log(n)}\right) \\ &= (c-1) \lim_{n \to \infty}\frac{1}{\log(n)} \\ &= 0 \end{align} $$ (splitting the limit into a product is legitimate here because both factors converge).

So $s(n)/n$ does indeed converge to $0$. In fact the computation says a bit more: $s(n)/n \sim (c-1)/\log(n)$, so the convergence is only logarithmically fast, i.e. very slow.
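If you want to check the asymptotics numerically, here is a Python sketch (not part of the discussion above, and the helper names are just placeholders) that recomputes your partial sums $s(n)$ up to $10^5$ and prints both $s(n)/n$ and $s(n)\log(n)/n$. To keep the practicality test fast it uses the classical Stewart-Sierpinski characterization: an even $n = q_1^{a_1} \cdots q_k^{a_k}$ with primes $q_1 < \dots < q_k$ is practical exactly when $q_{i+1} \le \sigma(q_1^{a_1} \cdots q_i^{a_i}) + 1$ for every $i$.

```python
import math

def is_practical(n):
    # Stewart-Sierpinski test: factor n in increasing prime order and check
    # that each new prime p satisfies p <= sigma(part factored so far) + 1.
    if n == 1:
        return True
    if n % 2:
        return False              # practical numbers > 1 are even
    sigma, m, p = 1, n, 2
    while m > 1:
        if p * p > m:             # remaining cofactor is prime
            p = m
        if m % p:
            p += 1
            continue
        if p > sigma + 1:         # the characterization fails at this prime
            return False
        a = 0
        while m % p == 0:
            m //= p
            a += 1
        sigma *= (p ** (a + 1) - 1) // (p - 1)   # multiply in sigma(p^a)
    return True

def prime_flags(limit):
    # simple sieve of Eratosthenes: flags[n] == 1 iff n is prime
    flags = bytearray([1]) * (limit + 1)
    flags[0:2] = b"\x00\x00"
    for q in range(2, int(limit ** 0.5) + 1):
        if flags[q]:
            flags[q * q :: q] = bytearray(len(flags[q * q :: q]))
    return flags

N = 10 ** 5
prime = prime_flags(N)
s = 0
for n in range(1, N + 1):
    # +1 for practical, -1 for prime: "both" and "neither" contribute 0,
    # exactly as in the original definition of the sequence.
    s += (1 if is_practical(n) else 0) - (1 if prime[n] else 0)
    if n in (10 ** 3, 10 ** 4, 10 ** 5):
        print(f"n = {n}:  s(n)/n = {s / n:.5f}   s(n)*log(n)/n = {s * math.log(n) / n:.5f}")
```

The first column should keep shrinking toward $0$, while the second should drift, very slowly, toward the constant $c - 1$.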