It is said that sieve methods suffer from a parity problem.
Terence Tao gave this "rough" statement of the problem:
"Parity problem. If A is a set whose elements are all products of an odd number of primes (or are all products of an even number of primes), then (without injecting additional ingredients), sieve theory is unable to provide non-trivial lower bounds on the size of A. Also, any upper bounds must be off from the truth by a factor of 2 or more."
Does this problem affect only the Selberg sieve, or does it also affect all other types of sieve methods?
Say "sieve theory" refers to theorem of following shape:
Given $a_1, a_2, \ldots \ge 0$ and some bounds on the size of $S_d = \sum_k a_{kd}$ for each $d > 0$, one obtains a bound for $\sum_{p \in A} a_p$ (where $A$ is some sieved set).
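For concreteness (a standard instance, supplied by me rather than taken from the question): take $a_n = 1$ for $n \le x$ and $a_n = 0$ otherwise. Then

$$S_d = \sum_k a_{kd} = \left\lfloor \frac{x}{d} \right\rfloor = \frac{x}{d} + O(1),$$

and the quantity to be bounded is $\sum_{p \le x} a_p = \pi(x)$: a theorem of the above shape converts the estimates for the $S_d$ into upper or lower bounds on $\pi(x)$.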
The Selberg sieve is one theorem of this sort. The parity problem applies not only to the Selberg sieve, but to any theorem of this kind.
The heuristic, roughly, is this: suppose the set $A$ contains only integers for which the parity of the number of prime factors is predictable (say, always odd). If the sequence $a_n$ is replaced by $a_n(1+\mu(n))$, then the sums $S_d = \sum_k a_{kd}$ stay roughly the same (by the pseudorandomness of the Möbius function), but $\sum_{p \in A} a_p$ becomes zero, since $\mu(p) = -1$ for every prime $p$. So the information about the sizes of the $S_d$ cannot imply that $\sum_{p \in A} a_p$ is non-zero, and no non-trivial lower bound is possible. Analogously, replacing $a_n$ by $a_n(1-\mu(n))$ doubles $a_p$ at every prime while leaving the $S_d$ roughly unchanged, so a theorem of this sort cannot give an upper bound for $\sum_{p \in A} a_p$ better than twice the truth.
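The heuristic above can be checked numerically. Here is a small sketch (my own illustration; the function names are mine) that takes $a_n = 1$ for $n \le N$, replaces it by $a_n(1+\mu(n))$, and compares the divisor sums $S_d$ and the prime sum for both weights:

```python
def mobius_sieve(N):
    """Compute mu(1..N) and the primes up to N with a linear sieve."""
    mu = [1] * (N + 1)
    is_comp = [False] * (N + 1)
    primes = []
    for i in range(2, N + 1):
        if not is_comp[i]:
            primes.append(i)
            mu[i] = -1
        for p in primes:
            if i * p > N:
                break
            is_comp[i * p] = True
            if i % p == 0:
                mu[i * p] = 0       # p^2 divides i*p, so mu vanishes
                break
            mu[i * p] = -mu[i]      # one extra distinct prime factor
    return mu, primes

N = 100_000
mu, primes = mobius_sieve(N)

# Weights: a_n = 1 for n <= N, and the twisted weight b_n = a_n * (1 + mu(n)).
# The divisor sums S_d barely move under the twist...
for d in (1, 2, 3, 5):
    S_d = N // d                                        # sum of a_n over multiples of d
    T_d = sum(1 + mu[n] for n in range(d, N + 1, d))    # same sum for b_n
    print(d, S_d, T_d, abs(T_d - S_d) / S_d)            # relative change is tiny

# ...but the sum over primes collapses: mu(p) = -1, so every term 1 + mu(p) is 0.
print(len(primes))                       # pi(N): the original prime sum
print(sum(1 + mu[p] for p in primes))    # twisted prime sum: exactly 0
```

So no amount of information about the $S_d$ alone can certify a non-zero prime sum, which is exactly the lower-bound half of the parity obstruction; replacing `1 + mu[n]` by `1 - mu[n]` doubles the prime sum instead, giving the factor-of-2 loss in upper bounds.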