I have an implementation of the Sieve of Sundaram in C# that returns all odd primes, as follows:
public static IEnumerable<BigInteger> SieveOfSundaram(BigInteger n)
{
    IEnumerable<BigInteger> Candidates()
    {
        for (BigInteger i = 1; i <= n; i++)
        {
            for (BigInteger j = i; i + j + 2 * i * j <= n; j++)
            {
                var x = i + j + 2 * i * j;
                yield return x;
            }
        }
    }

    IEnumerable<BigInteger> GenerateList()
    {
        for (BigInteger i = 1; i <= n; i++)
        {
            yield return i;
        }
    }

    IEnumerable<BigInteger> Sieve()
    {
        return GenerateList().Except(Candidates()).Select(x => x + x + 1);
    }

    return Sieve();
}
The asymptotic complexity of the Sieve of Sundaram is supposed to be $O(n \log n)$. However, I see that the Candidates() routine runs in $O(n \log n)$ and GenerateList() runs in $O(n)$, and in my Sieve implementation the set exclusion is done by GenerateList().Except(Candidates()), which would seem to make the whole thing $O(n^2 \log n)$.
What am I doing wrong?
I figured it out. The standard implementation of the algorithm uses $O(n)$ space to perform the set exclusion, which is what keeps the time complexity at $O(n \log n)$: it does an $O(n \log n)$ pass to mark the entries to skip, and then a separate $O(n)$ pass to return the odd primes.
Asymptotically, $O(n + n \log n) = O(n \log n)$.
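For illustration, here is a minimal sketch of that array-based approach (the name SieveOfSundaramArray is mine, and I use long rather than BigInteger for simplicity): a bool[] of size $n + 1$ replaces the Except call, so each candidate is marked in $O(1)$, and the final pass yields $2k + 1$ for every unmarked $k$.

```csharp
using System;
using System.Collections.Generic;

public static class SundaramSketch
{
    // Sketch only: marks composite indices in a bool[] (O(n) space),
    // then yields the odd primes in a single O(n) pass.
    public static IEnumerable<long> SieveOfSundaramArray(long n)
    {
        var marked = new bool[n + 1];

        // Marking pass: same candidates i + j + 2ij as in the question,
        // but written into the array instead of materialized for Except().
        // The outer bound i + i + 2*i*i <= n stops as soon as even the
        // smallest candidate (j = i) would exceed n.
        for (long i = 1; i + i + 2 * i * i <= n; i++)
        {
            for (long j = i; i + j + 2 * i * j <= n; j++)
            {
                marked[i + j + 2 * i * j] = true; // O(1) per candidate
            }
        }

        // Output pass: every unmarked k corresponds to the odd prime 2k + 1.
        for (long k = 1; k <= n; k++)
        {
            if (!marked[k])
                yield return 2 * k + 1;
        }
    }
}
```

With n = 10 this yields 3, 5, 7, 11, 13, 17, 19 — the odd primes up to $2 \cdot 10 + 1$ — matching the original implementation's output while avoiding the set exclusion entirely.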