Inspired by this question, consider hints on a Sudoku board. A regular puzzle has a unique solution. It is clear that there are puzzles with 2 or 3 solutions, and presumably therefore puzzles with, say, 4 or 6 solutions.
Now, what is the smallest integer $k$ such that there is no set of Sudoku clues resulting in exactly $k$ solutions?
Consider the process of randomly generating single-solution puzzles and randomly dropping one or more clues, as shown in the pseudo-code below:
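The referenced pseudo-code isn't shown here, but a minimal Python sketch of the process might look like the following. As a stand-in for "generate a single-solution puzzle" it uses a complete grid built from the standard shift pattern (a full valid grid trivially has exactly one solution); a real experiment would generate varied puzzles instead.

```python
import random

def candidates(grid, r, c):
    """Values not already used in cell (r, c)'s row, column, or 3x3 box."""
    used = set(grid[r]) | {grid[i][c] for i in range(9)}
    br, bc = 3 * (r // 3), 3 * (c // 3)
    used |= {grid[i][j] for i in range(br, br + 3) for j in range(bc, bc + 3)}
    return [v for v in range(1, 10) if v not in used]

def count_solutions(grid):
    """Count all completions of grid (0 = blank) by backtracking."""
    for r in range(9):
        for c in range(9):
            if grid[r][c] == 0:
                total = 0
                for v in candidates(grid, r, c):
                    grid[r][c] = v
                    total += count_solutions(grid)
                grid[r][c] = 0  # restore the blank before returning
                return total
    return 1  # no blanks left: exactly one completed grid

def drop_clues(grid, k, rng=random):
    """Blank k randomly chosen filled cells, returning a new grid."""
    filled = [(r, c) for r in range(9) for c in range(9) if grid[r][c]]
    out = [row[:] for row in grid]
    for r, c in rng.sample(filled, k):
        out[r][c] = 0
    return out

# Stand-in "single-solution puzzle": a complete valid grid from the shift pattern.
full = [[(3 * (r % 3) + r // 3 + c) % 9 + 1 for c in range(9)] for r in range(9)]

puzzle = drop_clues(full, 4)
n = count_solutions(puzzle)  # tally n over many trials to build the histogram
```

Repeating the last two lines many times and histogramming `n` reproduces the kind of empirical distribution discussed below.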
Interestingly, though maybe not surprisingly, this process gives rise to an approximately lognormal distribution of solution counts. Here's a chart showing the empirical distribution (black) of solution counts from a sample of 15 million random puzzles with 1 dropped clue vs. the fitted lognormal model (green).
And here's the same chart for another 15 million random puzzles, this time dropping 4 clues:
This behavior suggests an absence of the kinds of patterns in solution-count probabilities that might conspire to produce a "low" $k$. Maybe it also hints at a way to at least guess at the order of magnitude of $k$: choose a useful way to partition the space of possible puzzles; estimate the number of distinct, non-equivalent puzzles in each partition; find a way to sample uniformly from those puzzles; fit a lognormal model for each partition; then, given the model and the size of the sample space, ask in what range it becomes likely that some count is zero.
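The last step can be made concrete. Assuming a lognormal model is a good description of solution counts within a partition, one could ask for which $k$ the expected number of puzzles with exactly $k$ solutions falls below 1. A rough Python sketch, where the sample, the fitted parameters, and the partition size `N` are all illustrative stand-ins rather than fitted Sudoku data:

```python
import math
import random

def fit_lognormal(counts):
    """Maximum-likelihood lognormal fit: mean and std of log(count)."""
    logs = [math.log(c) for c in counts]
    mu = sum(logs) / len(logs)
    sigma = math.sqrt(sum((x - mu) ** 2 for x in logs) / len(logs))
    return mu, sigma

def unit_bin_prob(k, mu, sigma):
    """Approximate P(count == k): lognormal density at k over a unit-width bin."""
    z = (math.log(k) - mu) / sigma
    return math.exp(-z * z / 2) / (k * sigma * math.sqrt(2 * math.pi))

# Synthetic stand-in for observed solution counts (made-up parameters).
rng = random.Random(1)
sample = [max(1, round(math.exp(rng.gauss(3.0, 1.2)))) for _ in range(10_000)]
mu, sigma = fit_lognormal(sample)

N = 10 ** 12  # hypothetical number of non-equivalent puzzles in a partition

def expected_with_count(k):
    """Expected number of puzzles in the partition with exactly k solutions."""
    return N * unit_bin_prob(k, mu, sigma)
```

Counts $k$ for which `expected_with_count(k)` is far below 1 are the plausible candidates for being unreachable within that partition, which gives at least an order-of-magnitude guess at where the smallest such $k$ might live.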