How to infer the smallest possible sample size from the percentages in a chart?


I'm not a mathematician, so sorry for the confusing title. Let me explain.

For example, suppose I'm shown a pie chart that is split into 50% and 50%.

Logically, the smallest possible sample that could produce this is 2: counts of 1 and 1 give 50% and 50%.

But what if I'm given a pie chart showing 70%, 15.87% and 14.13%? How would I find the smallest possible sample size that could produce exactly these (rounded) numbers?

Then, if someone shows me such a pie chart without telling me the sample size, I could check whether their sample was large enough (say, if such "precision" is only achievable by dividing numbers above 500).

Intuitively, the sample behind the numbers above cannot be small: you wouldn't get such a split from, say, a sample of 5. The percentages are too "weird" for that (again, sorry for the unscientific description).
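To make the question concrete, here is a brute-force sketch of what I mean. The function name `min_sample_size` is mine, and I'm assuming every displayed percentage was rounded to (at most) two decimal places:

```python
def min_sample_size(percentages, decimals=2, max_n=100_000):
    """Return the smallest n (with its counts) for which integer counts
    summing to n reproduce the given percentages after rounding to
    `decimals` decimal places, or None if no n <= max_n works."""
    for n in range(1, max_n + 1):
        # The only candidate count for each slice is the nearest integer
        # to p% of n; any other count is further off and cannot round
        # back to p.
        counts = [round(p * n / 100) for p in percentages]
        if sum(counts) == n and all(
            round(100 * k / n, decimals) == round(p, decimals)
            for k, p in zip(counts, percentages)
        ):
            return n, counts
    return None

print(min_sample_size([50, 50]))            # -> (2, [1, 1])
print(min_sample_size([70, 15.87, 14.13]))  # -> (460, [322, 73, 65])
```

For the 50/50 chart this confirms the minimum of 2, and for the 70 / 15.87 / 14.13 chart it reports n = 460 (counts 322, 73 and 65), so under my rounding assumption such a chart really can't come from a tiny sample. Is there a smarter way than brute force?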