I have some number of items, each with its own appearance ratio, and I'm trying to determine the smallest possible repeating pattern built from them. The appearance ratios of all items sum to 1.
This is a bit of a complicated question to get my head around so I'll try and explain with examples…
For example, there may be four letters (a, b, c, d) with respective ratios of 0.4, 0.3, 0.2, 0.1, which would require a minimum pattern size of 10 (as 0.1 is the smallest ratio, and 1 / 0.1 = 10). The resulting pattern would be aaaabbbccd.
In another example, of 0.4, 0.4, 0.2 we would get a minimum pattern size of 5 (aabbc).
However, an example of 0.3, 0.3, 0.4 would require a pattern size of 10 (aaabbbcccc), because 0.4 cannot be divided by 0.3 without leaving a non-integer result.
What algorithm could calculate the pattern size based on the input ratios?
As Qiaochu Yuan posted, what you're looking for is the least common denominator of the ratios involved. Many thanks to him - the post I previously had here did not deal with ratios whose decimal expansions are infinite. I have corrected this post accordingly. We can find the LCD of a set of ratios by writing the ratios as fractions with integer numerators over a common denominator, and then dividing that denominator by the gcd of these numerators.
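In case it helps, here's a minimal Python sketch of that procedure (the function name `pattern_size` is my own):

```python
from fractions import Fraction
from functools import reduce
from math import gcd

def pattern_size(ratios):
    """Smallest pattern size for a list of ratios summing to 1.

    Write each ratio as an exact fraction over a common denominator,
    then divide that denominator by the gcd of the numerators.
    """
    # Fraction(str(r)) parses the decimal exactly, e.g. 0.3 -> 3/10.
    fracs = [Fraction(str(r)) for r in ratios]
    # Common denominator: the lcm of the individual denominators.
    common = reduce(lambda a, b: a * b // gcd(a, b),
                    (f.denominator for f in fracs))
    # Rewrite every numerator over that common denominator.
    numerators = [f.numerator * (common // f.denominator) for f in fracs]
    return common // reduce(gcd, numerators)
```

For instance, `pattern_size([0.3, 0.3, 0.4])` gives 10 and `pattern_size([0.2, 0.4, 0.4])` gives 5, matching the examples below.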
For example, with $.3, .3, .4 = \frac{3}{10}, \frac{3}{10}, \frac{4}{10}$, $\gcd(3, 3, 4) = 1$, so the pattern size is calculated to be $\frac{10}{1} = 10$.
By contrast, for the example $.2, .4, .4 = \frac{2}{10}, \frac{4}{10}, \frac{4}{10}$, $\gcd(2, 4, 4) = 2$, so the pattern size is calculated to be $\frac{10}{2} = 5$.
For an additional example, $.35, .35, .3 = \frac{35}{100}, \frac{35}{100}, \frac{30}{100}$, $\gcd(35, 35, 30) = 5$, so the pattern size is calculated to be $\frac{100}{5} = 20$. The corresponding pattern is ${\bf aaaaaaabbbbbbbcccccc}$.
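Since the reduced numerators are exactly the per-letter counts in one period, the same arithmetic also yields the pattern string itself. A Python sketch (the helper name `make_pattern` is my own):

```python
from fractions import Fraction
from functools import reduce
from math import gcd
from string import ascii_lowercase

def make_pattern(ratios):
    """Build the minimal repeating pattern for ratios summing to 1,
    assigning letters a, b, c, ... to the ratios in order."""
    fracs = [Fraction(str(r)) for r in ratios]
    # lcm of the denominators gives a common denominator.
    common = reduce(lambda a, b: a * b // gcd(a, b),
                    (f.denominator for f in fracs))
    nums = [f.numerator * (common // f.denominator) for f in fracs]
    g = reduce(gcd, nums)
    # Each letter appears (numerator / gcd) times per period.
    return "".join(letter * (n // g)
                   for letter, n in zip(ascii_lowercase, nums))
```

For the last example, `make_pattern([0.35, 0.35, 0.3])` returns `"aaaaaaabbbbbbbcccccc"`, a pattern of size 20 as computed above.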