I have seen the following variant of big-O notation written in a textbook (page 1 of Barbour, Holst, and Janson): $f(m) = \mathcal{O}(g(m), h(m))$ as $m \to \infty.$ Is this notation standard? What does it mean?
A discussion with my classmates has led to two guesses: it might mean $f(m) = \mathcal{O}(\max \{g(m), h(m) \})$, or else $f(m) = \mathcal{O}(\min \{g(m), h(m) \})$, as $m \to \infty.$