I still need to be disabused of the belief that there is some simple connection between the finiteness of the radius of convergence and the asymptotic rate of growth.
1. Can we develop any specific criteria based on the asymptotic growth rate to determine when the radius of convergence is finite or infinite?
2. Does $f$ being analytic with $f=\mathscr{O}(1)$ imply that $f$ has infinite radius of convergence?
3. Does $f$ being analytic with $f=\omega(\ln(x)) \land f=\mathscr{O}(e^x)$ imply that $f$ has infinite radius of convergence?
4. (Converse of 2. and 3.) Do the logarithm and exponential represent, respectively, lower and upper bounds on the growth rates that an analytic function diverging at infinity may have while still having infinite radius of convergence?
And would it then follow that every other analytic function with infinite radius of convergence is $\mathscr{O}(1)$, i.e. bounded?
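For reference, the radius of convergence behind all of these questions is given by the Cauchy–Hadamard formula $\frac{1}{R} = \limsup_{n\to\infty} |a_n|^{1/n}$. Here is a minimal numeric sketch of that formula (the helper name and the finite cutoffs are my own, and evaluating at a single large $n$ is of course only a heuristic stand-in for the limsup):

```python
import math

def radius_estimate(coeff_abs, n):
    """Heuristic: R is approximately |a_n|^(-1/n) for large n, a finite-n
    stand-in for the Cauchy-Hadamard formula 1/R = limsup |a_n|^(1/n)."""
    return coeff_abs(n) ** (-1.0 / n)

# e^x has a_n = 1/n!: the estimate keeps growing with n (R is infinite).
for n in (25, 50, 100):
    print(n, radius_estimate(lambda n: 1 / math.factorial(n), n))

# ln(1+x) has |a_n| = 1/n: the estimate tends to 1 (R = 1).
for n in (25, 50, 100):
    print(n, radius_estimate(lambda n: 1 / n, n))
```

The exponential's estimates diverge while the logarithm's settle near $1$, which is the finite/infinite dichotomy the questions above are probing.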
I recognize that the answers to the second through fourth questions are almost certainly no, but to what extent, if any, are they almost true? It seems like some not-much-weaker version of them might hold. The motivation for these conjectures/questions is probably best illustrated through some examples:
1. The Gamma Function $\Gamma$ - A super-exponential function with finite radius of convergence (its Taylor series about any point is limited by the poles at the non-positive integers).
Taking logarithms: by Stirling's approximation, $\ln \Gamma(x) = \Theta(x\ln(x))$, so in particular $\ln \Gamma = \omega(x)$, whereas $\ln e^x = x$ exactly; this is the sense in which $\Gamma$ grows super-exponentially.
2. The Logarithm - A sub-polynomial function with finite radius of convergence.
Note that $\ln(x)=o(x^{\epsilon})$ for any $\epsilon > 0$. Moreover, $\int_1^{\infty} \frac{1}{t^{1+\epsilon}} \,\text{d}t <\infty$ for any $\epsilon >0$, whereas $\ln(x)=\int_1^x \frac{1}{t} \,\text{d}t$ diverges as $x \to \infty$. Hence the logarithm seems to be a fundamental "border" function for the polynomials, which all have infinite radii of convergence and are exactly the class of functions with terminating (finite) Taylor series.
3. Sine and Cosine - Two functions with infinite radii of convergence which are $\Theta(1)$.
4. Polynomials - Infinite radii of convergence, terminating Taylor series, and (when non-constant) $\omega(\ln(x))$ and $o(e^x)$.
5. Other Trigonometric Functions ($\tan,\cot,\sec,\csc$) - Finite radii of convergence; they blow up at finite points ($x \to \frac{\pi}{2}$ for $\tan,\sec$; $x \to \pi$ for $\cot,\csc$) and so are trivially $\omega(e^x)$ near those points.
6. Bessel Functions of the First Kind - these functions ($J_n$) are bounded, entire, and (coincidentally?) also have infinite radius of convergence. (The second-kind functions $Y_n$ are singular at $0$, so they don't fit this pattern.)
7. A Possible Counterexample?
As @Wojowu suggested, consider the function $\Xi(x):=\displaystyle \sum_{n=0}^{\infty} \frac{x^n}{(n!)^2}$
Does $\Xi(x)$ have finite or infinite radius of convergence? Does $\Xi(x)=o(e^x)$ or $\Xi(x)=\Theta(e^x)$? (I am assuming that $\Xi(x)=\mathscr{O}(e^x)$, but I could be wrong.)
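Here is a quick numeric sketch of example 7 (the truncation at 200 terms and the comparison functions are my own choices, and this is evidence rather than a proof): summing the series term by term, the partial sums stabilize at every tested $x$, the ratio $\Xi(x)/e^x$ shrinks rapidly, while $\Xi(x)$ stays within a modest factor of $e^{2\sqrt{x}}$.

```python
import math

def xi(x, terms=200):
    """Partial sum of Xi(x) = sum_{n>=0} x^n / (n!)^2, built by the
    recurrence term_n = term_{n-1} * x / n^2 to avoid huge factorials."""
    term, total = 1.0, 1.0
    for n in range(1, terms + 1):
        term *= x / (n * n)
        total += term
    return total

for x in (1.0, 10.0, 50.0, 100.0):
    print(x, xi(x), xi(x) / math.exp(x), xi(x) / math.exp(2 * math.sqrt(x)))
```

For instance at $x=100$ the ratio against $e^x$ is astronomically small while the ratio against $e^{2\sqrt{x}}$ is of order $10^{-1}$, consistent with $\Xi(x) = o(e^x)$ but not with $\Xi(x) = \Theta(e^x)$.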
Background:
This is a follow-up to a question I asked previously: Is Every (Real) Analytic Function (with Non-Degenerate MacLaurin Series) Asymptotically Greater Than any Polynomial?
The asymptotic rate of growth of an entire (everywhere analytic) function is determined both by which of its Taylor coefficients vanish and by their signs.
However, as a comparison of the examples $\Gamma, \sin,$ and $\ln$ shows, no simple criterion based solely on the number of vanishing terms, or on their signs, suffices to draw definitive conclusions about the asymptotic rate of growth (although there is almost certainly some complicated connection).
For an arbitrary sequence $(a_n)$, there is an entire function $f$ which interpolates the sequence: $f(n) = a_n$ for all $n \in \mathbb N$. This follows from the Mittag-Leffler theorem and the Weierstrass factorization theorem. See also this paper of I. M. Sheffer.
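The interpolation claim above concerns infinite sequences, but its finite analogue may clarify what is being asserted: any finite list of values can be matched at $0, 1, \dots, k$ by a polynomial, which is in particular entire; the Mittag-Leffler/Weierstrass argument is what extends this to infinitely many constraints. A minimal sketch of the finite case (the function name is mine):

```python
def lagrange_interpolant(values):
    """Return the Lagrange polynomial f of degree < len(values), an entire
    function with f(n) = values[n] for n = 0, 1, ..., len(values) - 1."""
    nodes = range(len(values))
    def f(x):
        total = 0.0
        for i, a in zip(nodes, values):
            # Basis polynomial: 1 at node i, 0 at every other node.
            basis = 1.0
            for j in nodes:
                if j != i:
                    basis *= (x - j) / (i - j)
            total += a * basis
        return total
    return f

f = lagrange_interpolant([3.0, 1.0, 4.0, 1.0, 5.0])
print([f(n) for n in range(5)])  # recovers 3, 1, 4, 1, 5 (up to rounding)
```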