Suppose I have an SDE of the form:
$$dx_i = x_i\left(b_i-\sum_{j=1}^n a_{ij}x_j\right) \,dt + \sigma_i x_i \, d\eta(t)$$ where $\eta$ is an Ornstein-Uhlenbeck process satisfying:
$$d\eta(t) = -\lambda \eta(t)\, dt + \sigma \, dW(t)$$
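For intuition (not as part of the proof), the coupled system can be simulated with a crude Euler-Maruyama scheme. All of the concrete values below ($n$, $b$, $A$, the $\sigma_i$, $\lambda$, $\sigma$, the initial data) are illustrative assumptions, and the OU drift is taken with the mean-reverting sign convention $d\eta = -\lambda\eta\,dt + \sigma\,dW$:

```python
import numpy as np

# Euler-Maruyama sketch of the coupled system; all parameter values are
# illustrative assumptions, not taken from the question.
rng = np.random.default_rng(0)

n = 2
b = np.array([1.0, 0.5])             # growth rates b_i
A = np.array([[1.0, 0.2],            # interaction matrix a_ij
              [0.3, 1.0]])
sig_i = np.array([0.1, 0.2])         # per-coordinate noise intensities sigma_i
lam, sig = 1.0, 0.5                  # OU parameters lambda, sigma
dt, T = 1e-3, 10.0
steps = int(T / dt)

x = np.full(n, 0.5)                  # x_i(0)
eta = 0.0                            # eta(0)

for _ in range(steps):
    dW = rng.normal(0.0, np.sqrt(dt))
    # mean-reverting OU increment; this same increment drives x through
    # the sigma_i * x_i * d(eta) term
    deta = -lam * eta * dt + sig * dW
    x = x + x * (b - A @ x) * dt + sig_i * x * deta
    eta = eta + deta

print(x, eta)
```

With these (assumed) parameters the trajectory stays positive and settles near the deterministic Lotka-Volterra equilibrium, which is the qualitative behavior the moment bound is meant to capture.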
I would like to show the system has bounded moments, so the two quantities I'm interested in are $E\left[x_i(t)^p\right]$ and $E\left[\sum_{i=1}^n x_i(t)^p\right]$ for $p \geq 1$.
I have managed thus far to get, for each one:
$$E\left[x_1(t)^p\right] = x_1(0)^p + E\left[ \int_{0}^{t} px_1(s)^p\left(b_1-\sum_{j=1}^{n}a_{1j}x_j(s) +\frac{1}{2}(p-1)\sigma_1^2\right) ds \right] - E\left[\int_0^t \lambda px_1(s)^p\eta(s)\, ds \right] $$
$$E\left[e^t \sum_{i=1}^n x_i(t)^p \right] \leq \sum_{i=1}^n x_i(0)^p + E\left[\int_0^t Ke^s \, ds \right] - E\left[\int_0^t \lambda pe^s \eta(s) \sum_{i=1}^n x_i(s)^p \, ds \right]$$
I could easily finish this problem if I could show either that $E\left[\int_0^t \lambda p\eta(s) x_1(s)^p \, ds\right] > 0$ or that $E\left[\int_0^t \lambda pe^s \eta(s) \sum_{i=1}^n x_i(s)^p \, ds\right] > 0$, since I could then simply drop that remaining term and obtain the bound. However, I don't see a way to do this. Ideally I would use Fubini's theorem to move the expectation inside the integral, but $\eta$ and $x_i$ are not independent, so I have no way of handling the expectation of a product of dependent random variables.
Since $E[\eta(s)] = 0$, we have $E[\eta(s)X] = \operatorname{Cov}(\eta(s), X)$ for any integrable random variable $X$, so it should be possible to get something like:
$$E\left[\int_0^t \lambda pe^s \eta(s) \sum_{i=1}^n x_i(s)^p \, ds\right] = \int_0^t \lambda pe^s \operatorname{Cov}\left(\eta(s), \sum_{i=1}^n x_i(s)^p\right) ds,$$
and now the sign is determined by the sign of the covariance. Thus, if there were a way to show that the covariance is positive, I'd be done.
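As a sanity check on that sign, one can estimate $\operatorname{Cov}\left(\eta(s), \sum_i x_i(s)^p\right)$ at a fixed time $s$ by Monte Carlo over many independent paths. This is only suggestive, not a proof; all parameter values are illustrative assumptions, again with the mean-reverting convention $d\eta = -\lambda\eta\,dt + \sigma\,dW$:

```python
import numpy as np

# Monte Carlo estimate of Cov(eta(s), sum_i x_i(s)^p) at time s.
# All parameter values below are illustrative assumptions.
rng = np.random.default_rng(1)

n, p = 2, 2
b = np.array([1.0, 0.5])
A = np.array([[1.0, 0.2], [0.3, 1.0]])
sig_i = np.array([0.1, 0.2])
lam, sig = 1.0, 0.5
dt, s_time = 1e-3, 5.0
steps = int(s_time / dt)
paths = 2000                          # independent sample paths

x = np.full((paths, n), 0.5)          # x_i(0) for every path
eta = np.zeros(paths)                 # eta(0) = 0, so E[eta(s)] = 0

for _ in range(steps):
    dW = rng.normal(0.0, np.sqrt(dt), size=paths)
    deta = -lam * eta * dt + sig * dW
    # vectorized Euler-Maruyama step across all paths at once
    x = x + x * (b - x @ A.T) * dt + sig_i * x * deta[:, None]
    eta = eta + deta

S = (x ** p).sum(axis=1)              # sum_i x_i(s)^p per path
cov = np.mean(eta * S) - eta.mean() * S.mean()
print("estimated covariance:", cov)
```

A consistently positive estimate across parameter choices would support dropping the term in the bound, though of course a numerical experiment proves nothing in general.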
I've also tried the substitution $y_i = \log x_i$ to convert the problem to an additive-noise case, which gives
$$dy_i = \left(b_i - \sum_{j=1}^n a_{ij}e^{y_j} - \frac{1}{2}\sigma_i^2\sigma^2\right) dt + \sigma_i \, d\eta(t),$$
but I'm not sure that makes the problem any easier.