Let $V$ be a finite-dimensional vector space with inner product $(\cdot, \cdot)$ and basis $\{b_1, \dots, b_n\}$, and let $X$ be a random variable in $V$. Claim: if $E[(b_i, X)^2] \lt \infty$ for all $i = 1, \dots, n$, then $\text{Cov}(X)$ exists.
The definition of $\text{Cov}(X)$ that I'm using is that if there exists a non-negative definite linear transformation $\Sigma : V \to V$ such that
$$ \text{cov}\{(x, X), (y,X)\} = (x, \Sigma y) \quad \forall x,y \in V, $$ then $\Sigma \equiv \text{Cov}(X)$.
Each $x$ (similarly $y$) may be decomposed as $x = \sum_i x_i b_i$, so that $(x,X) = \sum_i x_i (b_i, X)$ and $$ \text{cov}\{(x,X), (y,X)\} = \\ E[(x,X)(y,X)] - E[(x,X)]E[(y,X)] = \\ E\Big[\sum_{i,j} x_i y_j (b_i, X)(b_j, X)\Big] - (x, E[X])(y, E[X]). $$
Here the last term uses the fact that $(x, X)$ has finite expectation whenever each $(b_i, X)$ does.
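Spelled out, the identification $E[(x,X)] = (x, E[X])$ is just linearity of the inner product and of expectation, with $E[X]$ the vector whose coordinates in the basis $\{b_i\}$ are the means of the coordinates of $X$:

```latex
E[(x, X)] \;=\; E\Big[\sum_i x_i (b_i, X)\Big] \;=\; \sum_i x_i \, E[(b_i, X)] \;=\; (x, E[X]),
```

each term being finite because $E[(b_i, X)^2] < \infty$ implies $E|(b_i, X)| < \infty$.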
I'm not sure how to proceed from here, or how to make further use of the fact that $E[(b_i, X)^2] \lt \infty$.
You have already noted that $|E[(b_i, X)]| < \infty$ due to $E[(b_i, X)^2] < \infty$ for each $i$.
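For completeness, the finiteness of the first moments follows from Jensen's inequality (equivalently, Cauchy–Schwarz against the constant $1$):

```latex
\big| E[(b_i, X)] \big| \;\le\; E\big[ |(b_i, X)| \big] \;\le\; \sqrt{E[(b_i, X)^2]} \;<\; \infty .
```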
With $x=\sum_i x_i b_i$ and $y = \sum_i y_i b_i$, you can check that $$\text{cov}\{(x,X), (y, X)\} = \text{cov}\left\{\sum_i x_i (b_i, X), \sum_j y_j (b_j, X)\right\} = \sum_i \sum_j x_i y_j \text{cov}\{(b_i, X), (b_j, X)\}$$ holds.
Then it remains to check that each covariance term is finite by using the moment conditions $E[(b_i, X)^2] < \infty$ and $|E[(b_i, X)]|<\infty$. The Cauchy–Schwarz inequality $|E[(b_i, X)(b_j,X)]| \le \sqrt{E[(b_i,X)^2] E[(b_j, X)^2]}$ may be helpful. Then the desired transformation $\Sigma$ has matrix entries, in the basis $\{b_i\}$, given by $\Sigma_{ij} := \text{cov}\{(b_i, X), (b_j, X)\}$.
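As a sanity check (not part of the proof), here is a minimal numerical sketch of this construction in the concrete case $V = \mathbb{R}^n$ with the standard inner product and standard basis, where $(b_i, X)$ is simply the $i$-th coordinate of $X$; all variable names are illustrative:

```python
import numpy as np

rng = np.random.default_rng(0)
n, m = 3, 200_000

# Draws of X: an affine image of standard normals, so X has a nontrivial
# covariance structure. Each row of `samples` is one draw of X in R^n.
A = rng.standard_normal((n, n))
samples = rng.standard_normal((m, n)) @ A.T

# Sigma_ij := cov{(b_i, X), (b_j, X)} -- the sample covariance matrix.
Sigma = np.cov(samples, rowvar=False)

# Check the defining identity cov{(x, X), (y, X)} = (x, Sigma y)
# on randomly chosen x and y (exact for sample covariances, by bilinearity).
x, y = rng.standard_normal(n), rng.standard_normal(n)
lhs = np.cov(samples @ x, samples @ y)[0, 1]
rhs = x @ Sigma @ y
assert np.isclose(lhs, rhs)
```

The check passes exactly (up to floating point) rather than only approximately, because the sample covariance is itself a bilinear form in $x$ and $y$, mirroring the algebra in the answer above.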
Regarding non-negative definiteness: you want to check that $(x,\Sigma x) \ge 0$ for every $x$. But by your computation this is simply $\text{var}((x,X))$, which is non-negative.
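The same concrete setting illustrates this last point: the quadratic form $(x, \Sigma x)$ coincides with the (sample) variance of the scalar $(x, X)$, hence is non-negative. Again a sketch with illustrative names, assuming $V = \mathbb{R}^n$ with the standard basis:

```python
import numpy as np

rng = np.random.default_rng(1)
n, m = 4, 100_000

# Draws of X in R^n (rows), with an arbitrary correlation structure.
samples = rng.standard_normal((m, n)) @ rng.standard_normal((n, n))
Sigma = np.cov(samples, rowvar=False)  # Sigma_ij = cov{(b_i,X),(b_j,X)}

x = rng.standard_normal(n)
quad_form = x @ Sigma @ x               # (x, Sigma x)
variance = np.var(samples @ x, ddof=1)  # var((x, X)); ddof=1 matches np.cov

assert np.isclose(quad_form, variance) and quad_form >= 0
```

Since this holds for every $x$, the eigenvalues of the (symmetric) matrix $\Sigma$ are all non-negative.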