I want to derive a law of large numbers (LLN) for conditionally independent random variables.
Let $(X_n)_{n \geq 1}$ be a sequence of random variables taking values in a finite set $B$. Assume that the conditional probabilities $P(X_n = b \mid X_1,\ldots,X_{n-1})$ are bounded below by $\nu_l(b)$ and above by $\nu_u(b)$, where the bounds do not depend on $n$ or on the conditioning event.
Show an LLN-type convergence result for $(X_n)_{n \geq 1}$.
Condition (1) is the only hypothesis needed for your desired result to hold, as it already encodes a notion of being "approximately i.i.d.".
[Also, I don't understand your condition (2a): if $B$ is a finite set of real numbers, then how could a $B$-valued random variable fail to have finite first and second moments?]
Proof. I will show that $$ P\bigg(\liminf_{n\rightarrow \infty} \frac{1}{n}\sum_{k=1}^n 1_{[X_k = b]} \geq \nu_l(b) \bigg)=1; $$ a similar argument will show that $$ P\bigg(\limsup_{n\rightarrow \infty} \frac{1}{n}\sum_{k=1}^n 1_{[X_k = b]} \leq \nu_u(b) \bigg)=1. $$
Fix $b \in B$. If $\nu_l(b)=0$ then the desired result is trivial, so assume that $\nu_l(b) > 0$.
The strategy is to derive the desired result from the classical strong law of large numbers, by constructing a sequence of events $(E_n)_{n \geq 1}$ to which the classical LLN can be applied [that is to say, $(1_{E_n})_{n \geq 1}$ is an i.i.d. sequence of random variables], with $E_n \subset \{X_n=b\}$ and $P(E_n)=\nu_l(b)$ for every $n$.
Let $(U_n)_{n \geq 1}$ be an i.i.d. sequence of random variables uniformly distributed in $[0,1]$ that is independent of $(X_n)_{n \geq 1}$. [We can assume without loss of generality that the underlying probability space is rich enough to admit this.] Define a sequence of events $(E_n)_{n \geq 1}$ recursively as follows: set $p_1 = P(X_1=b)$ and $$ E_1 = \{X_1 = b\} \cap \{U_1 p_1 \leq \nu_l(b)\}; $$ and for $n \geq 2$, writing $$ p_n(\alpha_1,\ldots,\alpha_{n-1}) = P(X_n = b \mid 1_{E_i}=\alpha_i \ \forall \, 1 \leq i \leq n-1) $$ for each $(\alpha_1,\ldots,\alpha_{n-1}) \in \{0,1\}^{n-1}$ with $P(1_{E_i}=\alpha_i \ \forall \, 1 \leq i \leq n-1) > 0$, set $$ E_n = \{X_n = b\} \cap \{U_n p_n(1_{E_1},\ldots,1_{E_{n-1}}) \leq \nu_l(b)\}. $$
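Since the construction is really an acceptance/rejection coupling, it may help to see it run. Below is a simulation sketch under assumptions of my own (not from the problem): a two-state Markov chain on $B=\{0,1\}$ with transition probabilities into state $1$ equal to $0.4$ and $0.6$, so that for $b=1$ we may take $\nu_l(1)=0.4$ and $\nu_u(1)=0.6$. For simplicity the thinning here uses the conditional probability of $\{X_n=b\}$ given the $X$-history (for a Markov chain, just the transition probability) rather than the $E$-history; this variant also satisfies $P(E_n \mid \text{past}) = \nu_l(b)$ exactly.

```python
import random

random.seed(0)

# Toy chain (my own example): B = {0, 1},
# P(X_n = 1 | X_{n-1} = x) is 0.4 if x = 0 and 0.6 if x = 1,
# so nu_l(1) = 0.4 and nu_u(1) = 0.6 for b = 1.
trans = {0: 0.4, 1: 0.6}
nu_l, b = 0.4, 1

N = 200_000
x = 0            # arbitrary initial state
count_b = 0      # occurrences of {X_n = b}
count_E = 0      # occurrences of E_n

for _ in range(N):
    q = trans[x]                       # q = P(X_n = b | X-history) >= nu_l
    x = 1 if random.random() < q else 0
    u = random.random()                # U_n, independent of the chain
    if x == b:
        count_b += 1
        # E_n = {X_n = b} and {U_n <= nu_l / q}: since q >= nu_l,
        # P(E_n | past) = q * (nu_l / q) = nu_l exactly.
        if u <= nu_l / q:
            count_E += 1

print(count_b / N)   # empirical frequency of {X_n = b}; lies in [nu_l, nu_u]
print(count_E / N)   # close to nu_l = 0.4
```

Running this, the frequency of $E_n$ concentrates at $\nu_l(b)$, while the frequency of $\{X_n=b\}$ stays between the two bounds, consistent with the liminf/limsup claims above.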
Obviously, by construction, $E_n \subset \{X_n=b\}$ for all $n$. It remains to show that the events $E_n$ are mutually independent with $P(E_n)=\nu_l(b)$ for all $n$.
Since $\nu_l(b) \leq P(X_1=b) = p_1$ and $U_1$ is independent of $X_1$, we have $$ P(E_1) = P(X_1=b)P\big(U_1 \leq \tfrac{\nu_l(b)}{p_1}\big) = p_1 \tfrac{\nu_l(b)}{p_1} = \nu_l(b). $$ Now fix any $n \geq 2$. Since $(U_1,\ldots,U_{n-1})$ is independent of $(X_1,\ldots,X_{n-1})$, we have $$ P(X_n=b \mid X_1,\ldots,X_{n-1},U_1,\ldots,U_{n-1}) = P(X_n=b \mid X_1,\ldots,X_{n-1}) \geq \nu_l(b), $$ and hence, since (by construction) $E_1,\ldots,E_{n-1} \in \sigma(X_1,\ldots,X_{n-1},U_1,\ldots,U_{n-1})$, we have that for every $(\alpha_1,\ldots,\alpha_{n-1}) \in \{0,1\}^{n-1}$ with $P(1_{E_i}=\alpha_i \ \forall \, 1 \leq i \leq n-1) > 0$, $$ p_n(\alpha_1,\ldots,\alpha_{n-1}) = P(X_n = b \mid 1_{E_i}=\alpha_i \ \forall \, 1 \leq i \leq n-1 ) \geq \nu_l(b), $$ and so, since $U_n$ is independent of $(X_1,\ldots,X_n,U_1,\ldots,U_{n-1})$ and therefore of $(E_1,\ldots,E_{n-1},X_n)$, \begin{align*} & \ P(E_n \mid 1_{E_i}=\alpha_i \ \forall \, 1 \leq i \leq n-1) \\ =& \ P(X_n=b \ \text{ and } \ U_np_n(1_{E_1},\ldots,1_{E_{n-1}}) \leq \nu_l(b) \mid 1_{E_i}=\alpha_i \ \forall \, 1 \leq i \leq n-1) \\ =& \ P(X_n=b \ \text{ and } \ U_np_n(\alpha_1,\ldots,\alpha_{n-1}) \leq \nu_l(b) \mid 1_{E_i}=\alpha_i \ \forall \, 1 \leq i \leq n-1) \\ =& \ P(X_n=b \mid 1_{E_i}=\alpha_i \ \forall \, 1 \leq i \leq n-1) \, P\big(U_np_n(\alpha_1,\ldots,\alpha_{n-1}) \leq \nu_l(b)\big) \\ =& \ p_n(\alpha_1,\ldots,\alpha_{n-1})\tfrac{\nu_l(b)}{p_n(\alpha_1,\ldots,\alpha_{n-1})} \\ =& \ \nu_l(b). \end{align*} Since this holds for every $(\alpha_1,\ldots,\alpha_{n-1})$ with $P(1_{E_i}=\alpha_i \ \forall \, 1 \leq i \leq n-1) > 0$, it follows that $E_n$ is independent of $(E_1,\ldots,E_{n-1})$ with $P(E_n)=\nu_l(b)$. By induction, the events $(E_n)_{n \geq 1}$ are mutually independent, so the classical strong LLN applied to $(1_{E_n})_{n \geq 1}$ gives $\frac{1}{n}\sum_{k=1}^n 1_{E_k} \to \nu_l(b)$ a.s.; since $E_k \subset \{X_k=b\}$ for every $k$, the claimed liminf bound follows. $\blacksquare$
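As an empirical sanity check of the independence computation (same hypothetical toy chain as above, and the same $X$-history thinning variant), one can verify that consecutive indicators behave like independent Bernoulli($\nu_l(b)$) variables, e.g. that $P(E_n \cap E_{n+1}) = \nu_l(b)^2$:

```python
import random

random.seed(1)

# Same hypothetical two-state chain as before: nu_l(1) = 0.4 for b = 1.
trans = {0: 0.4, 1: 0.6}
nu_l, b = 0.4, 1

N = 400_000
x = 0
prev_e = False
joint = 0        # count of times E_n and E_{n+1} both occur

for _ in range(N):
    q = trans[x]                       # P(X_n = b | X-history)
    x = 1 if random.random() < q else 0
    # E_n = {X_n = b} and {U_n <= nu_l / q}
    e = (x == b) and (random.random() <= nu_l / q)
    if e and prev_e:
        joint += 1
    prev_e = e

print(joint / N)   # close to nu_l**2 = 0.16, as independence predicts
```

The pair frequency concentrates at $\nu_l(b)^2 = 0.16$, which is what the product rule for the independent events $E_n$, $E_{n+1}$ requires.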