Constant random variables are always independent. Why? Let $R$ and $S$ be constant random variables. For sets $A$ and $B$, we have
$$P(R\in A,S\in B) = \boldsymbol{1}_{\{R\in A,S\in B\}} = \boldsymbol{1}_{\{R\in A\}}\boldsymbol{1}_{\{S\in B\}}=P(R\in A)P(S\in B).$$
In the first equality, I consider the probability $P(R\in A, S\in B)$. Since $R$ and $S$ are constants, whether $R\in A$ holds does not depend on the outcome $\omega$, and likewise for $S$. So either both $R\in A$ and $S\in B$ hold, in which case the event is sure and the probability is one, or at least one of them fails, in which case the probability is zero. Either way, the probability equals the indicator function of the event $\{R\in A\}\cap \{S\in B\}$.
For the second equality, I use that for any sets $C$ and $D$, it is true that
$$\boldsymbol{1}_{C\cap D} = \boldsymbol{1}_C\boldsymbol{1}_D.$$
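As a quick sanity check, the argument can be verified case by case: for constant random variables every probability in the factorization is an indicator value, $0$ or $1$. A minimal Python sketch, where the constants `r`, `s` and the candidate sets are illustrative choices, not part of the original argument:

```python
from itertools import product

# Constant random variables: R(omega) == r and S(omega) == s for every omega.
r, s = 3, 7

# Candidate sets A and B, chosen so that each constant is sometimes
# inside and sometimes outside the set.
candidate_sets = [{3}, {7}, {3, 7}, set()]

# Verify P(R in A, S in B) = P(R in A) * P(S in B) in every case.
for A, B in product(candidate_sets, repeat=2):
    p_joint = 1 if (r in A and s in B) else 0  # indicator of {R in A, S in B}
    p_R = 1 if r in A else 0                   # indicator of {R in A}
    p_S = 1 if s in B else 0                   # indicator of {S in B}
    assert p_joint == p_R * p_S

print("factorization holds in all", len(candidate_sets) ** 2, "cases")
```

Each probability here collapses to $0$ or $1$ exactly as in the proof, so the loop is just the identity $\boldsymbol{1}_{C\cap D} = \boldsymbol{1}_C\boldsymbol{1}_D$ checked over all four truth-value combinations.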