Suppose I have a variable $C$ and a set of $N$ variables $L_1, L_2, ..., L_N$ that are distributed according to:
$$\begin{aligned} C &\sim \mathcal N \left(\mu, \delta^2 \right) \\ L_i \mid C=c &\sim \mathcal N\left(c, \sigma_i^2 \right) \quad \text{for all } i \end{aligned}$$
That is, conditional on knowing the value of $C$, the $L_i$ are independently normally distributed with mean $c$ and variance $\sigma_i^2$. This implies that the vector $\vec L = (L_1, ..., L_N)^T$ is distributed according to an $N$-dimensional multivariate normal with mean vector $\vec \mu$ and covariance matrix $\mathbf \Sigma$.
When $N=1$, it is straightforward to see that $L_1 \sim \mathcal N \left(\mu, \delta^2 + \sigma_1^2 \right)$. It seems to me like this should imply that $\vec \mu = (\mu, ..., \mu)^T$, and that the diagonal elements of $\mathbf \Sigma$ should be $\delta^2 + \sigma_i^2$. The off-diagonal elements of the covariance matrix are, of course, not zero, since the variables $L_i$ are not independent when I'm not conditioning on $C$.
Is this true, though? Is there a way for me to prove this in a less tedious way than algebraically working everything out? What are the values of the off-diagonal elements?
You can think of the $L_i$ as sums of the form $L_i = C + X_i$, where the $X_i$ are independent random variables with $X_i\sim\mathcal N(0,\sigma_i^2)$ that are also independent of $C$. Then, for $\mathcal L = (L_1,\dots,L_N)^T$, $\mathcal C = (C,\dots,C)^T$ and $\mathcal X = (X_1,\dots,X_N)^T$, you have $\mathcal L = \mathcal C + \mathcal X$. Note that
$$ \mathcal X\sim \mathcal N\left( \begin{pmatrix} 0\\ \vdots\\ 0 \end{pmatrix} , \begin{pmatrix} \sigma_1^2 & & 0 \\ &\ddots& \\ 0& & \sigma_N^2 \end{pmatrix} \right), \qquad \mathcal C\sim \mathcal N\left( \begin{pmatrix} \mu\\ \vdots\\ \mu \end{pmatrix} , \delta^2 \begin{pmatrix} 1 &\cdots & 1 \\ \vdots&\ddots&\vdots \\ 1& \cdots & 1 \end{pmatrix} \right) $$
Since $\mathcal C$ and $\mathcal X$ are independent, their mean vectors and covariance matrices simply add, so
$$ \mathcal L\sim \mathcal N\left( \begin{pmatrix} \mu\\ \vdots\\ \mu \end{pmatrix} , \begin{pmatrix} \delta^2+\sigma_1^2 &\cdots & \delta^2\\ \vdots&\ddots&\vdots \\ \delta^2& \cdots & \delta^2+\sigma_N^2 \end{pmatrix} \right) $$
So your reasoning is correct, and the off-diagonal elements are all equal to $\delta^2$. You can also read this off directly: for $i \neq j$, $\operatorname{Cov}(L_i, L_j) = \operatorname{Cov}(C + X_i, C + X_j) = \operatorname{Var}(C) = \delta^2$, since the $X$ terms are independent of everything else.
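If you want a quick empirical check rather than more algebra, you can simulate the model and compare the sample covariance of $\vec L$ to the matrix above. The parameter values below ($\mu = 1$, $\delta = 2$, and the $\sigma_i$) are arbitrary choices for illustration:

```python
import numpy as np

# Illustrative (not canonical) parameter values.
rng = np.random.default_rng(0)
mu, delta = 1.0, 2.0
sigmas = np.array([0.5, 1.0, 1.5])  # sigma_1, sigma_2, sigma_3
n_samples = 2_000_000

# Sample C ~ N(mu, delta^2), then L_i = C + X_i with X_i ~ N(0, sigma_i^2).
C = rng.normal(mu, delta, size=n_samples)
L = C[:, None] + rng.normal(0.0, sigmas, size=(n_samples, len(sigmas)))

# Theoretical covariance: delta^2 everywhere, plus sigma_i^2 on the diagonal.
Sigma_theory = delta**2 * np.ones((3, 3)) + np.diag(sigmas**2)

print(np.round(L.mean(axis=0), 2))           # each entry close to mu = 1
print(np.round(np.cov(L, rowvar=False), 2))  # close to Sigma_theory
```

With a couple of million samples, the empirical covariance matches $\delta^2 + \sigma_i^2$ on the diagonal and $\delta^2$ off the diagonal to about two decimal places.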