Let ${\bf p} = [p_1, \dots, p_n]^T$ and ${\bf q} = [q_1, \dots, q_n]^T$ be mutually independent $\mathbb{C}^{n \times 1}$ vectors whose elements are i.i.d. zero-mean, unit-variance random variables (RVs), i.e., $\mathbb{E}\{|p_i|^2\} = \mathbb{E}\{|q_i|^2\} = 1$. Then, by the law of large numbers, $\frac{1}{n} {\bf p}^H {\bf q} \to 0$ as $n \to \infty$.
Now let ${\bf D} \in \mathbb{R}^{n \times n}$ be a diagonal matrix with distinct real diagonal entries $d_1, \dots, d_n$. Can I then use the law of large numbers to find the convergence of
\begin{equation} \frac{1}{n} {\bf p}^H {\bf D}{\bf q} \end{equation}
I am interpreting the question as follows: writing $X_n = {\bf p}^H D_n {\bf q}$, where $D_n = (d_{ij})_{i,j=1}^n$ is a deterministic real $n \times n$ matrix, does $X_n/n$ converge to $0$ as $n \to \infty$?
Before answering the question, I just want to point out that a routine calculation using independence of the $p_i$ and $q_j$ shows that $\mathbb EX_n=0$ and $ \mathbb E[|X_n|^2]=\sum_{i,j=1}^nd_{ij}^2=\|D_n\|_2^2 $ (the squared Frobenius norm). This shows that $n$ is the wrong scaling in the statement above; rather, it should be $\|D_n\|_2^2$ in general (note that $\|D_n\|_2^2=n$ when $D_n$ is the identity matrix). For example, when $d_{ij}=1$ for all $i,j$, the quantity $X_n/n$ has variance equal to $1$ for every $n$, and by the CLT it converges in distribution to the product of two independent complex normal RVs, so it does not converge to $0$. Indeed, in this case the variance of $X_n$ grows like $n^2$, not $n$, since we are summing $n^2$ uncorrelated terms $\bar p_i q_j$ (unlike the case of the identity matrix, where there are only $n$).
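As a quick Monte Carlo sanity check of these two moments (a sketch, not part of the argument: I take circularly symmetric complex Gaussian entries for convenience, though any zero-mean, unit-variance distribution would do, and `sample_X` and the trial count are my own choices):

```python
import numpy as np

rng = np.random.default_rng(0)

def sample_X(D, trials=20000):
    """Monte Carlo samples of X = p^H D q, with i.i.d. standard complex
    Gaussian entries (Re and Im ~ N(0, 1/2), so E|p_i|^2 = E|q_i|^2 = 1)."""
    n = D.shape[0]
    p = (rng.standard_normal((trials, n)) + 1j * rng.standard_normal((trials, n))) / np.sqrt(2)
    q = (rng.standard_normal((trials, n)) + 1j * rng.standard_normal((trials, n))) / np.sqrt(2)
    # X_t = sum_{i,j} conj(p_{t,i}) D_{i,j} q_{t,j}, one value per trial
    return np.einsum('ti,ij,tj->t', p.conj(), D, q)

n = 30
D = rng.standard_normal((n, n))               # a generic (non-diagonal) real matrix
X = sample_X(D)
print(abs(X.mean()))                          # close to 0
print(X.var() / np.linalg.norm(D, 'fro')**2)  # close to 1, i.e. E|X|^2 = ||D||_2^2
```

Replacing `D` with the all-ones matrix reproduces the $n^2$ variance growth mentioned above.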
Now let's answer the corrected version of the question. Claim: if $\|D_n\|_2\to\infty$, then $X_n/\|D_n\|_2^2\to 0$ in probability.
Proof: By Chebyshev's inequality, $$ \mathbb P\Bigl(\frac{|X_n|}{\|D_n\|_2^2}>\epsilon\Bigr)\leq \frac{\mathbb E[|X_n|^2]}{\epsilon^2\|D_n\|_2^4}= \frac{1}{\epsilon^2\|D_n\|_2^2}. $$ By the assumption $\|D_n\|_2\to\infty$, this tends to $0$ as $n\to\infty$ for every $\epsilon >0$, proving the claim.
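The claim can also be illustrated numerically. In the sketch below I pick, as my own illustrative example, diagonal $D_n$ with entries $d_k=\sqrt{k}$, so that $\|D_n\|_2^2 = n(n+1)/2 \to \infty$ and the theory predicts the RMS of $X_n/\|D_n\|_2^2$ equals $1/\|D_n\|_2$:

```python
import numpy as np

rng = np.random.default_rng(1)

def rms_normalized(d, trials=4000):
    """Root-mean-square of X_n / ||D_n||_2^2 for diagonal D_n = diag(d);
    the calculation above predicts this is exactly 1 / ||D_n||_2."""
    n = len(d)
    p = (rng.standard_normal((trials, n)) + 1j * rng.standard_normal((trials, n))) / np.sqrt(2)
    q = (rng.standard_normal((trials, n)) + 1j * rng.standard_normal((trials, n))) / np.sqrt(2)
    # X_t = sum_i conj(p_{t,i}) d_i q_{t,i}, one value per trial
    X = np.einsum('ti,i,ti->t', p.conj(), d, q)
    return np.sqrt(np.mean(np.abs(X)**2)) / np.sum(d**2)

rms = [rms_normalized(np.sqrt(np.arange(1.0, n + 1))) for n in (10, 100, 1000)]
print(rms)  # decreasing toward 0, roughly sqrt(2/(n(n+1)))
```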
Note that the assumption $\|D_n\|_2\to\infty$ is necessary. For example, if $d_{ij}=0$ whenever $i>N$ or $j>N$ for some fixed $N$, then $X_n$ stays random in the $n\to\infty$ limit (it equals $X_N$ for all $n\geq N$), so no convergence to $0$ is possible.
Note. When $D_n$ is assumed to be a diagonal matrix, it may still not be the case that $\|D_n\|_2\to\infty$, for example if $d_{n,n}=n^{-2}$, so that $\|D_n\|_2^2=\sum_{k=1}^n k^{-4}$ stays bounded. Likewise, it can also be the case that $\|D_n\|_2$ grows asymptotically faster than $n$. So assuming the matrix is diagonal does not reduce the number of assumptions one has to make.
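A one-line numerical check of the first point (using the known value $\sum_{k\geq 1} k^{-4}=\zeta(4)=\pi^4/90$):

```python
import numpy as np

# With d_{k,k} = k^{-2}, ||D_n||_2^2 = sum_{k=1}^n k^{-4} stays bounded:
# it converges to zeta(4) = pi^4/90, so the Chebyshev bound above never vanishes.
n = 10**6
norm_sq = np.cumsum(np.arange(1, n + 1, dtype=float)**-4)
print(norm_sq[-1], np.pi**4 / 90)  # the two values agree to many digits
```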