Given the definition of Fisher information matrix $$ I_{\theta} = \mathbb{E}_\theta[\nabla_{\theta}\log p_\theta(x)\nabla_{\theta}\log p_\theta(x)^T] $$
and the definition of KL divergence $$ D_{KL}(p_{\theta_1}||p_{\theta_2}) = \mathbb{E}_{\theta_1}\bigg[\log\frac{p_{\theta_1}}{p_{\theta_2}}\bigg] $$
I am seeking a proof of the identity $$ I_{i, j} = \frac{\partial}{\partial \theta_i}\frac{\partial}{\partial \theta_j}D_{KL}, $$ i.e. $ I = \nabla^2 D_{KL} $.
You are stating the identity with incorrect notation, which is probably why you cannot proceed with the proof. The correct statement of the identity appears in the Wikipedia article on the Fisher information matrix, namely,
$$ I_\theta = \left.\nabla_{\theta'}^2 D_\text{KL}(\theta \,\|\, \theta')\right|_{\theta'=\theta} \text{ (*)}, $$ i.e., the Fisher information matrix equals the Hessian of the function $\theta' \mapsto D_\text{KL}(\theta\,\|\,\theta')$, evaluated at $\theta'=\theta$,
where $$ D_\text{KL}(\theta\|\theta') \triangleq \int_x p_\theta(x)\log \frac{p_\theta(x)}{p_{\theta'}(x)}dx. $$
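Before the proof, it may help to see the identity (*) hold numerically. A minimal sketch for the Bernoulli family (my choice of example, not from the question), comparing the closed-form Fisher information $1/(\theta(1-\theta))$ against a finite-difference Hessian of $\theta' \mapsto D_\text{KL}(\theta\|\theta')$ at $\theta'=\theta$:

```python
import math

def kl_bernoulli(t, tp):
    """D_KL( Bern(t) || Bern(tp) ) for parameters in (0, 1)."""
    return t * math.log(t / tp) + (1 - t) * math.log((1 - t) / (1 - tp))

def fisher_bernoulli(t):
    """Closed-form Fisher information of Bernoulli(t): 1 / (t (1 - t))."""
    return 1.0 / (t * (1.0 - t))

theta, h = 0.3, 1e-4
# Central second difference of tp -> D_KL(theta || tp), evaluated at tp = theta.
hessian = (kl_bernoulli(theta, theta + h)
           - 2 * kl_bernoulli(theta, theta)
           + kl_bernoulli(theta, theta - h)) / h**2

print(hessian, fisher_bernoulli(theta))  # both approximately 4.7619
```

The two printed values agree up to the $O(h^2)$ truncation error of the central difference, as the identity predicts.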
When certain "regularity" conditions hold (they permit exchanging the order of differentiation and integration), the Fisher information matrix can be equivalently expressed as (see the same Wikipedia article) $$ I_\theta = -\int_x p_\theta(x)\left(\nabla_\theta^2 \log p_\theta(x) \right) dx, $$
and it is straightforward to check that this equals the right-hand side of (*): in $D_\text{KL}(\theta\|\theta')$ only the term $-\int_x p_\theta(x)\log p_{\theta'}(x)\,dx$ depends on $\theta'$, so $$ \nabla_{\theta'}^2 D_\text{KL}(\theta\|\theta') = -\int_x p_\theta(x)\left(\nabla_{\theta'}^2 \log p_{\theta'}(x)\right) dx, $$ which at $\theta'=\theta$ is exactly the expression above.
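For completeness, the equivalence of this expression with the outer-product definition of $I_\theta$ follows from differentiating the normalization constraint $\int_x p_\theta(x)\,dx = 1$ twice; a short derivation, again assuming derivatives and integrals may be interchanged:

$$
\begin{aligned}
\nabla_\theta^2 \log p_\theta(x) &= \frac{\nabla_\theta^2 p_\theta(x)}{p_\theta(x)} - \frac{\nabla_\theta p_\theta(x)\,\nabla_\theta p_\theta(x)^T}{p_\theta(x)^2}, \\
-\int_x p_\theta(x)\,\nabla_\theta^2 \log p_\theta(x)\,dx &= -\int_x \nabla_\theta^2 p_\theta(x)\,dx + \int_x p_\theta(x)\,\nabla_\theta \log p_\theta(x)\,\nabla_\theta \log p_\theta(x)^T\,dx \\
&= -\nabla_\theta^2 \underbrace{\int_x p_\theta(x)\,dx}_{=\,1} + I_\theta = I_\theta,
\end{aligned}
$$

since the Hessian of the constant $1$ vanishes.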