I am interested in the null space of the operator $\mathrm{D_x} \mathrm{D_y}$ on the space $\mathcal{D}'(\mathbb{R}^2)$ of Schwartz distributions (generalized functions), i.e. $$\{ f \in \mathcal{D}'(\mathbb{R}^2) : \ \mathrm{D_x} \mathrm{D_y}f = 0\}.$$
Of course, if $f(x,y)$ (with some abuse of notation, since $f$ is not necessarily defined pointwise) depends only on $x$ or only on $y$, it belongs to the null space. The same is true for distributions of the form $f(x,y) = h(x) + g(y)$.
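This easy direction can be sanity-checked symbolically, e.g. with sympy (a minimal sketch; $h$ and $g$ are arbitrary placeholder functions of my own choosing):

```python
# Sanity check: any smooth f(x, y) = h(x) + g(y) is killed by D_x D_y.
# h and g are arbitrary placeholder functions, not anything specific.
import sympy as sp

x, y = sp.symbols('x y')
f = sp.Function('h')(x) + sp.Function('g')(y)

assert sp.diff(f, x, y) == 0  # the mixed derivative vanishes identically
```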
Is the converse of that result true?
For two distributions $S\in \mathcal{D}'(\mathbb{R}^k)$ and $T\in \mathcal{D}'(\mathbb{R}^n)$, we define the distribution $S\otimes T \in \mathcal{D}'(\mathbb{R}^{k+n})$ by
$$(S\otimes T)[\varphi] := S\left[x \mapsto T[\varphi(x,\,\cdot\,)]\right].$$
Verifying that this indeed defines a distribution, and that also $(S\otimes T)[\varphi] = T\left[y \mapsto S[\varphi(\,\cdot\,,y)]\right]$, is left to the reader.
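For regular distributions given by integrable densities, the equality of the two iterated evaluations is just Fubini's theorem; here is a minimal numerical sketch (the densities $s$, $t$ and the test function $\varphi$ are arbitrary choices of mine):

```python
# For S = s(x)dx and T = t(y)dy, both iterated evaluations of (S ⊗ T)[φ]
# compute the same double integral (Fubini).  s, t, φ are arbitrary choices.
import numpy as np

xs = np.linspace(-5, 5, 1001)
dx = xs[1] - xs[0]

s = np.exp(-xs**2)                            # density of S
t = np.cos(xs) * np.exp(-xs**2)               # density of T
X, Y = np.meshgrid(xs, xs, indexing='ij')
phi = np.exp(-(X**2 + Y**2)) * (1 + X * Y)    # a test function φ(x, y)

# S[x ↦ T[φ(x, ·)]]: integrate in y first, then in x
val1 = np.sum(s * (np.sum(t[None, :] * phi, axis=1) * dx)) * dx

# T[y ↦ S[φ(·, y)]]: integrate in x first, then in y
val2 = np.sum(t * (np.sum(s[:, None] * phi, axis=0) * dx)) * dx

assert abs(val1 - val2) < 1e-10
```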
Then, for $T\in \mathcal{D}'(\mathbb{R}^m)$, we have $\frac{\partial}{\partial x_1} T = 0$ if and only if there is an $S\in \mathcal{D}'(\mathbb{R}^{m-1})$ with $T = I\otimes S$, where $I$ is the integration functional $\varphi\mapsto \int_{\mathbb{R}} \varphi(t)\,dt$. Of course the analogous decomposition holds for distributions whose other partial derivatives vanish, but the notation $S\otimes T$ fits only the cases of the first or last coordinate. Since we work on $\mathbb{R}^2$ here, that suffices.
To see this, pick any $\eta\in\mathcal{D}(\mathbb{R})$ with $I[\eta] = 1$, and define $S[\psi] := T[\eta\otimes \psi]$, where $(\eta\otimes\psi)(x,y) = \eta(x)\cdot \psi(y)$. Once again, verifying that this defines a distribution is left to the reader. For $\varphi\in \mathcal{D}(\mathbb{R}^m)$, define $\psi(y) = \int_{\mathbb{R}} \varphi(t,y)\,dt$. Then $\psi \in \mathcal{D}(\mathbb{R}^{m-1})$, and since
$$\int_{\mathbb{R}} \varphi(t,y) - \eta(t)\psi(y)\,dt = 0$$
for all $y\in \mathbb{R}^{m-1}$,
$$\Phi(x,y) := \int_{-\infty}^x \varphi(t,y) - \eta(t)\psi(y)\,dt$$
defines an element of $\mathcal{D}(\mathbb{R}^m)$, and we have $\varphi - \eta\otimes\psi = \frac{\partial}{\partial x_1}\Phi$. Therefore
$$\begin{align} (I\otimes S)[\varphi] &= S\left[y\mapsto \int_{\mathbb{R}}\varphi(t,y)\,dt\right]\\ &= T[\eta\otimes\psi]\\ &= T\left[\varphi - \frac{\partial}{\partial x_1}\Phi\right]\\ &= T[\varphi] + \frac{\partial}{\partial x_1}T[\Phi]\\ &= T[\varphi], \end{align}$$
where the last equality uses $\frac{\partial}{\partial x_1}T = 0$. So indeed $T = I\otimes S$.
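The lemma can be checked numerically for a regular distribution $T$ with kernel $f(x,y) = g(y)$, so that $\frac{\partial}{\partial x}T = 0$ (a sketch; the bump $\eta$, the function $g$, and the test function $\varphi$ are arbitrary choices of mine):

```python
# Check the lemma for a regular distribution T with kernel f(x, y) = g(y),
# so that ∂T/∂x = 0.  η, g and the test function φ are arbitrary choices.
import numpy as np

xs = np.linspace(-8, 8, 801)
dx = xs[1] - xs[0]
X, Y = np.meshgrid(xs, xs, indexing='ij')

F = np.cos(Y) * np.exp(-0.2 * Y**2)          # f(x, y) = g(y)

eta = np.exp(-xs**2)
eta /= np.sum(eta) * dx                      # normalize so that I[η] = 1

def T(phi): return np.sum(F * phi) * dx * dx         # T[φ] = ∬ f φ
def S(psi): return T(eta[:, None] * psi[None, :])    # S[ψ] = T[η ⊗ ψ]

phi = np.exp(-(X**2 + Y**2)) * (1 + X)       # a test function
psi = np.sum(phi, axis=0) * dx               # ψ(y) = ∫ φ(t, y) dt

# Φ(x, y) = ∫_{-∞}^x [φ(t, y) − η(t)ψ(y)] dt vanishes for large x,
# so it is again a test function:
Phi = np.cumsum(phi - eta[:, None] * psi[None, :], axis=0) * dx
assert np.max(np.abs(Phi[-1, :])) < 1e-10

# ... and T = I ⊗ S, i.e. T[φ] = S[y ↦ ∫ φ(t, y) dt]:
assert abs(T(phi) - S(psi)) < 1e-10
```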
So, coming to the question, if $\frac{\partial}{\partial x} \frac{\partial}{\partial y} T = 0$, then $\frac{\partial}{\partial y}T = I\otimes S$ for some $S\in \mathcal{D}'(\mathbb{R})$. Now, for $\psi \in \mathcal{D}(\mathbb{R})$, we have
$$S[\psi] = (I\otimes S)[\eta\otimes\psi] = \frac{\partial}{\partial y}T[\eta\otimes\psi] = -T[\eta \otimes \psi'],$$
so $S = R'$, where $R[\psi] = T[\eta\otimes\psi]$.
That means $\frac{\partial}{\partial y} (T - I\otimes R) = 0$, since $\frac{\partial}{\partial y}(I\otimes R) = I\otimes R' = I\otimes S = \frac{\partial}{\partial y}T$. By the analogue of the lemma for the last coordinate, there is a $P\in \mathcal{D}'(\mathbb{R})$ with $T - I\otimes R = P\otimes I$, or
$$T = I \otimes R + P \otimes I,$$
i.e. every $T$ with $\frac{\partial}{\partial x}\frac{\partial}{\partial y}T = 0$ is of the form $I\otimes R + P\otimes I$: a sum of a distribution depending only on $x$ (namely $P\otimes I$) and one depending only on $y$ (namely $I\otimes R$).
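As a final sanity check, the decomposition can be verified numerically for the regular distribution with kernel $f(x,y) = h(x) + g(y)$ (a sketch; $h$, $g$, $\eta$ and the test function are arbitrary choices of mine):

```python
# Check the decomposition T = I⊗R + P⊗I for the regular distribution
# with kernel f(x, y) = h(x) + g(y).  h, g, η, φ are arbitrary choices.
import numpy as np

xs = np.linspace(-8, 8, 801)
dx = xs[1] - xs[0]
X, Y = np.meshgrid(xs, xs, indexing='ij')

F = np.sin(X) * np.exp(-0.1 * X**2) + Y * np.exp(-0.1 * Y**2)  # h(x) + g(y)

eta = np.exp(-xs**2)
eta /= np.sum(eta) * dx                               # I[η] = 1

def T(phi): return np.sum(F * phi) * dx * dx          # T[φ] = ∬ f φ
def R(psi): return T(eta[:, None] * psi[None, :])     # R[ψ] = T[η ⊗ ψ]
def IoR(phi): return R(np.sum(phi, axis=0) * dx)      # (I⊗R)[φ]
def P(chi):                                           # P[χ] = (T − I⊗R)[χ ⊗ η]
    chi_eta = chi[:, None] * eta[None, :]
    return T(chi_eta) - IoR(chi_eta)
def PoI(phi): return P(np.sum(phi, axis=1) * dx)      # (P⊗I)[φ]

phi = np.exp(-((X - 0.5)**2 + (Y + 1.0)**2))          # a test function

assert abs(T(phi) - (IoR(phi) + PoI(phi))) < 1e-8
```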
Yes, the converse of that result is true.