For an image $I$, the first-order derivatives can be computed with several operators, for example the Sobel kernel $$K_{sobel} = \left[ \begin{array}{ccc} -1 &0 &1 \\ -2 &0 &2 \\ -1 &0 &1 \end{array}\right]$$ $$I_{x} = I * K_{sobel},\qquad I_{y} = I * K_{sobel}^{T}.$$ To compute the second-order derivatives $(I_{xx}, I_{yy}, I_{xy})$, I can apply $K_{sobel}$ twice to $I$. However, I wonder whether there are kernels that can be applied just once to get the second-order derivatives.
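For concreteness, here is how I compute the first-order derivatives (a sketch using `scipy.signal.convolve2d`; the random array just stands in for a real image):

```python
import numpy as np
from scipy.signal import convolve2d

K_sobel = np.array([[-1, 0, 1],
                    [-2, 0, 2],
                    [-1, 0, 1]])

I = np.random.rand(64, 64)            # stand-in for a real image
I_x = convolve2d(I, K_sobel, mode="same")    # horizontal derivative
I_y = convolve2d(I, K_sobel.T, mode="same")  # vertical derivative
```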
Here is what I tried: I made up two kernels, $$K_{xx} = \left[ \begin{array}{ccc} 0 &0 &0 \\ 2 &-4 &2 \\ 0 &0 &0 \end{array}\right]$$ $$K_{yy} = \left[ \begin{array}{ccc} 0 &2 &0 \\ 0 &-4 &0 \\ 0 &2 &0 \end{array}\right]$$ and set $I_{xx} = I * K_{xx},\ I_{yy} = I * K_{yy}$. The results look very close to those from applying the Sobel kernel twice, but I have failed to find a good kernel for $I_{xy}$. Any idea is appreciated.
Convolution ($\star$) is associative, so if $I_{xx}=(I\star K_{\text{sobel}})\star K_{\text{sobel}}$, then $I_{xx}=I\star (K_{\text{sobel}}\star K_{\text{sobel}})$. This means that to get a single kernel for $I_{xx}$ you just convolve $K_{\text{sobel}}$ with itself. Practically, you make a tiny image $I_K$ out of $K_{\text{sobel}}$, pad it with two rows and columns of zeros all around, and apply $K_{\text{sobel}}$ to $I_K$; the result is a $5\times 5$ kernel.
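As a sketch of the idea (the zero-padding step is what `mode="full"` does for you in `scipy.signal.convolve2d`):

```python
import numpy as np
from scipy.signal import convolve2d

K_sobel = np.array([[-1, 0, 1],
                    [-2, 0, 2],
                    [-1, 0, 1]])

# Convolving the kernel with itself yields a 5x5 kernel that
# computes I_xx in a single pass.
K_xx = convolve2d(K_sobel, K_sobel, mode="full")

# Sanity check on a random image: one pass with K_xx equals
# two passes with K_sobel (associativity of full convolution).
I = np.random.rand(32, 32)
once = convolve2d(I, K_xx, mode="full")
twice = convolve2d(convolve2d(I, K_sobel, mode="full"), K_sobel, mode="full")
assert np.allclose(once, twice)
```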
To get the kernel for $I_{xy}$, you instead apply $K_{\text{sobel}}^T$ to $I_K$.
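The same sanity check works for the mixed derivative: convolving the x-kernel with the y-kernel (its transpose) gives a single-pass $I_{xy}$ kernel.

```python
import numpy as np
from scipy.signal import convolve2d

K_sobel = np.array([[-1, 0, 1],
                    [-2, 0, 2],
                    [-1, 0, 1]])

# Single-pass mixed-derivative kernel: x-Sobel convolved with y-Sobel.
K_xy = convolve2d(K_sobel, K_sobel.T, mode="full")

# Check: one pass with K_xy equals an x-pass followed by a y-pass.
I = np.random.rand(32, 32)
once = convolve2d(I, K_xy, mode="full")
twice = convolve2d(convolve2d(I, K_sobel, mode="full"), K_sobel.T, mode="full")
assert np.allclose(once, twice)
```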
EDIT: Another thing that you can try is the discrete Laplace kernel:
$$\begin{pmatrix} 0 & 1 & 0 \\ 1 & -4 & 1 \\ 0 & 1 & 0 \end{pmatrix}$$
Its 1-D version is essentially the kernels you came up with, up to a factor of 2:
$$\begin{pmatrix} 1 & -2 & 1 \end{pmatrix}$$
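Note that the 2-D Laplace kernel estimates the sum $I_{xx}+I_{yy}$ rather than the two derivatives separately; it is literally the 1-D second-difference kernel applied along $x$ plus the same kernel applied along $y$:

```python
import numpy as np

laplace = np.array([[0,  1, 0],
                    [1, -4, 1],
                    [0,  1, 0]])

# 1-D second difference [1, -2, 1] embedded along the x direction...
Kx = np.array([[0,  0, 0],
               [1, -2, 1],
               [0,  0, 0]])
# ...and along the y direction.
Ky = Kx.T

# Their sum is exactly the discrete Laplace kernel.
assert np.array_equal(laplace, Kx + Ky)
```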