Assume we have a 1-dimensional RBF kernel, $k(x,x')$ with $x,x' \in \mathbb{R}$:
$$k(x,x') = \sigma_f^2 \exp \left(-\frac{(x-x')^2}{2l^2}\right)$$
where $\sigma_f^2$ and $l$ are the hyperparameters, assumed to be constants.
Given $N$ scalar inputs $x_i \in \mathbb{R}$ and $N$ scalar observations $y_i \in \mathbb{R}$, we form the dataset $D_N = \{ X, Y \}$, where $X \in \mathbb{R}^N$ and $Y \in \mathbb{R}^N$. The posterior mean and variance of a Gaussian process at a test query point $x_q \in \mathbb{R}$ are
$$\mu(x_q) = k(X, x_q)^{\top} \bar{K}^{-1} Y$$
$$\sigma^2(x_q) = k(x_q, x_q) - k(X, x_q)^{\top} \bar{K}^{-1} k(X, x_q),$$
where $\bar{K} \in \mathbb{R}^{N \times N}$ is the covariance matrix regularized with additive noise.
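For concreteness, here is a small NumPy sketch of these posterior equations. The hyperparameter values ($\sigma_f = 1$, $l = 1$, noise $\sigma_n = 0.1$) and the synthetic data are arbitrary illustrative choices, not part of the question:

```python
import numpy as np

def rbf(a, b, sigma_f=1.0, ell=1.0):
    # k(x, x') = sigma_f^2 * exp(-(x - x')^2 / (2 l^2)), evaluated pairwise
    return sigma_f**2 * np.exp(-(a[:, None] - b[None, :])**2 / (2 * ell**2))

def gp_posterior(X, Y, x_q, sigma_n=0.1, sigma_f=1.0, ell=1.0):
    # K_bar = K(X, X) + sigma_n^2 I  (covariance regularized with additive noise)
    K_bar = rbf(X, X, sigma_f, ell) + sigma_n**2 * np.eye(len(X))
    k_q = rbf(X, np.atleast_1d(x_q), sigma_f, ell)[:, 0]   # k(X, x_q)
    mu = k_q @ np.linalg.solve(K_bar, Y)                   # posterior mean
    var = sigma_f**2 - k_q @ np.linalg.solve(K_bar, k_q)   # posterior variance
    return mu, var

# synthetic data for illustration: noiseless samples of sin(x)
X = np.linspace(-3, 3, 20)
Y = np.sin(X)
mu, var = gp_posterior(X, Y, 0.5)
```

Note that $k(x_q, x_q) = \sigma_f^2$ for the RBF kernel, which is why the prior-variance term above is just `sigma_f**2`.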
I am interested in proving that the derivatives of the posterior mean and variance with respect to $x_q$ cannot be the same, or, if they can be, in characterizing the conditions under which they are. The derivatives are given as follows,
$$\frac{\partial \mu(x_q)}{\partial x_q} = \frac{ \partial k(X, x_q) } {\partial x_q}^{\top} \bar{K}^{-1} Y$$
$$\frac{\partial \sigma^2(x_q)}{\partial x_q} = -2 \frac{\partial k(X, x_q)}{\partial x_q}^{\top} \bar{K}^{-1} k(X, x_q),$$
where the $k(x_q, x_q) = \sigma_f^2$ term in the variance contributes nothing because it is constant. For the derivatives to not be equal, i.e., $\frac{\partial \mu(x_q)}{\partial x_q} \neq \frac{\partial \sigma^2(x_q)}{\partial x_q}$, we need
$$ \frac{ \partial k(X, x_q) } {\partial x_q}^{\top} \bar{K}^{-1} Y \neq \frac{\partial k(X, x_q)}{\partial x_q}^{\top} \bar{K}^{-1} \big( -2 k(X, x_q) \big).$$
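As a sanity check of the two derivative expressions (again with arbitrary illustrative hyperparameters and synthetic data), one can compare them against central finite differences. For the RBF kernel, the closed-form derivative of the kernel vector is $\partial k(X, x_q)/\partial x_q = k(X, x_q) \odot (X - x_q)/l^2$:

```python
import numpy as np

sigma_f, ell, sigma_n = 1.0, 1.0, 0.1

def k_vec(X, x_q):
    # k(X, x_q) for the RBF kernel
    return sigma_f**2 * np.exp(-(X - x_q)**2 / (2 * ell**2))

def dk_vec(X, x_q):
    # closed form: d k(X, x_q) / d x_q = k(X, x_q) * (X - x_q) / l^2
    return k_vec(X, x_q) * (X - x_q) / ell**2

# synthetic data for illustration
X = np.linspace(-3, 3, 20)
Y = np.sin(X)
K_bar = sigma_f**2 * np.exp(-(X[:, None] - X[None, :])**2 / (2 * ell**2)) \
        + sigma_n**2 * np.eye(len(X))

x_q, h = 0.5, 1e-6

# analytic derivatives of the posterior mean and variance
dmu = dk_vec(X, x_q) @ np.linalg.solve(K_bar, Y)
dvar = -2 * dk_vec(X, x_q) @ np.linalg.solve(K_bar, k_vec(X, x_q))

# central finite-difference approximations
mu = lambda xq: k_vec(X, xq) @ np.linalg.solve(K_bar, Y)
var = lambda xq: sigma_f**2 - k_vec(X, xq) @ np.linalg.solve(K_bar, k_vec(X, xq))
dmu_fd = (mu(x_q + h) - mu(x_q - h)) / (2 * h)
dvar_fd = (var(x_q + h) - var(x_q - h)) / (2 * h)
```

The analytic and finite-difference values should agree to high precision; whether `dmu` and `dvar` themselves coincide at a given $x_q$ is exactly the question being asked.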
Firstly, can it be shown that $Y \neq -2 k(X, x_q)$? If so, is showing this sufficient to claim that the posterior mean and variance derivatives are not the same?
The kernel vector has only non-negative entries, so every entry of $-2 k(X, x_q)$ is non-positive, while the observation vector can contain arbitrary real values; for the two vectors to be equal, every entry would have to match, which already fails whenever some $y_i > 0$. Practically, the chances of equality are slim, but is there a formal way of expressing that mathematically?
Can someone point me in the right direction on how to prove they cannot be equal?