First of all, let me give a basic definition and the problem that we want to solve [both taken from Armin Iske's Book "Approximation Theory and Algorithms for Data Analysis"]:
Problem 8.1. Given interpolation points $X = \{x_1, \dots, x_n\}\subset \Omega$, where $\Omega\subset \mathbb R^{d}$ for $d>1$, and function values $f_X\in \mathbb R^{n}$, find an interpolant $s\in \mathcal C(\Omega)$ with $s_X = f_X$, i.e. $s$ satisfies the interpolation conditions $$s(x_j) = f(x_j) \quad \text{for all $1\leq j\leq n$}.$$
Definition 8.2. [taken from Armin Iske's Book "Approximation Theory and Algorithms for Data Analysis"]:
A continuous and symmetric function $K: \mathbb R^{d}\times \mathbb R^{d} \rightarrow \mathbb R$ is said to be positive definite on $\mathbb R^{d}$, $K\in \textbf{PD}_d$, if for any set of pairwise distinct interpolation points $X = \left\{ x_1, \dots, x_n\right\} \subset \mathbb R^{d}$, $n\in \mathbb N$, the matrix $$A_{K, X} = \left( K\left(x_k, x_j\right) \right)_{1\leq j, k\leq n}\in\mathbb R^{n\times n}$$ is symmetric and positive definite.
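To illustrate Definition 8.2 (this is not part of the problem, just a sanity check with an assumed example kernel): the Gaussian kernel $K(x,y) = e^{-\Vert x-y\Vert_2^2}$ is a standard member of $\textbf{PD}_d$, and for pairwise distinct points its matrix $A_{K,X}$ is symmetric with strictly positive eigenvalues:

```python
import numpy as np

# Illustrative example, assuming the Gaussian kernel K(x, y) = exp(-||x - y||^2),
# a standard element of PD_d. For random pairwise-distinct points, the matrix
# A_{K,X} should be symmetric positive definite.
rng = np.random.default_rng(0)
X = rng.standard_normal((8, 3))  # n = 8 points in R^3

# Pairwise squared distances and the kernel matrix A_{K,X}
D2 = ((X[:, None, :] - X[None, :, :]) ** 2).sum(axis=-1)
A = np.exp(-D2)

assert np.allclose(A, A.T)              # symmetric
assert np.linalg.eigvalsh(A).min() > 0  # all eigenvalues strictly positive
```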
Now, here comes the problem that I want to solve:
Let $\Phi\left(x-y\right) = K(x,y)$ be positive definite, $K\in \textbf{PD}_d$, where $\Phi:\mathbb R^{d}\rightarrow \mathbb R$ is even and satisfies, for some $\gamma > 0$, the growth condition $$\left\vert \Phi\left(0\right) - \Phi\left( x\right) \right\vert \leq C \left\vert\left\vert x\right\vert\right\vert_{2}^{\gamma} \quad \forall x\in B_{R}\left( 0\right)$$ on some ball $B_{R}\left( 0\right)$ around $0$ with radius $R > 0$ and constant $C > 0$. Prove that no positive definite kernel $K\in \textbf{PD}_d$ satisfies the growth condition for $\gamma > 2$.
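For intuition on why $\gamma = 2$ is the threshold (illustrative check, assuming the Gaussian $\Phi(x) = e^{-\Vert x\Vert_2^2}$, which is positive definite): the Gaussian satisfies the growth condition with $\gamma = 2$ and $C = 1$, because $1 - e^{-t} \leq t$ for $t \geq 0$. The claim to prove is that no positive definite kernel can do better than $\gamma = 2$:

```python
import numpy as np

# Assumed example: Phi(x) = exp(-||x||^2), the Gaussian kernel.
# |Phi(0) - Phi(x)| = 1 - exp(-||x||^2) <= ||x||^2, i.e. gamma = 2, C = 1.
r = np.linspace(0.0, 1.0, 1001)       # radii ||x||_2 inside B_1(0)
gap = 1.0 - np.exp(-r**2)             # |Phi(0) - Phi(x)|
assert np.all(gap <= r**2 + 1e-15)    # growth condition with gamma = 2 holds
```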
This is a homework we got in a class.
EDIT: This is the definition of (total) differentiability I know:
Let $U\subseteq \mathbb R^n$ be open and $F: U\rightarrow \mathbb R^m$.
(i) The function $F$ is called (totally) differentiable at a point $x\in U$ if there exists a linear map $A: \mathbb R^n\rightarrow\mathbb R^m$ such that $$\lim_{\xi\in\mathbb R^n\backslash\{0\},\ \xi\rightarrow 0}\frac{\|F(x+\xi)-F(x)-A(\xi)\|}{\|\xi\|} = 0.$$
(ii) The function $F: U\rightarrow\mathbb R^m$ is called differentiable if $F$ is differentiable at every point $x\in U$.
For every symmetric positive definite kernel $K$ on a set $X$, there exists a unique Hilbert space $\mathcal{H}$ of functions on $X$ for which $K$ is a reproducing kernel. $\mathcal{H}$ is called the native reproducing kernel Hilbert space of $K$, or native space for short; see the Wikipedia article on reproducing kernel Hilbert spaces, in particular the Moore–Aronszajn theorem.
The idea is to first show a corresponding (Hölder) condition for functions from the native space $\mathcal{H}$, then apply the mean value theorem to show that these functions are constant for $\alpha > 2$, and finally conclude that $K$ cannot be positive definite in this case.
Let $\mathcal{H}$ be the Hilbert space of functions $f: X \rightarrow \mathbb{R}$ with associated inner product $\langle \cdot, \cdot \rangle_{\mathcal{H}}$ and reproducing property $$ \langle K(\cdot, x), f \rangle_{\mathcal{H}} = f(x) \qquad \text{for all $f \in \mathcal{H}, x \in X$.} $$
First, suppose that $\Phi(x - y) = K(x, y)$ satisfies the condition
\begin{equation} \newcommand{\norm}[1]{\left\lVert#1\right\rVert} |\Phi(0) - \Phi(x)| \leq C \norm{x}_{2}^{\alpha} \qquad \text{for all } x \in B_{R}(0) \tag{1} \label{condition} \end{equation}
Moreover, assume the kernel is normalized, $$ \Phi(0) = 1. $$ This is no loss of generality: $\Phi(0) = K(x, x) > 0$ for a positive definite kernel (take $n = 1$ in Definition 8.2), and rescaling $K$ by $1/\Phi(0)$ preserves positive definiteness as well as Condition \eqref{condition} (with constant $C/\Phi(0)$).
Then, for every $f \in \mathcal{H}$, we have that \begin{align*} |f(x) - f(y)|^{2} &= |\langle f, \Phi(\cdot - x) - \Phi(\cdot - y) \rangle_{K}|^{2} \\ \tag{By Cauchy–Schwarz} &\leq \norm{f}_{K}^{2} \norm{\Phi(\cdot - x) - \Phi(\cdot - y)}_{K}^{2} \\ &= \norm{f}_{K}^{2} \left( \norm{\Phi(\cdot - x)}_{K}^{2} - 2 \langle \Phi(\cdot - x), \Phi(\cdot - y) \rangle_{K} + \norm{\Phi(\cdot - y)}_{K}^{2} \right) \\ \tag{Using the normalization $\Phi(0) = 1$} &= \norm{f}_{K}^{2} \left( 1 - 2 \Phi(x - y) + 1 \right)\\ &= \norm{f}_{K}^{2} 2 \left(\Phi(0) - \Phi(x - y) \right) \\ \tag{Using Condition \eqref{condition}} &\leq \norm{f}_{K}^{2} 2 C \norm{x - y}_{2}^{\alpha} \end{align*}
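The key identity in the middle of this chain, $\norm{\Phi(\cdot - x) - \Phi(\cdot - y)}_{K}^{2} = 2(\Phi(0) - \Phi(x - y))$, only uses kernel evaluations, so it can be checked numerically (illustrative choice: the normalized Gaussian $\Phi(x) = e^{-\Vert x\Vert_2^2}$):

```python
import numpy as np

# Check: ||Phi(. - x) - Phi(. - y)||_K^2 = K(x,x) - 2 K(x,y) + K(y,y)
#                                        = 2 (Phi(0) - Phi(x - y)),
# using the assumed example Phi(v) = exp(-||v||^2) with Phi(0) = 1.
def phi(v):
    return np.exp(-np.dot(v, v))

rng = np.random.default_rng(1)
x, y = rng.standard_normal(3), rng.standard_normal(3)

lhs = phi(x - x) - 2 * phi(x - y) + phi(y - y)  # expand via reproducing property
rhs = 2 * (phi(np.zeros(3)) - phi(x - y))
assert np.isclose(lhs, rhs)
```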
Meaning that every $f \in \mathcal{H}$ is Hölder continuous with exponent $\alpha/2$, $$ |f(x) - f(y)| \leq D \norm{x - y}_{2}^{\alpha/2} \qquad \text{for } \norm{x - y}_{2} < R, $$ with the constant $D = \norm{f}_{K} \sqrt{2C}$ (taking square roots in the estimate above; note that the bound is local, since Condition \eqref{condition} only holds on $B_{R}(0)$).
Second, we show that for $\alpha > 2$, the native space $\mathcal{H}$ contains only constant functions $f$.
If the exponent $\alpha/2 > 1$, we can write $$ \frac{|f(x) - f(y)|}{\norm{x - y}_{2}} \leq D \norm{x - y}_{2}^{\alpha/2 - 1} $$
and taking the limit as $y \rightarrow x$ $$ \lim_{y \rightarrow x} \frac{|f(x) - f(y)|}{\norm{x - y}_{2}} \leq \lim_{y \rightarrow x} D \norm{x - y}_{2}^{\alpha/2 - 1} = 0 $$
Update: Recall that $f: X \rightarrow \mathbb{R}$ is differentiable at $x \in X$ if there exists a bounded linear operator $f'(x): X \rightarrow \mathbb{R}$ with
$$ \lim_{h \rightarrow 0} \frac{|f(x + h) - f(x) - f'(x) h|}{\norm{h}_{2}} = 0 $$
for $h \in X$. In our case, for $h = y - x \in X$, we have that $$ 0 \leq \lim_{h \rightarrow 0} \frac{|f(x + h) - f(x)|}{\norm{h}_{2}} \leq \lim_{h \rightarrow 0} D \norm{h}_{2}^{\alpha/2 - 1} = 0 $$ which shows that $f$ is differentiable at $x \in X$ with derivative $f'(x) \equiv 0$ for each $x \in X$.
Now let $X = \mathbb{R}^{d}$. The derivative $f'(x)$ is an element of the dual space $X^{*}$ of the Hilbert space $X$. Using the Riesz isomorphism $R: X^{*} \rightarrow X$, we can define the gradient $\nabla f(x) = R f'(x)$ as an element of $X$, so that $$ \langle f'(x), h \rangle_{X^{*}, X} = \nabla f(x)^{T} h, $$ the right-hand side being the inner product in $X$. Since $\nabla f(x) = 0 \in \mathbb{R}^{d}$ for every $x$, all partial derivatives of $f$ vanish, and it follows by the mean value theorem that $f$ is a constant function.
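A standard way to make this last step explicit (sketch, using that $\nabla f \equiv 0$ is in particular continuous):

```latex
% Along the segment from x to y, the fundamental theorem of calculus gives
f(y) - f(x)
  = \int_{0}^{1} \nabla f\bigl(x + t(y - x)\bigr)^{T} (y - x) \,\mathrm{d}t
  = 0
  \qquad \text{for all } x, y \in \mathbb{R}^{d},
% so f takes the same value at any two points.
```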
Finally, the associated kernel $K(\cdot, \cdot)$ is then also constant: for every $f \in \mathcal{H}$, \begin{align*} \langle K(\cdot, x) - K(\cdot, y), f \rangle_{\mathcal{H}} &= \langle K(\cdot, x), f \rangle_{\mathcal{H}} - \langle K(\cdot, y), f \rangle_{\mathcal{H}} \\ &= f(x) - f(y) = 0, \end{align*} and since this holds for all $f \in \mathcal{H}$, it follows that $K(\cdot, x) = K(\cdot, y)$ for all $x, y \in X$.
Thus $K$ is not positive definite: for any $n \geq 2$ pairwise distinct points, the interpolation matrix has identical entries, so its rows coincide and $\det \left( K(x_{i}, x_{j}) \right)_{i,j} = 0$.
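This final singularity is easy to confirm numerically (illustration with an arbitrary constant kernel value):

```python
import numpy as np

# A constant kernel gives an interpolation matrix with identical entries,
# which has rank 1 and determinant 0 for n >= 2 points.
c = 3.7                  # arbitrary constant kernel value (illustrative)
A = np.full((4, 4), c)   # A_{K,X} for n = 4 points
assert abs(np.linalg.det(A)) < 1e-8
assert np.linalg.matrix_rank(A) == 1
```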
We conclude that no positive definite kernel $K$ satisfies Condition \eqref{condition} for $\alpha > 2$.