I have been exploring methods of generalizing the order of derivatives to a broader range of inputs (real orders, complex orders, and now matrix orders). We are all familiar with integer-order derivatives, so we shall start there. First, let us observe what happens when we take successive integer-order derivatives of the eigenfunction $e^{kx}$
$$\frac{d}{dx} e^{kx} = ke^{kx}$$ $$\frac{d^{2}}{dx^{2}} e^{kx} = k^{2}e^{kx}$$ $$\vdots$$ $$\frac{d^{n}}{dx^{n}} e^{kx} = k^{n}e^{kx}$$ $$\text{where } n \in \mathbb{Z^{+}}, k \in \mathbb{C}$$
Although this is not a particularly surprising result, what is interesting is that the order $n$ on the left-hand side corresponds to a power $n$ on the right-hand side of the equation. Therefore, if we wish to consider a non-integer order derivative (for example: $1/2$), this suggests that $\frac{d^{1/2}}{dx^{1/2}} e^{kx} = k^{1/2}e^{kx}$. A nice property that remains intact is $\frac{d^{1/2}}{dx^{1/2}} \left( \frac{d^{1/2}}{dx^{1/2}} e^{kx} \right) = \frac{d}{dx} e^{kx}$. In other words: applying two half-order derivatives in succession is the same as taking a single first-order derivative (the orders simply add).
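This additivity of orders is easy to check numerically. The sketch below (assuming $k > 0$ so that $k^{1/2}$ is the real principal root; the function name `frac_deriv_exp` is my own) verifies that two half-order derivatives of $e^{kx}$ compose into one ordinary derivative:

```python
import numpy as np

# Sample values; k > 0 so k**0.5 is the real principal square root.
k, x = 3.0, 0.7

def frac_deriv_exp(order, k, x):
    """Order-'order' derivative of e^{kx}, per d^a/dx^a e^{kx} = k^a e^{kx}."""
    return k**order * np.exp(k * x)

# Applying a half-order derivative twice multiplies by k^{1/2} twice...
twice_half = k**0.5 * frac_deriv_exp(0.5, k, x)

# ...which matches a single first-order derivative, k e^{kx}.
assert np.isclose(twice_half, frac_deriv_exp(1.0, k, x))
```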
This idea of non-integer order derivatives is well explored in the area of fractional calculus. However, what interests me is what happens when we consider a matrix-order derivative. That is, is there some notion of differentiating a function a matrix number of times? Of course, this idea sounds absolutely nonsensical. What could it mean to differentiate a function where the order is not a number at all, but a linear operator such as a matrix?
I proceeded with this question as follows: take some diagonalizable matrix $A$ such that $A = PDP^{-1}$ (where $P$ is a matrix consisting of the eigenvectors of $A$ as columns, $D$ is a diagonal matrix consisting of the eigenvalues of $A$, and $P^{-1}$ is simply the inverse of $P$). Borrowing from the prior statement that $\frac{d^{n}}{dx^{n}} e^{kx} = k^{n}e^{kx}$, we wish to evaluate when $n$ (the order of our derivative) is now some diagonalizable matrix $A$
$$ \frac{d^{A}}{dx^{A}} e^{kx} = k^{A}e^{kx} $$
It now appears that in order (no pun intended) to take a matrix-order derivative of $e^{kx}$, we must evaluate the matrix power $k^{A}$ on the right-hand side. My approach is to rewrite this in exponential form and then leverage the power series representation of the exponential function
$$ k^{A} = e^{\ln(k^{A})} = e^{A\ln(k)} = \sum_{n=0}^{\infty} \frac{(A\ln(k))^{n}}{n!} = \sum_{n=0}^{\infty} \frac{(\ln(k))^{n}}{n!} A^{n} = \text{...} $$
Since our requirement was that our matrix $A$ must be diagonalizable in form of $PDP^{-1}$, we know very well from diagonalization in linear algebra that $A^{n} = PD^{n}P^{-1}$ and therefore
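The identity $A^{n} = PD^{n}P^{-1}$ is straightforward to confirm numerically. A minimal check (using an example matrix of my own choosing, with eigenvalues 1 and 2):

```python
import numpy as np

A = np.array([[3.0, -2.0],
              [1.0,  0.0]])  # diagonalizable; eigenvalues are 1 and 2

# Eigendecomposition A = P D P^{-1}
eigvals, P = np.linalg.eig(A)

n = 5
# A^n computed directly vs. via the diagonalization A^n = P D^n P^{-1}
An_direct = np.linalg.matrix_power(A, n)
An_spectral = P @ np.diag(eigvals**n) @ np.linalg.inv(P)

assert np.allclose(An_direct, An_spectral)
```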
$$ \text{...} = \sum_{n=0}^{\infty} \frac{(\ln(k))^{n}}{n!} PD^{n}P^{-1} $$
$$ = P \left( \sum_{n=0}^{\infty} \frac{(\ln(k))^{n}}{n!} D^{n} \right) P^{-1} $$
$$ = P \left(\sum_{n=0}^{\infty} \frac{(\ln(k))^{n}}{n!} \begin{bmatrix} \lambda_{1}^{n}&0&0\\0&\ddots&0\\0&0&\lambda_{j}^{n} \end{bmatrix} \right) P^{-1} $$
$$ = P \begin{bmatrix} \sum_{n=0}^{\infty} \frac{(\lambda_{1}\ln(k))^{n}}{n!}&0&0\\0&\ddots&0\\0&0&\sum_{n=0}^{\infty} \frac{(\lambda_{j}\ln(k))^{n}}{n!} \end{bmatrix} P^{-1} $$
$$ = P \begin{bmatrix} e^{\lambda_{1}\ln(k)}&0&0\\0&\ddots&0\\0&0& e^{\lambda_{j}\ln(k)} \end{bmatrix} P^{-1} $$
$$ = P \begin{bmatrix} k^{\lambda_{1}}&0&0\\0&\ddots&0\\0&0& k^{\lambda_{j}} \end{bmatrix} P^{-1} = k^{A} $$
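The two forms of $k^{A}$ derived above, the truncated power series $\sum_{n} \frac{(\ln k)^{n}}{n!} A^{n}$ and the spectral form $P\,\mathrm{diag}(k^{\lambda_i})\,P^{-1}$, can be checked against each other numerically. A sketch, assuming $k > 0$ and an example matrix of my own with real eigenvalues:

```python
import numpy as np
from math import factorial

k = 2.0
A = np.array([[3.0, -2.0],
              [1.0,  0.0]])  # diagonalizable; eigenvalues 1 and 2

# Spectral form: k^A = P diag(k^{lambda_i}) P^{-1}
eigvals, P = np.linalg.eig(A)
kA_spectral = P @ np.diag(k**eigvals) @ np.linalg.inv(P)

# Series form: k^A = sum_n (ln k)^n / n! * A^n, truncated (converges fast here)
kA_series = sum(
    (np.log(k)**n / factorial(n)) * np.linalg.matrix_power(A, n)
    for n in range(40)
)

assert np.allclose(kA_spectral, kA_series)
```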
We can now substitute our derived result for $k^{A}$ back into our original matrix-order derivative equation of interest
$$ \frac{d^{A}}{dx^{A}} e^{kx} = k^{A}e^{kx} $$
$$ = P \begin{bmatrix} k^{\lambda_{1}}&0&0\\0&\ddots&0\\0&0& k^{\lambda_{j}} \end{bmatrix} P^{-1} e^{kx} $$
$$ = P \begin{bmatrix} k^{\lambda_{1}}e^{kx}&0&0\\0&\ddots&0\\0&0& k^{\lambda_{j}}e^{kx} \end{bmatrix} P^{-1} $$
It is here that I observed something I consider remarkably elegant! Notice how the entries on the main diagonal exactly match the form of $\frac{d^n}{dx^n} e^{kx} = k^{n}e^{kx}$, and thus
$$ \frac{d^{A}}{dx^{A}} e^{kx} = P \begin{bmatrix} \frac{d^{\lambda_1}}{dx^{\lambda_1}}e^{kx}&0&0\\0&\ddots&0\\0&0& \frac{d^{\lambda_j}}{dx^{\lambda_j}}e^{kx} \end{bmatrix} P^{-1} $$
Notice how the matrix-order derivative of $e^{kx}$ contains a diagonal matrix of eigenvalue-order derivatives of $e^{kx}$ on the main diagonal. It is almost as if the matrix-order derivative were "split apart" into a spectrum of eigenvalue-order derivatives! Another interesting observation is that a matrix-order derivative yields a matrix, so the resulting operator naturally acts on vector inputs.
I have also carried out a similar procedure to evaluate the matrix-order derivatives of $\sin(x)$ and $\cos(x)$. I will omit the derivation to keep this post a reasonable length. It relies on: $\frac{d^n}{dx^n}\sin(x) = \sin(x+\frac{\pi}{2}n)$ and $\frac{d^n}{dx^n}\cos(x) = \cos(x+\frac{\pi}{2}n)$; the angle-addition identities; and finally the power series representations of $\sin(x)$ and $\cos(x)$. The matrix-order derivatives of these trigonometric functions are
$$ \frac{d^{A}}{dx^{A}} \sin(x) = P \begin{bmatrix} \frac{d^{\lambda_1}}{dx^{\lambda_1}}\sin(x)&0&0\\0&\ddots&0\\0&0& \frac{d^{\lambda_j}}{dx^{\lambda_j}}\sin(x) \end{bmatrix} P^{-1} $$
$$ \frac{d^{A}}{dx^{A}} \cos(x) = P \begin{bmatrix} \frac{d^{\lambda_1}}{dx^{\lambda_1}}\cos(x)&0&0\\0&\ddots&0\\0&0& \frac{d^{\lambda_j}}{dx^{\lambda_j}}\cos(x) \end{bmatrix} P^{-1} $$
Much like the matrix-order derivative of $e^{kx}$, the matrix-order derivatives of $\sin(x)$ and $\cos(x)$ consist of eigenvalue-order derivatives along the main diagonal.
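The trigonometric case admits the same integer-eigenvalue sanity check: using the eigenvalue-order formula $\frac{d^{\lambda}}{dx^{\lambda}}\sin(x) = \sin(x + \frac{\pi}{2}\lambda)$ on the diagonal, the result should match a matrix built from the literal derivatives $\cos(x)$ and $-\sin(x)$ when the eigenvalues are 1 and 2. A sketch under those assumptions:

```python
import numpy as np

x = 0.8
A = np.array([[3.0, -2.0],
              [1.0,  0.0]])  # eigenvalues 1 and 2
eigvals, P = np.linalg.eig(A)
Pinv = np.linalg.inv(P)

# Matrix-order derivative of sin via eigenvalue-order derivatives:
# d^A/dx^A sin(x) = P diag( sin(x + pi/2 * lambda_i) ) P^{-1}
dA_sin = P @ np.diag(np.sin(x + np.pi / 2 * eigvals)) @ Pinv

# For integer orders 1 and 2 the diagonal entries are cos(x) and -sin(x)
deriv_by_order = {1: np.cos(x), 2: -np.sin(x)}
check = P @ np.diag([deriv_by_order[int(round(l))] for l in eigvals]) @ Pinv

assert np.allclose(dA_sin, check)
```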
However, I am still left wondering: what exactly does this mean? What are some possible applications or reasons to consider a matrix-order derivative, besides mere generalization? How does this fit with our notion of the derivative mapping a $C^{n}$ function to a $C^{n-1}$ function? Does this still make sense, or is this just an abuse of notation? Does it provide new insight into the derivative operator itself?