I have a strange question: is it possible to view the derivative as a matrix? (Both are, technically, linear transformations.)
I thought about this example: since $1, x, x^2, \ldots, x^n$ can be taken as a basis of a vector space, I can represent a polynomial by the vector of its coefficients:
Let $p_n(x)=a_0+a_1 x+a_2 x^2+...+a_n x^n$
Its derivative is $p'_n(x)=\dfrac{\mathrm{d}}{\mathrm{d}x}p_n(x)=a_1+2a_2 x+...+n a_n x^{n-1}$
(and so far nothing new)
But if I consider $p_n(x)$ as the vector of its coefficients $(a_0,a_1,\ldots,a_n)$ and I look for a matrix that transforms $p_n(x)$ into $p'_n(x)$, I get:
$$\begin{pmatrix}0&1&0&\cdots&0\\
0&0&2&\cdots&0\\
\vdots&\vdots&\vdots&\ddots&0\\
0&0&0&\cdots&n\\
0&0&0&\cdots&0
\end{pmatrix}\begin{pmatrix}a_0\\ a_1\\ a_2\\ \vdots\\ a_n\end{pmatrix}=\begin{pmatrix}a_1\\ 2a_2\\\vdots\\ n a_n\\ 0\\ \end{pmatrix}$$
So technically $\mathbf{D}=\begin{pmatrix}0&1&0&\cdots&0\\
0&0&2&\cdots&0\\
\vdots&\vdots&\vdots&\ddots&0\\
0&0&0&\cdots&n\\
0&0&0&\cdots&0
\end{pmatrix}$ represents the derivative.
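(For anyone who wants to experiment, this is easy to check numerically. A minimal NumPy sketch; the variable names are my own.)

```python
import numpy as np

n = 3
# Differentiation matrix acting on coefficient vectors (a_0, ..., a_n):
# the superdiagonal entry (i, i+1) equals i+1
D = np.diag(np.arange(1.0, n + 1), k=1)

# p(x) = 1 + 2x + 3x^2 + 4x^3  ->  p'(x) = 2 + 6x + 12x^2
p = np.array([1., 2., 3., 4.])
print(D @ p)       # [ 2.  6. 12.  0.]
print(D @ D @ p)   # second derivative 6 + 24x: [ 6. 24.  0.  0.]
```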
After this I tried to do some operations on it, and it came out that:
- $\mathbf{D}^k$ represents $\dfrac{\mathrm{d}^k}{\mathrm{d}x^k}$
- $\det(\mathbf{D})=0$, so $\mathbf{D}^{-1}$ is not defined. I interpret this as reflecting the fact that the inverse operation of the derivative, the antiderivative, is not unique: there are infinitely many, differing by an arbitrary constant. (It is still possible to compute the pseudoinverse, and it gives the antiderivative with zero constant term.)
- $\operatorname{trace}(\mathbf{D})=0$ (I don't know how to interpret that), and likewise for $\mathbf{D}^{\top}$
- In general: here I was working with polynomials of degree at most $n$, so the matrix is $(n+1)\times(n+1)$ with rank $n$, but for a general function I suppose it would be $"\infty\times\infty"$ (I don't know whether rank can be defined for an infinite matrix)
- $\exp(\mathbf{D})$ gives the upper Pascal Matrix
- $\mathbf{D}=\text{diag}(1,1,2,...,n!)^{-1}\begin{pmatrix}0&1&0&\cdots&0\\
0&0&1&\cdots&0\\
\vdots&\vdots&\vdots&\ddots&0\\
0&0&0&\cdots&1\\
0&0&0&\cdots&0
\end{pmatrix}\text{diag}(1,1,2,...,n!)$
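(Several of these claims can be checked numerically. A NumPy sketch, with my own variable names; since $\mathbf{D}$ is nilpotent, $\exp(\mathbf{D})$ can be computed exactly from its finite series.)

```python
import numpy as np
from math import comb, factorial

n = 4  # polynomials of degree <= 4, so (n+1) x (n+1) matrices

# Differentiation matrix: superdiagonal entry (i, i+1) equals i+1
D = np.diag(np.arange(1.0, n + 1), k=1)

# exp(D) via its finite series (D^(n+1) = 0), compared to the upper
# Pascal matrix of binomial coefficients C(j, i)
E = sum(np.linalg.matrix_power(D, k) / factorial(k) for k in range(n + 1))
Pascal = np.array([[comb(j, i) for j in range(n + 1)] for i in range(n + 1)], float)
print(np.allclose(E, Pascal))    # True

# The pseudoinverse integrates, fixing the constant of integration to 0
p = np.array([5., 3., 2., 0., 0.])   # 5 + 3x + 2x^2
antider = np.linalg.pinv(D) @ p      # 5x + (3/2)x^2 + (2/3)x^3
print(np.allclose(antider, [0., 5., 1.5, 2 / 3, 0.]))   # True

# Similarity with diag(0!, 1!, ..., n!) and the plain shift matrix
L = np.diag([float(factorial(k)) for k in range(n + 1)])
N = np.diag(np.ones(n), k=1)
print(np.allclose(D, np.linalg.inv(L) @ N @ L))          # True
```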
Are these things correct? I'm curious to see if it also has applications for partial derivatives or fractional calculus.
(I tried to search on the internet "derivative as a matrix" but the main result was the Jacobian, so tell me if this has a name)
Broadly speaking, yes. The derivative is a linear operator, meaning that $\frac{d}{dx}\left(a f(x) + b g(x)\right) = a \frac{d}{dx} f(x) + b \frac{d}{dx} g(x)$, as long as you are working in a vector space where the vectors are differentiable functions.
If the vector space is finite-dimensional, then it is isomorphic to $\mathbb{R}^n$ or $\mathbb{C}^n$ depending on the base field, and we can always relate linear transformations on a finite-dimensional vector space to matrices on the corresponding $\mathbb{F}^n$. We construct that matrix by mapping a basis of the vector space to the standard basis vectors and looking at how the transformation acts on each basis element.
So, for example, the vector space of polynomials of degree $n$ or less has a basis $\{1, x, x^2, \ldots, x^n\}$, and you get the results you've already found. You could similarly look at the space of functions spanned by $\{\cos x, \sin x, e^x, e^{-x}\}$ (which happens to be the space of solutions to $y'''' = y$), in which case you'd find that differentiation looks like:
$\frac{d}{dx} \cong \begin{pmatrix} 0 & 1 & 0 & 0 \\ -1 & 0 & 0 & 0 \\ 0 & 0 & 1 & 0 \\ 0 & 0 & 0 & -1 \end{pmatrix}$
and notice that this matrix is invertible, because in this particular space there are no constant functions (other than zero).
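(A quick numeric sanity check of that matrix, with coordinates $(a,b,c,d)$ standing for $a\cos x + b\sin x + c e^x + d e^{-x}$ — the variable names are my own.)

```python
import numpy as np

# Coordinates (a, b, c, d) stand for a*cos(x) + b*sin(x) + c*e^x + d*e^(-x)
M = np.array([[ 0., 1., 0.,  0.],
              [-1., 0., 0.,  0.],
              [ 0., 0., 1.,  0.],
              [ 0., 0., 0., -1.]])

print(np.linalg.det(M))    # -1 (up to rounding), so M is invertible
# M^(-1) is the unique antiderivative within this space:
# cos x = (1,0,0,0) pulls back to sin x = (0,1,0,0)
print(np.linalg.inv(M) @ np.array([1., 0., 0., 0.]))
```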
You can also look at integration as a linear transformation, but it's a little trickier. You can look at specific definite integrals, e.g. $\int_0^1 f(x)\, dx$, or definite integrals with a variable upper limit, e.g. $\int_0^x f(t)\, dt$, and consider how different types of functions get mapped (you'll find that the codomain is no longer the same space, so take that into account).
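(As a concrete instance of the codomain changing: $\int_0^x$ on polynomials of degree at most $n$ raises the degree by one, so its matrix is rectangular. A sketch, with my own variable names.)

```python
import numpy as np

n = 2
# Integration from 0 to x maps degree <= n into degree <= n+1, so the
# matrix is (n+2) x (n+1), with entry (i+1, i) equal to 1/(i+1)
J = np.zeros((n + 2, n + 1))
for i in range(n + 1):
    J[i + 1, i] = 1.0 / (i + 1)

p = np.array([6., 2., 3.])   # 6 + 2x + 3x^2
print(J @ p)                 # [0. 6. 1. 1.], i.e. 6x + x^2 + x^3
```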
However, when you have an infinite-dimensional vector space (such as "the space of all polynomials", or "the space of all analytic functions"), you lose the ability to represent linear transformations by finite matrices. So while differentiation remains a linear transformation and you can apply a lot of linear algebra theory to it, you can't rely on any results that assume a finite basis.