Consider the Lie algebra $\mathfrak{gl}(n,R)$ of all $n \times n$ matrices over an arbitrary commutative ring $R$, generated (as an $R$-module) by the elementary matrices $e_{ij}$, each having a $1$ at the intersection of the $i$-th row and $j$-th column and $0$ everywhere else.
$\mathfrak{gl}(n,R)$ has a representation on the $R$-module $R[X_1,\dots,X_n]$ of polynomials in $n$ commuting variables, defined by
$$e_{ij} \mapsto X_i \partial_j,$$
where $\partial_j$ is the $R$-linear operator of formal differentiation with respect to the variable $X_j$. That this really is a Lie algebra representation can be proved directly by computing the Lie brackets of the basis elements and their images:
$$[e_{ij}, e_{kl}] = e_{ij}e_{kl} - e_{kl}e_{ij} = \delta_{jk}e_{il} - \delta_{li}e_{kj}$$ $$[X_i\partial_j, X_k\partial_l] = X_i\partial_j (X_k\cdot\partial_l) - X_k\partial_l (X_i\cdot\partial_j) = \\ = X_i\Big(\partial_j (X_k)\partial_l+X_k\partial_j\partial_l\Big)-X_k\big(\partial_l(X_i)\partial_j + X_i\partial_l\partial_j\big) = \\ = \delta_{jk}X_i\partial_l - \delta_{li}X_k\partial_j$$
Here we used the multiplication rules for the $e_{ij}$, the commutation relations among the $X_i$ and $\partial_j$, and the product rule.
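The coordinate computation can be spot-checked numerically. Here is a small SymPy sketch (tooling assumed, not part of the argument) verifying one instance, $[X_1\partial_2, X_2\partial_3] = X_1\partial_3$, which matches $[e_{12}, e_{23}] = e_{13}$:

```python
import sympy as sp

X = sp.symbols('X1 X2 X3')

def op(i, j):
    # the operator X_i * d/dX_j, the image of e_ij (0-based indices here)
    return lambda p: X[i] * sp.diff(p, X[j])

p = X[0]**2 * X[1] + 3 * X[1] * X[2]**2  # an arbitrary test polynomial

a, b = op(0, 1), op(1, 2)                # images of e_12 and e_23
bracket = sp.expand(a(b(p)) - b(a(p)))   # [X1*d2, X2*d3] applied to p

# [e_12, e_23] = e_13, so the bracket should act as X1*d3
assert bracket == sp.expand(op(0, 2)(p))
```

Of course, checking one polynomial proves nothing in general; it only confirms the index bookkeeping above.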
This proof works, but it does not look very nice: the computation is (relatively) heavy, and the index bookkeeping obscures what is going on.
Another possible proof, which works in the cases $R = \mathbb{R}$ and $R = \mathbb{C}$, is to consider the representation of the group $GL(n,R)$ of invertible $n \times n$ matrices on the space of polynomials (equivalently, on the symmetric algebra of $R^n$). Differentiating this action, we see that the generators of the Lie algebra $\mathfrak{gl}(n,R)$ act by exactly the formulas above, which immediately shows that this map is a representation, since it is induced by a Lie group representation.
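For $R = \mathbb{R}$, the differentiation step can be made concrete for a single generator: since $e_{13}^2 = 0$, the one-parameter subgroup is $\exp(t e_{13}) = I + t e_{13}$, which sends $x_3 \mapsto x_3 + t x_1$ and fixes the other variables, and differentiating at $t = 0$ recovers $X_1\partial_3$. A SymPy sketch (tooling assumed, not part of the original argument):

```python
import sympy as sp

t = sp.symbols('t')
x = sp.symbols('x1 x2 x3')
p = x[0] * x[2]**2 + x[1]**2  # an arbitrary test polynomial

# exp(t*e_13) = I + t*e_13 (since e_13**2 = 0) acts on variables by x3 -> x3 + t*x1
pt = p.subs(x[2], x[2] + t * x[0])

# differentiate the one-parameter family at t = 0 ...
generator_action = sp.diff(pt, t).subs(t, 0)

# ... and compare with the operator X_1 * d/dX_3
assert sp.expand(generator_action) == sp.expand(x[0] * sp.diff(p, x[2]))
```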
Although this proof sounds deeper and more explanatory, it bothers me that it works only over the manifold-theoretic fields $\mathbb{R}$ and $\mathbb{C}$, while the result is valid over any commutative ring.
So, my question is: is there a more abstract, deep, clean, and satisfying proof that the above mapping is a Lie algebra representation, one that works over an arbitrary commutative ring?
To phrase it a bit differently: is there some piece of structure (perhaps something in the induced representation on the symmetric algebra) that makes the claim more obvious?
Note that a representation of $\mathfrak{gl}(n, R)$ on $R[x_1, \ldots, x_n]$ is the same as a homomorphism of Lie algebras $\varphi: \mathfrak{gl}(n, R) \to \mathrm{End}(R[x_1, \ldots, x_n])$, where the endomorphism algebra is given the commutator bracket. The partial derivative operators do not live inside the polynomial ring itself. As discussed in the comments, the image of the homomorphism lands in the Weyl algebra, but that is not important here.
All of the above can be done in a coordinate-free manner, with very natural constructions. I'll work over a field $k$ and a $k$-vector space $V$, though I think most or all of my answer applies if you instead take $k$ to be a commutative ring, and $V$ to be any $k$-module.
The coordinate-free analogue of $\mathfrak{gl}(n, k)$ is the Lie algebra $\mathfrak{gl}(V)$, of all linear maps $V \to V$ with the commutator bracket. The coordinate-free analogue of the polynomial algebra $k[x_1, \ldots, x_n]$ is the symmetric algebra $\mathrm{Sym}(V^*)$. However, things will be easier if I replace this with the very similar space $\mathrm{Sym}(V)$ (these are isomorphic when $V$ is finite-dimensional).
Each element of $\mathrm{Sym}(V)$ is built from sums of "products" of vectors, for example $v_1 v_2 + 2 v_3 + 5$, where each $v_i \in V$. You should think of each of these terms as quadratic, linear, and constant respectively. You are allowed to commute vectors any way you like.
The next thing we need is some differential operator $\partial$ acting on $\mathrm{Sym}(V)$. It should be parametrised by some "direction", so we would like to take any linear functional $f \in V^*$ and associate to it the operator $\partial_f$ acting on $\mathrm{Sym}(V)$. We can list a few properties such a "derivative" should satisfy:

1. $\partial_f$ is linear: $\partial_f(a + \lambda b) = \partial_f(a) + \lambda \partial_f(b)$.
2. $\partial_f$ kills constants: $\partial_f(c) = 0$ for $c \in k$.
3. $\partial_f$ lowers degree by one.
4. $\partial_f$ satisfies the Leibniz rule: $\partial_f(ab) = \partial_f(a) b + a \partial_f(b)$.
In fact, properties 2 and 3 are superfluous: all you really need are linearity and the Leibniz rule. Furthermore, the Leibniz rule means that we can always write $$\partial_f(v_1 \cdots v_n) = \sum_i v_1 \cdots v_{i-1} \partial_f(v_i) v_{i+1} \cdots v_n$$ and so it is enough to specify how $\partial_f$ acts on linear terms in $\mathrm{Sym}(V)$, and extend by linearity and the Leibniz rule. So, define $\partial_f(v) := f(v)$ for linear terms $v \in \mathrm{Sym}(V)$, and extend appropriately to the whole space.
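In coordinates (identifying $\mathrm{Sym}(V)$ with a polynomial ring and $f \in V^*$ with its tuple of coefficients), this $\partial_f$ is just the directional derivative $\sum_j f_j \partial_j$. A quick SymPy sketch (tooling assumed; the names are ad hoc) checks that $\partial_f(v) = f(v)$ on linear terms and that the Leibniz rule holds:

```python
import sympy as sp

x = sp.symbols('x1 x2 x3')
f = (2, 0, 5)  # a linear functional f in V*, given by its coordinates

def d(p):
    # the directional derivative d_f = sum_j f_j * d/dx_j on Sym(V) ~ k[x1, x2, x3]
    return sum(fj * sp.diff(p, xj) for fj, xj in zip(f, x))

v = 3 * x[0] + 4 * x[2]        # a linear term v in V
assert d(v) == 2 * 3 + 5 * 4   # d_f(v) = f(v) = 26

a, b = x[0] * x[1], x[1] + x[2]**2
assert sp.expand(d(a * b)) == sp.expand(d(a) * b + a * d(b))  # Leibniz rule
```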
Now we have a notion of "polynomial algebra" and "directional derivative", and all we have used to build them was the same vector space $V$ we used to define $\mathfrak{gl}(V)$.
What is the Lie algebra homomorphism $\varphi: \mathfrak{gl}(V) \to \mathfrak{gl}(\mathrm{Sym}(V))$ in this language? Using the identification $\mathfrak{gl}(V) \cong V \otimes V^*$ (valid when $V$ is finite-dimensional), it is enough to specify the action of $\varphi$ on pure tensors, so consider $v \otimes f \in \mathfrak{gl}(V)$, where $v \in V$ and $f \in V^*$. There is not all that much choice of where to send it, so define $$\varphi(v \otimes f) = v \partial_f.$$ Now, we are also free to check the Lie brackets only on pure tensors, so take $v \otimes f, u \otimes g \in \mathfrak{gl}(V)$ and compute $$ \begin{aligned} \varphi([v \otimes f, u \otimes g]) &= \varphi((v \otimes f)(u \otimes g) - (u \otimes g)(v \otimes f)) \\ &= \varphi(f(u) v \otimes g - g(v) u \otimes f) \\ &= \partial_f(u) v \partial_g - \partial_g(v) u \partial_f \end{aligned} $$
On the other hand, we have $$ \begin{aligned} \quad [\varphi(v \otimes f), \varphi(u \otimes g)] &= [v \partial_f, u \partial_g] \\ &= v \partial_f u \partial_g - u \partial_g v \partial_f \\ &= v (\partial_f(u) + u \partial_f) \partial_g - u(\partial_g(v) + v \partial_g) \partial_f \end{aligned} $$ where going from the second-last to last line, I have applied the Leibniz rule. Finally, argue that $\partial_f$ and $\partial_g$ commute, and conclude that $\varphi$ is a homomorphism of Lie algebras.
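The whole argument can be spot-checked in coordinates. The sketch below (SymPy assumed; the helpers `lin`, `pair`, `phi` are ad hoc names) realises $\varphi(v \otimes f) = v\,\partial_f$ and confirms $\varphi([v \otimes f, u \otimes g]) = [\varphi(v \otimes f), \varphi(u \otimes g)]$ on sample pure tensors:

```python
import sympy as sp

x = sp.symbols('x1 x2 x3')

def lin(v):
    # embed a vector v in V as a linear polynomial in Sym(V)
    return sum(vi * xi for vi, xi in zip(v, x))

def pair(f, v):
    # evaluate the functional f on the vector v: f(v)
    return sum(fi * vi for fi, vi in zip(f, v))

def phi(v, f):
    # phi(v ⊗ f) = v * d_f as an operator on polynomials
    return lambda p: lin(v) * sum(fj * sp.diff(p, xj) for fj, xj in zip(f, x))

v, f = (1, 2, 0), (0, 3, 1)
u, g = (0, 1, 1), (4, 0, 2)
p = x[0]**2 * x[2] + x[1]**3  # an arbitrary test polynomial

A, B = phi(v, f), phi(u, g)
lhs = sp.expand(A(B(p)) - B(A(p)))  # [phi(v ⊗ f), phi(u ⊗ g)] applied to p

# [v ⊗ f, u ⊗ g] = f(u) v ⊗ g - g(v) u ⊗ f, so compare with its image
rhs = sp.expand(pair(f, u) * phi(v, g)(p) - pair(g, v) * phi(u, f)(p))
assert lhs == rhs
```

The cancellation of the second-order terms in `lhs` is exactly the commutativity of $\partial_f$ and $\partial_g$ (and of $v$ and $u$) used in the proof above.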